Comparison of contribution points of representative work and citing sentences

| Contribution points mentioned by the author in the original representative work | We describe a new learning procedure, back-propagation, for networks of neurone-like units. |
|---|---|
| Contribution points mentioned in the citing sentences | 1. In MBGD, the learning rate is very important to the convergence speed and quality in training. Many different schemes, e.g., momentum [6], averaging [15], AdaGrad [16], RMSProp [17], Adam [18], etc., have been proposed to optimize the learning rate in neural network training. Adam may be the most popular one among them. [43.84%] |
| | 2. As for the extrapolation, a smooth activation function that only acts on the hidden layer(s) is recommended. Back-propagation is the second part of the algorithm [37]. This is the central mechanism that allows neural network methods to "learn." [42.36%] |
| | 3. The feedforward multilayer perceptron is one of the most popular types of ANNs; it was developed by Rumelhart et al. [23], and it is presented in Supplementary 1. This network also consists of an input layer, one or more hidden layers, and one output layer. [10.95%] |
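
The bracketed percentages give each contribution point's share of the citing sentences assigned to it. As a minimal sketch of that share computation, assuming the citing sentences have already been clustered and labeled by some upstream step (the function `contribution_point_shares` and the example labels below are illustrative assumptions, not CiteOpinion's actual pipeline):

```python
from collections import Counter

def contribution_point_shares(labels: list[str]) -> dict[str, float]:
    """Return each contribution point's share of all citing sentences.

    `labels` holds one contribution-point label per citing sentence,
    as produced by some upstream clustering/labeling step.
    """
    counts = Counter(labels)
    total = sum(counts.values())
    return {point: count / total for point, count in counts.items()}

# Hypothetical labels for four citing sentences of one representative paper.
labels = [
    "learning-rate optimization",
    "learning-rate optimization",
    "back-propagation mechanism",
    "network architecture",
]
for point, share in contribution_point_shares(labels).items():
    print(f"{point}: {share:.2%}")
# learning-rate optimization: 50.00%
# back-propagation mechanism: 25.00%
# network architecture: 25.00%
```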
Highlights of citing sentences praising the contribution point as a breakthrough

| # | Highlight of Citing Sentence | Title of Citing Paper |
|---|---|---|
| 1 | The steepest descent algorithm, also known as the error backpropagation (EBP) algorithm [8,9], dispersed the dark clouds on the field of artificial neural networks and could be regarded as one of the most significant breakthroughs for training neural networks. | Application of Neural Networks to Automatic Load Frequency Control |
| 2 | In addition to the development of new ANN algorithms that were more neural-inspired (e.g. Hopfield networks), another major breakthrough that helped lead to a resurgence in neural network research was the rediscovery of the backpropagation technique (LeCun, 1985, Rumelhart et al., 1986, Werbos, 1990). | A historical survey of algorithms and hardware architectures for neural-inspired and neuromorphic computing applications |
| 3 | The next major breakthrough happened in late 80s with the invention of back-propagation and a gradient-based optimization algorithm to train a neural network with one or two hidden layers with any desired number of nodes (Rumelhart et al., 1986). | Meta-analysis of deep neural networks in remote sensing: A comparative study of mono-temporal classification to support vector machines |
Representative papers of Geoffrey Hinton

| # | Author | Representative Paper | Citing Papers |
|---|---|---|---|
| 1 | Hinton Geoffrey | Deep learning | 7,630 |
| 2 | Hinton Geoffrey | Reducing the dimensionality of data with neural networks | 4,509 |
| 3 | Hinton Geoffrey | A fast learning algorithm for deep belief nets | 4,538 |
| 4 | Hinton Geoffrey | Learning representations by back-propagating errors | 6,598 |
| 5 | Hinton Geoffrey | Dropout: A Simple Way to Prevent Neural Networks from Overfitting | 4,142 |
Sentiment categorization of citing sentences

| Sentiment Category | Definition | Sentiment Score Range (E) |
|---|---|---|
| Positive | Holding a commendatory, approving, or admiring attitude | 0 < E < 1 |
| Neutral | Brief statement or rephrasing, without obvious expression of sentiment | E = 0 |
| Negative | Describing defects, shortcomings, or mistakes | -1 < E < 0 |
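
Operationally, the scheme maps a continuous sentiment score E on the open interval (-1, 1) to one of three labels. The minimal Python sketch below illustrates that mapping; the function name `categorize_sentiment` and the exact-zero test for the neutral category are assumptions of this sketch, not details drawn from CiteOpinion:

```python
def categorize_sentiment(score: float) -> str:
    """Map a sentiment score E in the open interval (-1, 1) to a category.

    Per the table above: 0 < E < 1 is positive, E = 0 is neutral,
    and -1 < E < 0 is negative.
    """
    if not -1 < score < 1:
        raise ValueError("sentiment score E must lie in (-1, 1)")
    if score > 0:
        return "Positive"
    if score < 0:
        return "Negative"
    return "Neutral"

# Examples:
print(categorize_sentiment(0.6))   # Positive
print(categorize_sentiment(0.0))   # Neutral
print(categorize_sentiment(-0.3))  # Negative
```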
Comparison between CiteOpinion and Conventional Quantitative Analysis Tools

| | Conventional Quantitative Analysis Tool | CiteOpinion |
|---|---|---|
| Data object | Paper metadata | Citing sentences |
| Data granularity | Article | Sentence |
| Analysis focus | Statistical indicators | Text content mining |
| Result form | Relationship diagrams and data sheets | Evaluation evidence text, relationship diagrams, and data sheets |