2012
DOI: 10.1007/978-3-642-30223-7_87
Brief Introduction of Back Propagation (BP) Neural Network Algorithm and Its Improvement

Cited by 406 publications (193 citation statements). References 0 publications.
“…When training, the following values were used: learning rate β = 0.01 and momentum µ = 0.9, which is a commonly used momentum value for the backpropagation algorithm, see Li et al [35]. The learning rate is chosen according to the initial experiments which showed that using a smaller learning rate increases the training time and a larger value decreases the training performance.…”
Section: Results
confidence: 99%
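The update rule this excerpt refers to, gradient descent with momentum (learning rate β = 0.01, momentum µ = 0.9), can be sketched as follows. The quadratic toy loss and the helper name `momentum_step` are illustrative assumptions, not from the cited work.

```python
# Sketch of a gradient-descent step with momentum, using the values from
# the excerpt (beta = 0.01, mu = 0.9). The loss being minimised here is an
# assumed toy example: f(w) = (w - 3)^2, with gradient 2*(w - 3).

def momentum_step(w, v, grad, beta=0.01, mu=0.9):
    """One update: v <- mu*v - beta*grad; w <- w + v."""
    v = mu * v - beta * grad
    return w + v, v

w, v = 0.0, 0.0
for _ in range(500):
    w, v = momentum_step(w, v, 2.0 * (w - 3.0))
print(round(w, 3))  # → 3.0
```

The momentum term accumulates past gradients, so the effective step size is roughly β/(1 − µ); this is why a larger β was observed to hurt training performance while a smaller one merely slowed it down.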
“…The back propagation (BP) [12,13] neural network algorithm is a kind of feed-forward network with multiple layers, trained according to the error back-propagation algorithm. BP learning has two processes: 1) forward propagation of the operating signal and 2) back propagation of the error signal [10,11]. During forward propagation, the weight and offset values of the network stay constant, and the state of each layer of neurons affects only the next layer of neurons.…”
Section: Neural Network Predicting (NNP) Methods
confidence: 99%
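The forward-propagation phase described above, where the weights stay fixed and each layer's output feeds only the next layer, can be sketched minimally. The 2-2-1 architecture and the weight values are illustrative assumptions, not the network from the cited paper.

```python
import math

# Sketch of forward propagation through a small feed-forward network.
# During this pass the weights and offsets (biases) are held constant;
# each layer's activations influence only the layer after it.

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def forward(x, layers):
    """layers: list of (weight_matrix, bias_vector); returns all activations."""
    activations = [x]
    for W, b in layers:
        x = [sigmoid(sum(w_ij * x_j for w_ij, x_j in zip(row, x)) + b_i)
             for row, b_i in zip(W, b)]
        activations.append(x)
    return activations

# Tiny 2-2-1 network with hand-picked (illustrative) weights.
net = [
    ([[0.5, -0.4], [0.3, 0.8]], [0.1, -0.2]),  # hidden layer
    ([[1.0, -1.0]],             [0.0]),        # output layer
]
acts = forward([1.0, 0.5], net)
print(acts[-1])
```

Keeping every layer's activations (not just the output) is what makes the second phase possible: the error signal propagated backwards needs each layer's output to compute the weight gradients.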
“…In this network, there is no need to specify in advance the mathematical equation that describes these mapping relations. Its learning rule adopts the steepest descent method, in which back propagation is used to adjust the weight and threshold values of the network so as to minimise the sum of squared errors 13 . The BP neural network toolbox in Matlab is chosen to obtain the fault diagnosis results in this study.…”
Section: BP Neural Network
confidence: 99%
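The steepest-descent rule the excerpt describes, back-propagating the error to adjust weights and thresholds (biases) until the sum of squared errors is minimised, can be sketched on a single sigmoid unit. The one-neuron model, toy data, and learning rate are assumptions for illustration; they are not the Matlab-toolbox setup of the cited study.

```python
import math

# Sketch of steepest-descent (gradient-descent) BP learning on one sigmoid
# neuron: the error is propagated back through the activation to update the
# weights w and threshold b, shrinking the sum of squared errors (SSE).
# Dataset and learning rate are illustrative assumptions.

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

data = [([0.0, 0.0], 0.0), ([1.0, 1.0], 1.0)]  # toy (input, target) pairs
w, b, lr = [0.0, 0.0], 0.0, 0.5

for _ in range(2000):
    for x, t in data:
        y = sigmoid(w[0] * x[0] + w[1] * x[1] + b)
        # For E = (1/2)(t - y)^2 and sigmoid'(z) = y*(1 - y):
        delta = (y - t) * y * (1.0 - y)
        w = [wi - lr * delta * xi for wi, xi in zip(w, x)]
        b -= lr * delta

sse = sum((t - sigmoid(w[0] * x[0] + w[1] * x[1] + b)) ** 2 for x, t in data)
print(round(sse, 4))
```

In a multi-layer network the same `delta` is propagated backwards layer by layer, each layer reusing the deltas of the layer above it, which is exactly the "back propagation of the error signal" named in the previous excerpt.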