2009 IITA International Conference on Control, Automation and Systems Engineering (CASE 2009)
DOI: 10.1109/case.2009.146
A Novel Learning Algorithm of Back-Propagation Neural Network

Abstract: Standard neural networks based on the backpropagation learning algorithm have some faults, such as a low learning rate, instability, and long learning time. In this paper, we introduce the trust-field method and bring forward a new learning factor; meanwhile, we adopt the Quasi-Newton algorithm to replace the gradient descent algorithm. Three algorithms are utilized in the novel back-propagation neural network. Thus the neural network avoids the local minimum problem, improves stability, and reduces the training time and test …
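The abstract's core idea, replacing gradient descent with a quasi-Newton update, can be sketched roughly as follows. This is not the paper's exact algorithm (the full text is truncated above): BFGS stands in for the unspecified quasi-Newton variant, and the 2-2-1 XOR network, seed, and backtracking line search are all illustrative assumptions.

```python
import numpy as np

# Minimal sketch: train a 2-2-1 sigmoid network on XOR with a BFGS
# quasi-Newton step instead of plain gradient descent.
rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0.0, 1.0, 1.0, 0.0])

def unpack(w):
    return w[:4].reshape(2, 2), w[4:6], w[6:8], w[8]

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

def loss_grad(w):
    W1, b1, W2, b2 = unpack(w)
    h = sigmoid(X @ W1 + b1)              # hidden activations
    o = sigmoid(h @ W2 + b2)              # network output
    e = o - y
    L = 0.5 * np.mean(e ** 2)
    do = e * o * (1 - o) / len(y)         # backprop through output layer
    dh = np.outer(do, W2) * h * (1 - h)   # backprop through hidden layer
    g = np.concatenate([(X.T @ dh).ravel(), dh.sum(0), h.T @ do, [do.sum()]])
    return L, g

w = rng.normal(size=9)
H = np.eye(9)                             # inverse-Hessian approximation
L, g = loss_grad(w)
L0 = L                                    # remember the starting loss
for _ in range(200):
    p = -H @ g                            # quasi-Newton search direction
    t = 1.0                               # backtracking (Armijo) line search
    while loss_grad(w + t * p)[0] > L + 1e-4 * t * (g @ p) and t > 1e-8:
        t *= 0.5
    L_new, g_new = loss_grad(w + t * p)
    if L_new >= L:                        # no further progress; stop
        break
    s, yk = t * p, g_new - g
    if s @ yk > 1e-10:                    # curvature guard, then BFGS update
        rho = 1.0 / (s @ yk)
        I = np.eye(9)
        H = (I - rho * np.outer(s, yk)) @ H @ (I - rho * np.outer(yk, s)) \
            + rho * np.outer(s, s)
    w, L, g = w + s, L_new, g_new

print(f"loss: {L0:.4f} -> {L:.4f}")
```

Because the curvature information in H rescales the step per direction, the line search typically accepts much larger steps than a fixed learning rate would allow, which is the training-speed argument the abstract makes.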

Cited by 7 publications (5 citation statements)
References 5 publications
“…After several iterations of the training and validation phases, the model performance is tested on the test set. Many machine learning algorithms exist, namely, SVM, KNNs, ANNs [ 33 ], feed-forward neural networks [ 34 ], and BPNNs [ 35 ], all of which have proven their superior ability to classify biomedical images, such as the MRI images of brain tumours.…”
Section: Overview
confidence: 99%
“…Many studies used the adaptive training rate by adopting the monotonicity function such as [7] - [18] used the exponential to increase the speed of the BP algorithm. Based on the discussion above, the new formula of the training rate is proposed as follows.…”
Section: Hi
confidence: 99%
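The adaptive, exponentially shaped training rate this citing paper refers to can be illustrated with a toy sketch. The cited papers' actual formulas are not given here, so the schedule η_t = η₀·exp(−k·t), the constants, and the quadratic surrogate loss are all assumptions for illustration only.

```python
import numpy as np

# Toy sketch: gradient descent on f(w) = 0.5 * ||w||^2 with an
# exponentially decaying (monotone) training rate, in the spirit of the
# adaptive-rate BP variants discussed above.
def grad(w):
    return w                           # gradient of 0.5 * ||w||^2

eta0, k = 0.5, 0.01                    # assumed initial rate and decay constant
w = np.array([2.0, -3.0])
losses = []
for t in range(100):
    eta = eta0 * np.exp(-k * t)        # exponential schedule eta_t
    w = w - eta * grad(w)
    losses.append(0.5 * float(w @ w))

print(f"loss after 100 steps: {losses[-1]:.3e}")
```

A large early rate speeds up the initial descent, while the exponential decay shrinks the steps as training approaches a minimum, which is the stability-versus-speed trade-off these adaptive-rate schemes target.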
“…Many recent studies have attempted to solve the slow training required for the SBP algorithm through adaptation of parameters such as the training rate, which is controlled by the weight adjustment along with the descent direction [6]. Gong [7] proposed a novel algorithm of the neural network NBPNN based on a self-adaptive learning factor. Those algorithms were tested on XOR 2-bit or two dimensionally.…”
Section: Introduction
confidence: 99%
“…The back propagation algorithm for the Iris data problem uses the generalized delta rule including the momentum term to ensure better performance [13], [14], [15].…”
Section: Neural Back Propagation
confidence: 99%
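The generalized delta rule with a momentum term mentioned in this last citation, Δw(t) = −η∇E + αΔw(t−1), can be sketched on a single sigmoid unit. Since the Iris data itself is not reproduced here, a tiny synthetic two-class dataset stands in for it, and η, α, and the epoch count are illustrative assumptions.

```python
import numpy as np

# Sketch of the generalized delta rule with momentum on one sigmoid unit:
#   dw(t) = -eta * grad(E) + alpha * dw(t-1)
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 0.5, (20, 2)),    # class 0 cluster
               rng.normal(3, 0.5, (20, 2))])   # class 1 cluster
y = np.array([0.0] * 20 + [1.0] * 20)
Xb = np.hstack([X, np.ones((40, 1))])          # append a bias column

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

eta, alpha = 0.5, 0.9                          # assumed rate and momentum
w = np.zeros(3)
dw_prev = np.zeros(3)
for _ in range(200):
    o = sigmoid(Xb @ w)
    g = Xb.T @ ((o - y) * o * (1 - o)) / len(y)   # MSE gradient (delta rule)
    dw = -eta * g + alpha * dw_prev               # add the momentum term
    w += dw
    dw_prev = dw

acc = float(np.mean((sigmoid(Xb @ w) > 0.5) == (y == 1)))
print(f"training accuracy: {acc:.2f}")
```

The momentum term accumulates a running average of past updates, so consistent gradient directions are amplified and oscillating ones partially cancel, which is why it tends to improve both speed and stability over the plain delta rule.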