2008
DOI: 10.1080/00949650701496172
Accelerating backpropagation using effective parameters at each step and an experimental evaluation

Abstract: An acceleration of the backpropagation algorithm with momentum (BPM) is introduced. At every stage of the learning process, a local quadratic approximation of the error function is performed and the Hessian matrix of the quadratic function is approximated. An effective learning rate and momentum factor are determined from the maximum and minimum eigenvalues of the approximated Hessian matrix at each step. The BPM algorithm is modified so as to work automatically with these effective parameters. Performance of this new a…
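The abstract does not give the paper's exact formulas, but choosing a step size and momentum factor from the extreme Hessian eigenvalues is the classical heavy-ball tuning for a quadratic: with λ_min and λ_max known, η = 4/(√λ_max + √λ_min)² and μ = ((√λ_max − √λ_min)/(√λ_max + √λ_min))². The sketch below illustrates that idea on a toy quadratic error surface; the matrix `H`, the eigenvalues, and the update loop are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy quadratic error surface E(w) = 0.5 * w^T H w, where H plays the role
# of the locally approximated Hessian described in the abstract.
eigvals = np.array([0.5, 2.0, 10.0])        # lambda_min = 0.5, lambda_max = 10
Q, _ = np.linalg.qr(rng.standard_normal((3, 3)))
H = Q @ np.diag(eigvals) @ Q.T

lam_min, lam_max = eigvals.min(), eigvals.max()

# Classical heavy-ball "effective parameters" from the extreme eigenvalues
# (a plausible reading of the abstract; the paper's formulas may differ).
eta = 4.0 / (np.sqrt(lam_max) + np.sqrt(lam_min)) ** 2
mu = ((np.sqrt(lam_max) - np.sqrt(lam_min)) /
      (np.sqrt(lam_max) + np.sqrt(lam_min))) ** 2

# Gradient descent with momentum using the effective parameters.
w = rng.standard_normal(3)
w_prev = w.copy()
for _ in range(200):
    grad = H @ w                             # gradient of the quadratic
    w, w_prev = w - eta * grad + mu * (w - w_prev), w

final_error = np.linalg.norm(w)              # distance to the minimizer at 0
```

With these eigenvalue-matched parameters the iterate contracts toward the minimizer at a rate governed by √(λ_max/λ_min) rather than λ_max/λ_min, which is the kind of speedup the abstract reports for multilayer network training.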

Cited by 1 publication (1 citation statement). References 15 publications (23 reference statements).
“…An application of this approach to a popular back-propagation algorithm in neural networks can be found in [7]. In this paper, training time of a multilayer neural network had been reduced significantly in various types of benchmark problems.…”
Section: Results (mentioning, confidence: 99%)