2014
DOI: 10.33899/csmj.2014.163730
Conjugate Gradient Algorithm Based on Aitken's Process for Training Neural Networks

Abstract: Conjugate gradient methods constitute excellent neural network training methods because of their simplicity, numerical efficiency, and very low memory requirements. It is well-known that the procedure of training a neural network is highly consistent with unconstrained optimization theory, and many attempts have been made to speed up this process. In particular, various algorithms motivated by numerical optimization theory have been applied to accelerate neural network training. In this paper, we prop…
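The paper's title refers to Aitken's process, i.e. the classical Aitken Δ² extrapolation for accelerating a linearly convergent sequence. As a minimal, generic sketch of that process (not the paper's specific training algorithm, whose details are elided above):

```python
def aitken_delta2(seq):
    """Aitken's Delta-squared extrapolation.

    From three consecutive iterates x_n, x_{n+1}, x_{n+2} of a linearly
    convergent sequence, form the accelerated value
        x_n - (x_{n+1} - x_n)^2 / (x_{n+2} - 2*x_{n+1} + x_n),
    which typically converges faster to the same limit.
    """
    out = []
    for x0, x1, x2 in zip(seq, seq[1:], seq[2:]):
        denom = x2 - 2.0 * x1 + x0
        if abs(denom) < 1e-15:
            # Second difference vanished: sequence has (numerically) converged.
            out.append(x2)
        else:
            out.append(x0 - (x1 - x0) ** 2 / denom)
    return out
```

For example, applying it to the slowly convergent fixed-point iteration x_{n+1} = cos(x_n) yields iterates much closer to the fixed point ≈ 0.739085 than the raw sequence.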

Cited by 2 publications (3 citation statements). References 18 publications.
“…Abbo and Mohammed in [6] suggested a new CG algorithm (NA) based on Aitken's process; in this section we try to generalize the method to the more general form known as scaled conjugate gradient methods. Consider the search direction of the form:…”
Section: 1. New Scaled CG Method (say N1SCG)
confidence: 99%
“…Consider the N1SCG method, where the learning rate α_k satisfies the standard Wolfe conditions, equations (6) and (7), and if…”
Section: Theorem (3.1)
confidence: 99%
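The theorem excerpt above assumes the step length satisfies the standard (weak) Wolfe conditions. The excerpt's own equations (6) and (7) are not reproduced here, but the standard conditions they refer to can be checked as follows (a generic sketch; `f`, `grad`, and the constants are assumptions, not the paper's notation):

```python
import numpy as np

def satisfies_wolfe(f, grad, x, d, alpha, c1=1e-4, c2=0.9):
    """Check the standard (weak) Wolfe conditions for step length alpha
    along a descent direction d:

      (i)  f(x + alpha*d) <= f(x) + c1*alpha * grad(x)^T d   (sufficient decrease)
      (ii) grad(x + alpha*d)^T d >= c2 * grad(x)^T d          (curvature)

    with 0 < c1 < c2 < 1.
    """
    gtd = grad(x) @ d
    armijo = f(x + alpha * d) <= f(x) + c1 * alpha * gtd
    curvature = grad(x + alpha * d) @ d >= c2 * gtd
    return bool(armijo and curvature)
```

On the quadratic f(x) = ½‖x‖² with d = −∇f(x), the exact line minimizer α = 1 satisfies both conditions, while a tiny step such as α = 1e−6 fails the curvature condition — which is exactly why the curvature condition is imposed: it rules out uselessly short steps.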