2020
DOI: 10.30534/ijeter/2020/25822020
A New Spectral Conjugate Gradient Method with Strong Wolfe-Powell Line Search

Abstract: The spectral conjugate gradient method is an efficient method for solving unconstrained optimization problems. In this paper, based on the MMAR conjugate gradient method, a new spectral conjugate gradient method, SMMAR, is proposed with a strong Wolfe-Powell line search. The method possesses the sufficient descent property and global convergence. Numerical results show that the SMMAR method outperforms the MMAR conjugate gradient method in terms of the number of iterations on almost all tested functions. But the MMAR method outper…
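The abstract describes a spectral conjugate gradient iteration paired with a strong Wolfe-Powell line search. As a hedged sketch only (the MMAR/SMMAR formulas are not given in this excerpt, so the spectral parameter below is a Barzilai-Borwein-style choice and the conjugacy parameter is the classical PRP formula, both stand-ins for illustration):

```python
import numpy as np

def strong_wolfe(f, grad, x, d, c1=1e-4, c2=0.1, alpha=1.0, max_iter=50):
    """Crude bracketing search for a step satisfying the strong Wolfe-Powell
    conditions: f(x+a d) <= f(x) + c1*a*g'd  and  |g(x+a d)'d| <= c2*|g'd|."""
    phi0, dphi0 = f(x), grad(x) @ d
    lo, hi = 0.0, np.inf
    for _ in range(max_iter):
        fx = f(x + alpha * d)
        if fx > phi0 + c1 * alpha * dphi0:
            hi = alpha                      # sufficient-decrease fails: step too long
        else:
            dphi = grad(x + alpha * d) @ d
            if abs(dphi) <= c2 * abs(dphi0):
                return alpha                # both strong Wolfe conditions hold
            if dphi < 0:
                lo = alpha                  # slope still negative: step too short
            else:
                hi = alpha
        alpha = 2 * alpha if hi == np.inf else 0.5 * (lo + hi)
    return alpha                            # fallback after max_iter bisections

def spectral_cg(f, grad, x0, tol=1e-6, max_iter=500):
    """Generic spectral CG: d_k = -theta_k*g_k + beta_k*d_{k-1}.
    theta_k and beta_k here are illustrative stand-ins, NOT the SMMAR formulas."""
    x = np.asarray(x0, float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        a = strong_wolfe(f, grad, x, d)
        x_new = x + a * d
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        theta = (s @ s) / (s @ y) if s @ y > 1e-12 else 1.0  # spectral parameter
        beta = (g_new @ y) / (g @ g)                          # PRP beta (illustrative)
        d = -theta * g_new + beta * d
        if d @ g_new > 0:        # safeguard: restart if not a descent direction
            d = -g_new
        x, g = x_new, g_new
    return x

# Usage: minimize a strongly convex quadratic f(x) = 0.5 x'Ax - b'x
A = np.array([[3., 1.], [1., 2.]])
b = np.array([1., 1.])
xstar = spectral_cg(lambda x: 0.5 * x @ A @ x - b @ x,
                    lambda x: A @ x - b, np.zeros(2))
# xstar approximates np.linalg.solve(A, b), i.e. roughly [0.2, 0.4]
```

The descent safeguard (restarting with the steepest-descent direction) is one common way to keep such methods globally convergent when beta can go negative; the paper's own sufficient-descent argument for SMMAR is not reproduced here.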

Cited by 3 publications (2 citation statements)
References 29 publications
“…Hager 2, 4, 6, 10, 100 (3,3), (7,7), (9,9), (11,11) 15 Ex White & Holst 100, 1000, 10000 (5,5), (10,10), (20,20), (50,50) 6…”
Section: Application
confidence: 99%
“…From a computational point of view, the PRP formula gives better numerical results than the FR method, and [6] proved that the PRP method converges globally when the objective function is strongly convex and an exact line search is used. However, the global convergence of the PRP method under the SWP condition has yet to be established [10]. In fact, Powell [11], [12] gave counterexamples showing that, even with exact minimization, there exist functions for which the PRP method fails to converge.…”
Section: Introduction
confidence: 99%
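For context, the FR and PRP parameters compared in the excerpt above are the standard textbook definitions (not taken from this page):

```latex
\beta_k^{FR} = \frac{\|g_{k+1}\|^2}{\|g_k\|^2}, \qquad
\beta_k^{PRP} = \frac{g_{k+1}^{T}\,(g_{k+1} - g_k)}{\|g_k\|^2},
\qquad g_k = \nabla f(x_k).
```

When consecutive gradients are nearly equal, $\beta_k^{PRP} \approx 0$, so PRP implicitly restarts along the steepest-descent direction; this built-in restart is the usual explanation for the better practical behavior the quote mentions, even though its global convergence under the strong Wolfe-Powell condition remains open.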