2016
DOI: 10.1186/s13660-016-1049-5

An efficient modification of the Hestenes-Stiefel nonlinear conjugate gradient method with restart property

Abstract: The conjugate gradient (CG) method is one of the most popular methods to solve nonlinear unconstrained optimization problems. The Hestenes-Stiefel (HS) CG formula is considered one of the most efficient methods developed in this century. In addition, the HS coefficient is related to the conjugacy condition regardless of the line search method used. However, the HS parameter may not satisfy the global convergence properties of the CG method with the Wolfe-Powell line search if the descent condition is not satis…
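For context, the Hestenes-Stiefel update and the conjugacy property the abstract alludes to can be written in their standard textbook form (this is the classical formulation, not the modified coefficient proposed in the paper):

```latex
% Nonlinear CG iterates x_{k+1} = x_k + \alpha_k d_k with search direction
\[
  d_{k+1} = -g_{k+1} + \beta_k^{\mathrm{HS}} d_k, \qquad
  \beta_k^{\mathrm{HS}} = \frac{g_{k+1}^{\top} y_k}{d_k^{\top} y_k}, \qquad
  y_k = g_{k+1} - g_k,
\]
% where g_k = \nabla f(x_k). Substituting \beta_k^{HS} gives
% d_{k+1}^{\top} y_k = -g_{k+1}^{\top} y_k + \beta_k^{HS}\, d_k^{\top} y_k = 0,
% i.e. the conjugacy condition holds independently of the step length \alpha_k,
% which is the line-search-independence noted in the abstract.
```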

Cited by 15 publications (14 citation statements); references 14 publications.

Citation statements (ordered by relevance):
“…The preconditioned nonlinear conjugate gradient method for micromagnetic energy minimization is summarized in Algorithm 4: compute g_{j+1} = ∇F(x_{j+1}), solve P_{j+1} y_{j+1} = g_{j+1}, compute the search direction d_{j+1} = −y_{j+1} + β*_{ZA} d_j, end for. Algorithm 4 is the Hestenes-Stiefel conjugate gradient method with restarts [39] according to…”
Section: Micromagnetics Energy Minimization (mentioning)
confidence: 99%
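To make the restart idea concrete, below is a minimal, unpreconditioned sketch of a Hestenes-Stiefel CG loop with a restart safeguard. The backtracking line search, the restart test, and all names are illustrative assumptions; it does not reproduce the cited micromagnetics algorithm (which also applies a preconditioner P) or the exact method of the paper.

```python
import numpy as np

def hs_cg_with_restart(f, grad, x0, tol=1e-6, max_iter=500):
    """Hestenes-Stiefel nonlinear CG with a simple restart safeguard (sketch)."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                   # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        # Crude Armijo backtracking step (a placeholder for the Wolfe-Powell
        # line search used in the convergence analysis).
        alpha, c1, rho = 1.0, 1e-4, 0.5
        while f(x + alpha * d) > f(x) + c1 * alpha * (g @ d) and alpha > 1e-12:
            alpha *= rho
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g
        denom = d @ y
        # Hestenes-Stiefel coefficient: beta = g_{k+1}^T y_k / (d_k^T y_k).
        beta = (g_new @ y) / denom if abs(denom) > 1e-12 else 0.0
        d_new = -g_new + beta * d
        # Restart with steepest descent whenever the new direction fails
        # to be a descent direction.
        if d_new @ g_new >= 0.0:
            d_new = -g_new
        x, g, d = x_new, g_new, d_new
    return x

# Example: minimize the 2-D Rosenbrock function.
if __name__ == "__main__":
    rosen = lambda z: (1 - z[0]) ** 2 + 100 * (z[1] - z[0] ** 2) ** 2
    rosen_grad = lambda z: np.array([
        -2 * (1 - z[0]) - 400 * z[0] * (z[1] - z[0] ** 2),
        200 * (z[1] - z[0] ** 2),
    ])
    print(hs_cg_with_restart(rosen, rosen_grad, [-1.2, 1.0]))
```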
“…The new scale parameter φ satisfies the sufficient descent condition, and global convergence is proved under the strong Wolfe line search conditions. Our numerical results show that the proposed method is effective and robust against some known algorithms. […] search, known as the line searches [2]. Among them, the so-called strong Wolfe line search conditions require that [3] [4]…”
(mentioning)
confidence: 97%
“…search, known as the line searches [2]. Among them, the so-called strong Wolfe line search conditions require that [3] [4]…”
(mentioning)
confidence: 99%
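The two excerpts above leave the conditions themselves unstated. In their standard textbook form (assumed here rather than copied from references [3] and [4]), the strong Wolfe conditions on a step length α_k along a descent direction d_k read:

```latex
% Strong Wolfe conditions, with constants 0 < \delta < \sigma < 1:
\begin{align*}
  f(x_k + \alpha_k d_k) &\le f(x_k) + \delta\, \alpha_k\, g_k^{\top} d_k, \\
  \bigl| g(x_k + \alpha_k d_k)^{\top} d_k \bigr| &\le \sigma\, \bigl| g_k^{\top} d_k \bigr|,
\end{align*}
% where g_k = \nabla f(x_k); the first inequality enforces sufficient decrease,
% the second bounds the directional derivative at the trial point.
```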
“…Therefore, the PRP method is the most efficient when compared with the other conjugate gradient methods. For more, the reader can see the following references [14][15][16][17][18][19]…”
Section: Abstract and Applied Analysis (mentioning)
confidence: 99%
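For comparison with the Hestenes-Stiefel coefficient given after the abstract, the Polak-Ribière-Polyak (PRP) coefficient referred to in this excerpt is, in its standard form:

```latex
\[
  \beta_k^{\mathrm{PRP}} = \frac{g_{k+1}^{\top}\,(g_{k+1} - g_k)}{\lVert g_k \rVert^{2}},
\]
% same numerator as \beta_k^{HS}; only the denominator differs
% (\lVert g_k \rVert^{2} instead of d_k^{\top} y_k).
```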