Efficient hybrid conjugate gradient techniques
1990
DOI: 10.1007/bf00939455

Cited by 215 publications (85 citation statements)
References 4 publications

“…Many authors have presented other choices for the scalar β_k, for example Buckley and Lenir [2], Daniel [13], Gilbert and Nocedal [16], Qi et al. [28], Shanno [29], and Touati-Ahmed and Storey [30]. Observing that the formulae (1.6)-(1.9), (1.12) and (2.1) share two numerators and three denominators, we can use combinations of these numerators and denominators to obtain the following three-parameter family:…”
Section: A Three-Parameter Family of Conjugate Gradient Methods (mentioning)
Confidence: 99%
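
The “two numerators and three denominators” in this excerpt are those of the classical choices of β_k. The LaTeX block below is an illustrative reconstruction based on the standard formulas; the equation numbers (1.6)-(1.9), (1.12) and (2.1) belong to the citing paper and are not reproduced here, and the parametrization of the family shown is a sketch, not necessarily the exact form used there.

```latex
% Illustrative reconstruction: with y_k = g_{k+1} - g_k, the six classical
% beta_k choices share the numerators ||g_{k+1}||^2 and g_{k+1}^T y_k and the
% denominators ||g_k||^2, d_k^T y_k and -d_k^T g_k.
\[
\beta_k^{FR}=\frac{\|g_{k+1}\|^2}{\|g_k\|^2},\quad
\beta_k^{PRP}=\frac{g_{k+1}^{T}y_k}{\|g_k\|^2},\quad
\beta_k^{HS}=\frac{g_{k+1}^{T}y_k}{d_k^{T}y_k},\quad
\beta_k^{DY}=\frac{\|g_{k+1}\|^2}{d_k^{T}y_k},\quad
\beta_k^{CD}=\frac{\|g_{k+1}\|^2}{-d_k^{T}g_k},\quad
\beta_k^{LS}=\frac{g_{k+1}^{T}y_k}{-d_k^{T}g_k}.
\]
% One natural three-parameter family mixing these numerators and denominators
% (a sketch under the above assumptions, not the citing paper's exact family):
\[
\beta_k(\lambda,\mu_1,\mu_2)=
\frac{\lambda\,\|g_{k+1}\|^2+(1-\lambda)\,g_{k+1}^{T}y_k}
     {\mu_1\,\|g_k\|^2+\mu_2\,d_k^{T}y_k-(1-\mu_1-\mu_2)\,d_k^{T}g_k},
\qquad \lambda,\mu_1,\mu_2\in[0,1].
\]
```
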
“…Hybrid conjugate gradient algorithms using projections: hybrid Dai-Yuan [23], Gilbert and Nocedal [28], Hu and Storey [34], Touati-Ahmed and Storey [50], hybrid Liu and Storey [36]; and hybrid conjugate gradient algorithms using the concept of convex combination of classical schemes: convex combination of Hestenes-Stiefel and Dai-Yuan with Newton direction [3,4,8], convex combination of Polak-Ribière-Polyak and Dai-Yuan with conjugacy condition [7]. Scaled BFGS preconditioned conjugate gradient algorithms by Shanno [47,48], Birgin and Martínez [18] and Andrei [2,9].…”
Section: Classical Conjugate Gradient Algorithms (mentioning)
Confidence: 99%
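
The projection-type hybrids listed above clip one classical β_k value against another, while the convex-combination hybrids blend two values with a weight. The Python sketch below illustrates both ideas in generic form; the function names, the max(0, min(β^PRP, β^FR)) clipping rule (a common rendering of the Touati-Ahmed/Storey and Hu/Storey style of hybridization), and the free weight theta are illustrative assumptions, not the precise rules of the cited papers.

```python
import numpy as np

def beta_fr(g_new, g_old):
    """Fletcher-Reeves coefficient."""
    return g_new @ g_new / (g_old @ g_old)

def beta_prp(g_new, g_old):
    """Polak-Ribiere-Polyak coefficient."""
    return g_new @ (g_new - g_old) / (g_old @ g_old)

def beta_hybrid_projection(g_new, g_old):
    """Projection-style hybrid: clip beta_PRP into [0, beta_FR].
    A sketch of the hybridization style; the cited papers state their own
    exact switching rules."""
    return max(0.0, min(beta_prp(g_new, g_old), beta_fr(g_new, g_old)))

def beta_hybrid_convex(g_new, g_old, theta):
    """Convex-combination hybrid: blend two classical coefficients with a
    weight theta in [0, 1] (how theta is chosen, e.g. from a conjugacy
    condition or a Newton direction, is specific to the cited papers)."""
    return theta * beta_prp(g_new, g_old) + (1.0 - theta) * beta_fr(g_new, g_old)

def next_direction(g_new, d_old, beta):
    """Standard nonlinear CG direction update: d_{k+1} = -g_{k+1} + beta_k * d_k."""
    return -g_new + beta * d_old
```
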
“…In the above algorithm, a back-tracking line search method is used, and the step length α_k is taken to be the first element of the sequence 1, 2^{-1}, ..., 2^{-i}, ... that satisfies a sufficient decrease condition; see, e.g., Nash and Sofer [48] for details. This hybrid nonlinear conjugate gradient method was first proposed by Touati-Ahmed and Storey [61]. It has been tested to perform well on many difficult numerical examples [24] in comparison with the Polak-Ribière-Polyak method, and it does not require the line search scheme to satisfy the strong Wolfe condition.…”
Section: Numerical Solutions (mentioning)
Confidence: 99%
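
The back-tracking rule described in this excerpt is a standard Armijo-type search: start from a unit step and halve it until the sufficient decrease condition holds. Below is a minimal Python sketch; the function name backtracking_step, the constant c, and the max_halvings cap are illustrative choices, not values taken from the cited works.

```python
import numpy as np

def backtracking_step(f, x, g, d, c=1e-4, max_halvings=50):
    """Return the first alpha in 1, 2^{-1}, ..., 2^{-i}, ... satisfying the
    Armijo sufficient decrease condition
        f(x + alpha*d) <= f(x) + c * alpha * g^T d.
    c and max_halvings are illustrative defaults, not from the cited papers."""
    fx = f(x)
    slope = g @ d          # directional derivative; negative for a descent direction d
    alpha = 1.0
    for _ in range(max_halvings):
        if f(x + alpha * d) <= fx + c * alpha * slope:
            return alpha
        alpha *= 0.5
    return alpha           # fall back to the smallest trial step

# Example use with the quadratic f(x) = 0.5 * ||x||^2, whose gradient is x:
f = lambda x: 0.5 * float(x @ x)
x = np.array([3.0, -4.0])
g = x.copy()               # gradient at x
d = -g                     # steepest-descent direction for the demo
print(backtracking_step(f, x, g, d))   # prints 1.0: the unit step already suffices
```
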