1964
DOI: 10.1093/comjnl/7.2.149
Function minimization by conjugate gradients

Cited by 4,203 publications (1,788 citation statements)
References 0 publications
“…Fletcher and Reeves generalized the procedure to nonquadratic functions yielding the non-linear conjugate gradients algorithm [13]. Here, only the function f (x) and its gradient ∇f (x) are available.…”
Section: The Linear and Non-linear Conjugate Gradient Algorithms
confidence: 99%
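The nonlinear conjugate gradient scheme described in the excerpt above can be sketched in a few lines; it needs only f (x) and ∇f (x), exactly as the excerpt notes. This is an illustrative NumPy sketch, not the original 1964 formulation: the backtracking Armijo line search, restart safeguard, and tolerances are assumptions added for a runnable example.

```python
import numpy as np

def fletcher_reeves(f, grad, x0, tol=1e-8, max_iter=1000):
    """Nonlinear conjugate gradient with the Fletcher-Reeves beta.

    Uses only the function f(x) and its gradient grad(x).
    The backtracking line search here is an illustrative choice.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g  # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        if g.dot(d) > 0:
            d = -g  # safeguard: restart if d is not a descent direction
        # backtracking line search satisfying the Armijo condition
        alpha, c = 1.0, 1e-4
        while f(x + alpha * d) > f(x) + c * alpha * g.dot(d):
            alpha *= 0.5
        x_new = x + alpha * d
        g_new = grad(x_new)
        # Fletcher-Reeves update: ratio of squared gradient norms
        beta = g_new.dot(g_new) / g.dot(g)
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x
```

On a convex quadratic f(x) = ½xᵀAx − bᵀx this reduces to the linear conjugate gradient iteration (up to line-search inexactness) and converges to the solution of Ax = b.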
“…We propose an algorithm that differs from it in three ways. Firstly, the heuristic updates are replaced by a standard conjugate gradient algorithm [11]. Secondly, the linearisation method from [7] is applied.…”
Section: Variational Bayesian Methods
confidence: 99%
“…In Theorem 4.3 of Smith (1993), the following result is stated for the Riemannian Fletcher & Reeves (1964) and Polak & Ribière (1969) conjugate gradient methods. Suppose Ŷ is a non-degenerate stationary point such that the Hessian at Ŷ is positive definite.…”
Section: Global Convergence
confidence: 99%
“…Our implementation of geometric optimisation over low-rank correlation matrices 'LRCM min' 5 is an adaptation of the 'SG min' template of Edelman & Lippert (2000) (written in MATLAB) for optimisation over the Stiefel and Grassmann manifolds. This template contains four distinct well-known non-linear optimisation algorithms adapted for geometric optimisation over Riemannian manifolds: Newton algorithm; dogleg step or Levenberg (1944) and Marquardt (1963) algorithm; Polak & Ribière (1969) conjugate gradient; and Fletcher & Reeves (1964) conjugate gradient.…”
Section: Where D * Can Be Obtained By Selecting At Most D Nonnegative
confidence: 99%
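The template in the excerpt above bundles both the Polak & Ribière and Fletcher & Reeves conjugate gradient variants; the two differ only in the scalar β used to combine the new gradient with the previous search direction. A minimal sketch of the two update formulas (NumPy; the helper names are illustrative):

```python
import numpy as np

def beta_fr(g_new, g_old):
    """Fletcher-Reeves beta: ratio of squared gradient norms."""
    return g_new.dot(g_new) / g_old.dot(g_old)

def beta_pr(g_new, g_old):
    """Polak-Ribiere beta: uses the gradient change g_new - g_old,
    so it vanishes (an automatic restart) when the gradient stalls."""
    return g_new.dot(g_new - g_old) / g_old.dot(g_old)
```

Either β is then used in the direction update d ← −g_new + β·d; on a quadratic with exact line search the two coincide, but on general nonlinear problems Polak-Ribière's built-in restart behaviour often differs in practice.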