1978
DOI: 10.1007/bfb0067703

A fast algorithm for nonlinearly constrained optimization calculations

Cited by 1,154 publications (634 citation statements)
References 7 publications
“…If Ĥ_{k−1} is positive semidefinite, the BFGS update with the above condition can be suitably damped such that Ĥ_k is also positive semidefinite; see, e.g., [29]. At each iteration of the SSP method, problem (30) is solved with H_k replaced by the matrix Ĥ_k that is obtained by the BFGS update of Ĥ_{k−1} from the previous SSP iteration.…”
Section: An SSP Method for Nonlinear Semidefinite Programs
confidence: 99%
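The damping safeguard mentioned in the citation above is commonly attributed to Powell's 1978 paper: when the curvature condition fails, the gradient difference is replaced by a convex combination with the predicted curvature so that the BFGS update stays positive definite. A minimal sketch, assuming the standard threshold 0.2 from the literature; `damped_bfgs_update` is an illustrative name, not code from the cited work:

```python
import numpy as np

def damped_bfgs_update(H, s, y, theta_min=0.2):
    """One damped BFGS update of the Hessian approximation H.

    Damping replaces y by r = theta*y + (1-theta)*H@s, chosen so that
    s @ r >= theta_min * (s @ H @ s) > 0, which keeps the updated
    matrix positive definite even when s @ y <= 0.
    """
    Hs = H @ s
    sHs = s @ Hs          # predicted curvature along s (positive if H is pd)
    sy = s @ y            # actual curvature along s (may be negative)
    if sy >= theta_min * sHs:
        theta = 1.0       # curvature condition holds: plain BFGS update
    else:
        theta = (1.0 - theta_min) * sHs / (sHs - sy)
    r = theta * y + (1.0 - theta) * Hs
    # standard BFGS formula with y replaced by the damped vector r
    return H - np.outer(Hs, Hs) / sHs + np.outer(r, r) / (s @ r)
```

With `theta = 1` this reduces to the ordinary BFGS update; the interpolation only activates on steps where the curvature condition `s @ y >= theta_min * s @ H @ s` fails.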
“…A proof of convergence for such modifications is the subject of current research; see, e.g., [10]. Since all the data enters in a continuous fashion in the preceding analysis, it follows that the SSP method with step size one is still locally superlinearly convergent if the matrices H_k in (29) are replaced by approximations Ĥ_k with ‖H_k − Ĥ_k‖ → 0.…”
Section: Theorem 3 (Assume That the Function…)
confidence: 99%
“…functional gradients are calculated numerically. The routine is based on the version of the conjugate gradient algorithm described in Powell (1977, 1978). The main advantage of the conjugate gradient technique is that it provides a fast rate of convergence without the storage of any matrices.…”
Section: Optimization
confidence: 99%
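The matrix-free property the citation refers to can be illustrated with a generic nonlinear conjugate gradient loop: only a handful of vectors are stored, never a Hessian or its approximation. This is a sketch using the Polak-Ribière+ formula with a simple Armijo backtracking line search, not Powell's exact 1977/1978 routine:

```python
import numpy as np

def conjugate_gradient_minimize(f, grad, x0, tol=1e-8, max_iter=500):
    """Minimize f by nonlinear conjugate gradients (Polak-Ribiere+).

    Storage is O(n): only the iterate, gradient, and search direction
    are kept -- no matrices, which is the advantage the text notes.
    """
    x = x0.astype(float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        if g @ d >= 0:          # safeguard: restart with steepest descent
            d = -g
        # backtracking line search enforcing the Armijo condition
        alpha, fx = 1.0, f(x)
        while f(x + alpha * d) > fx + 1e-4 * alpha * (g @ d):
            alpha *= 0.5
            if alpha < 1e-16:
                break
        x_new = x + alpha * d
        g_new = grad(x_new)
        # Polak-Ribiere beta, truncated at zero (automatic restart)
        beta = max(0.0, g_new @ (g_new - g) / (g @ g))
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x
```

The backtracking search here is a deliberate simplification; production routines of the era used more careful line searches satisfying curvature conditions, but the vector-only storage pattern is the same.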
“…Hence, the objective function (48) is an approximation of the original objective function (38). The equality constraints (49) are the collocation conditions (47). The inequalities (50) and the right inequality of (51) reflect the constraints (40) and (41) of the original problem.…”
Section: Breinbauer and P. Lory
confidence: 99%