2004
DOI: 10.1023/b:coap.0000044184.25410.39
The Superlinear Convergence of a Modified BFGS-Type Method for Unconstrained Optimization

Cited by 100 publications (64 citation statements)
References 12 publications
“…In fact, practical computation shows that this method outperforms the standard BFGS method (see [41,42] for details) on some test problems [30]. Furthermore, some theoretical advantages of the new quasi-Newton equation (2.6) can be seen from the following two theorems.…”
Section: 2 (mentioning)
confidence: 94%
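
For context, here is a hedged sketch of what the “new quasi-Newton equation (2.6)” looks like, in the modified-secant form commonly attributed to this paper; the notation (s_k = x_{k+1} − x_k, y_k = g_{k+1} − g_k, f_k = f(x_k)) is reconstructed from the surrounding literature, not quoted from the source:

\[
  B_{k+1} s_k = y_k^{*}, \qquad
  y_k^{*} = y_k + \frac{2\,(f_k - f_{k+1}) + (g_k + g_{k+1})^{T} s_k}{\|s_k\|^{2}}\, s_k .
\]

Unlike the classical secant condition B_{k+1} s_k = y_k, the modified vector y_k^* also carries function-value information, which is presumably where the theoretical advantages mentioned in the excerpt come from.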
“…It is not difficult to deduce that s_k^T y_k^{2*} > 0 holds for a uniformly convex function f (or see [42]). It is well known that the condition s_k^T y_k^{2*} > 0 ensures that the update matrix B_{k+1} from (2.8) inherits the positive definiteness of B_k.…”
Section: 2 (mentioning)
confidence: 99%
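
Below is a minimal Python sketch of the positive-definiteness property quoted above: if the curvature condition s^T y > 0 holds (with y the possibly modified difference vector, e.g. y_k^{2*}), a BFGS-type update of a positive-definite B stays positive definite; otherwise the update can simply be skipped. The function name and the skip tolerance are illustrative assumptions, not taken from the cited papers.

import numpy as np

def bfgs_update(B, s, y, eps=1e-10):
    # One BFGS-type matrix update:
    #   B+ = B - (B s)(B s)^T / (s^T B s) + y y^T / (s^T y)
    # If B is positive definite and s^T y > 0, then B+ is positive
    # definite as well -- the property discussed in the excerpt.
    sy = s @ y
    if sy <= eps * np.linalg.norm(s) * np.linalg.norm(y):
        return B  # curvature condition fails: keep B unchanged
    Bs = B @ s
    return B - np.outer(Bs, Bs) / (s @ Bs) + np.outer(y, y) / sy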
“…Numerical results reported in [8] suggest that improvements have been achieved. Along this line, Xiao et al. [19] proposed two further choices of β_k by introducing the two modified quasi-Newton secant equations developed in [22,18]. Numerical experiments suggest that their method performs better.…”
Section: Smooth Bound Constrained Optimization (mentioning)
confidence: 99%
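
The excerpt does not quote the β_k formulas themselves, so the following is only a plausible Dai–Liao-type template for building β_k from a modified secant equation; the modified difference vector \tilde{y}_k (standing in for the vectors of [22,18]) and the parameter t > 0 are assumptions for illustration:

\[
  \beta_k = \frac{g_{k+1}^{T} \tilde{y}_k - t\, g_{k+1}^{T} s_k}{d_k^{T} \tilde{y}_k}, \qquad t > 0,
\]

where replacing the usual y_k = g_{k+1} − g_k by \tilde{y}_k injects function-value information into the conjugate gradient direction d_{k+1} = −g_{k+1} + β_k d_k.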
“…The Broyden–Fletcher–Goldfarb–Shanno (BFGS) algorithm [68] (A11) is a classical optimization algorithm belonging to the class of approximate Newton methods. Along with its several variants [66,63,35,32], BFGS has been widely used in unconstrained optimization. The Method of Moving Asymptotes (MMA) algorithm (A24) by Svanberg [57] is based on locally approximating the gradient of the objective function and a quadratic penalty term.…”
Section: (A29) (mentioning)
confidence: 99%
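
As a concrete usage sketch of the classical BFGS algorithm described in this excerpt (illustrative only, unrelated to the surveyed papers), SciPy exposes BFGS through scipy.optimize.minimize:

import numpy as np
from scipy.optimize import minimize, rosen, rosen_der

# Minimize the Rosenbrock test function with classical BFGS.
x0 = np.array([-1.2, 1.0])
res = minimize(rosen, x0, method="BFGS", jac=rosen_der)
print(res.x)    # approaches the minimizer (1, 1)
print(res.nit)  # number of quasi-Newton iterations taken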