2008
DOI: 10.1016/j.cam.2007.08.017

The global and superlinear convergence of a new nonmonotone MBFGS algorithm on convex objective functions

Abstract: In this paper, a new nonmonotone MBFGS algorithm for unconstrained optimization is proposed. Under suitable assumptions, the global and superlinear convergence of the new nonmonotone MBFGS algorithm on convex objective functions is established. Numerical experiments show that the new nonmonotone MBFGS algorithm is competitive with the MBFGS algorithm and the nonmonotone BFGS algorithm.
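
The abstract gives no pseudocode, so the following is a minimal Python sketch of the general algorithm family it describes: a BFGS method whose Armijo line search is made nonmonotone by measuring sufficient decrease against the maximum of the last M function values (GLL-style). The names f, grad, the memory length M, the Armijo constant c1, and the backtracking factor 0.5 are illustrative assumptions, not values from the paper, and the sketch omits the paper's specific MBFGS modification of the secant pair.

```python
import numpy as np

def nonmonotone_bfgs(f, grad, x0, M=10, c1=1e-4, tol=1e-6, max_iter=500):
    """Minimal sketch: BFGS with a GLL-style nonmonotone Armijo line search.

    Illustrative only -- this shows the general algorithm family, not the
    paper's exact MBFGS scheme; M, c1, and the backtracking factor 0.5
    are assumed values.
    """
    n = x0.size
    x = x0.astype(float)
    g = grad(x)
    H = np.eye(n)            # inverse Hessian approximation
    f_hist = [f(x)]          # recent function values for the nonmonotone test

    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        d = -H @ g           # quasi-Newton search direction

        # Nonmonotone Armijo rule: sufficient decrease is checked against
        # the maximum of the last M function values, so moderate increases
        # in f can be accepted.
        f_ref = max(f_hist[-M:])
        alpha = 1.0
        while alpha > 1e-12 and f(x + alpha * d) > f_ref + c1 * alpha * (g @ d):
            alpha *= 0.5

        s = alpha * d
        x_new = x + s
        g_new = grad(x_new)
        y = g_new - g

        # Standard BFGS update of the inverse Hessian, skipped when the
        # curvature condition s^T y > 0 fails (a common safeguard).
        sy = s @ y
        if sy > 1e-10:
            rho = 1.0 / sy
            I = np.eye(n)
            H = (I - rho * np.outer(s, y)) @ H @ (I - rho * np.outer(y, s)) \
                + rho * np.outer(s, s)

        x, g = x_new, g_new
        f_hist.append(f(x))

    return x
```

For instance, nonmonotone_bfgs(lambda x: float(x @ x), lambda x: 2.0 * x, np.ones(5)) recovers the minimizer of a simple convex quadratic at the origin.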


Cited by 6 publications (7 citation statements)
References 13 publications
“…We also tested our method on a large number of problems from the CUTEr library, which indicates that our method is promising. Recently, [16] has proven superlinear convergence under the Wolfe condition. Although we conjecture that superlinear convergence also holds under the Armijo line search condition, we cannot yet give a complete proof of it.…”
Section: Discussion (mentioning)
confidence: 98%
“…Their numerical experiments showed that this method was competitive with the standard BFGS algorithm. More recently, Liu, Yao, and Wei (see [16]) also introduced a modified nonmonotone BFGS algorithm based on a modified secant condition. Unfortunately, the global convergence result for this method depends on the convexity assumption on the objective function.…”
Section: Introduction (mentioning)
confidence: 98%
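
The "modified secant condition" mentioned in the excerpt above is sketched here only in a generic, hedged form: one widely used variant replaces y_k = g_{k+1} - g_k by y_k + t_k s_k, with t_k chosen so that the curvature condition s_k^T y_k > 0 always holds. The function name safeguarded_y and the parameter eps below are illustrative, and the exact modification used in [16] may differ.

```python
import numpy as np

def safeguarded_y(s, y, eps=1e-8):
    """Generic modified-secant safeguard (illustrative; it may differ in
    detail from the specific MBFGS modification cited above).

    Returns y + t * s with t = max(0, -(y.s)/||s||^2) + eps, which forces
    s^T y_new >= eps * ||s||^2 > 0 and so keeps the BFGS update positive
    definite even when the objective is nonconvex.
    """
    ss = s @ s
    t = max(0.0, -(y @ s) / ss) + eps
    return y + t * s
```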
“…Proof. The result follows from the proof of Lemma 3.1 in [17]. Lemma 3. Let the sequence {x_k} be generated by Algorithm 1.…”
Section: Global Convergence (mentioning)
confidence: 97%
“…The nonmonotone BFGS method was first studied by Liu et al. in [15]. Subsequently, two other nonmonotone BFGS methods were proposed for solving problem (1) in [12,16]. Note that the convergence analysis in all of these algorithms was carried out under a convexity assumption on the objective function.…”
Section: Introduction (mentioning)
confidence: 99%