2017
DOI: 10.1186/s13660-017-1453-5

A modified nonmonotone BFGS algorithm for unconstrained optimization

Abstract: In this paper, a modified BFGS algorithm is proposed for unconstrained optimization. The proposed algorithm has the following properties: (i) a nonmonotone line search technique is used to obtain the step size to improve the effectiveness of the algorithm; (ii) the algorithm possesses not only global convergence but also superlinear convergence for generally convex functions; (iii) the algorithm produces better numerical results than those of the normal BFGS method.
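The abstract describes the method only at a high level. As a rough illustration of how a nonmonotone line search is typically combined with a BFGS update, here is a minimal NumPy sketch; the memory-based acceptance rule (Grippo–Lampariello–Lucidi style, memory length M) and all parameter values are illustrative assumptions, not the paper's exact procedure or modified update.

```python
import numpy as np

def nonmonotone_bfgs(f, grad, x0, M=10, delta=1e-4, tol=1e-6, max_iter=500):
    """Minimal BFGS loop with a memory-based (nonmonotone) Armijo test.

    A trial step is accepted when f(x + a*d) does not exceed the maximum of
    the last M stored function values plus delta * a * g.T @ d.
    """
    n = x0.size
    x = x0.astype(float)
    H = np.eye(n)                      # approximation of the inverse Hessian
    g = grad(x)
    history = [f(x)]                   # recent f-values used by the nonmonotone test
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        d = -H @ g                     # quasi-Newton search direction
        f_ref = max(history[-M:])      # nonmonotone reference value
        a = 1.0
        while f(x + a * d) > f_ref + delta * a * (g @ d) and a > 1e-12:
            a *= 0.5                   # plain backtracking on the step size
        s = a * d
        x_new = x + s
        g_new = grad(x_new)
        y = g_new - g
        sy = s @ y
        if sy > 1e-10:                 # curvature condition: keeps H positive definite
            rho = 1.0 / sy
            V = np.eye(n) - rho * np.outer(s, y)
            H = V @ H @ V.T + rho * np.outer(s, s)   # standard inverse BFGS update
        x, g = x_new, g_new
        history.append(f(x))
    return x

# Example use on the Rosenbrock function.
rosen = lambda x: (1 - x[0])**2 + 100.0 * (x[1] - x[0]**2)**2
rosen_grad = lambda x: np.array([
    -2.0 * (1 - x[0]) - 400.0 * x[0] * (x[1] - x[0]**2),
    200.0 * (x[1] - x[0]**2),
])
x_star = nonmonotone_bfgs(rosen, rosen_grad, np.array([-1.2, 1.0]))
```

Taking the maximum over the last M function values is what makes the rule nonmonotone: an individual step may increase f as long as it stays below the worst recent iterate, which is the general idea behind property (i) in the abstract.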

Cited by 9 publications (3 citation statements)
References 63 publications
“…Through this formula, the researchers proved that the modified symmetric matrix is positive definite [10]. Case 2: if s_k^T y_k > 0, in this case we can say with certainty that the BFGS update matrix is symmetric and positive definite when applied within this formula (in other words, when applying the inequality s_k^T y_k > 0 in the formula h_{k,max} ≥ 0) [11].…”
Section: A New Scalar Formula for the Parameter θ_k^new (mentioning)
confidence: 97%
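For context, the role of the curvature condition s_k^T y_k > 0 mentioned in the statement above can be seen from the textbook (direct) BFGS update; this is the standard form, not necessarily the exact modified formula of the cited papers:

```latex
B_{k+1} \;=\; B_k \;-\; \frac{B_k s_k s_k^{\mathsf T} B_k}{s_k^{\mathsf T} B_k s_k}
\;+\; \frac{y_k y_k^{\mathsf T}}{s_k^{\mathsf T} y_k},
\qquad s_k = x_{k+1}-x_k,\quad y_k = g_{k+1}-g_k .
```

If B_k is symmetric positive definite and s_k^T y_k > 0, then B_{k+1} remains symmetric positive definite, which is why modified schemes are designed to enforce this inequality.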
“…Unconstrained optimization is the subject of many papers and has a wide variety of applications (see, for example, [1][2][3][4][5]). Nevertheless, the existing numerical methods that solve the general unconstrained optimization problem using derivatives up to second order have a very low convergence rate on degenerate problems [6][7][8][9][10][11][12][13][14][15][16][17], since increasing the convergence rate requires derivatives of order greater than two [6,7]. At the same time, using third- and fourth-order derivatives makes a numerical method very time-consuming.…”
Section: Introduction (mentioning)
confidence: 99%
“…However, the method converges only for uniformly convex functions. Li et al. [25] proposed a new BFGS algorithm with a modified secant equation that achieves both global and superlinear convergence for generally convex functions under the nonmonotone line search of [19]. Su and Rong [26] introduced and analyzed a new spectral CG method and its implementation under a modified nonmonotone line search technique.…”
Section: Introduction (mentioning)
confidence: 99%