2008
DOI: 10.3934/jimo.2008.4.299

New adaptive stepsize selections in gradient methods

Abstract: This paper deals with gradient methods for minimizing n-dimensional strictly convex quadratic functions. Two new adaptive stepsize selection rules are presented and some key properties are proved. Practical insights on the effectiveness of the proposed techniques are given by a numerical comparison with the Barzilai-Borwein (BB) method, the cyclic/adaptive BB methods and two recent monotone gradient methods.
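For context, here is a minimal NumPy sketch of the classical Barzilai-Borwein (BB1) method on a strictly convex quadratic f(x) = 0.5·xᵀAx − bᵀx. This is the baseline the paper compares against, not the paper's two new stepsize rules (the abstract does not spell them out), and the safeguards a production solver would need are omitted.

```python
import numpy as np

def bb_gradient(A, b, x0, tol=1e-8, max_iter=1000):
    """Barzilai-Borwein (BB1) gradient method for the strictly convex
    quadratic f(x) = 0.5*x'Ax - b'x with A symmetric positive definite."""
    x = np.asarray(x0, dtype=float)
    g = A @ x - b                      # gradient of the quadratic
    alpha = 1.0 / np.linalg.norm(g)    # simple first stepsize (no s, y yet)
    for k in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        x_new = x - alpha * g
        g_new = A @ x_new - b
        s, y = x_new - x, g_new - g
        alpha = (s @ s) / (s @ y)      # BB1 stepsize; s'y = s'As > 0 since A is SPD
        x, g = x_new, g_new
    return x, k

# Small usage example on a random SPD system.
rng = np.random.default_rng(0)
Q = rng.standard_normal((50, 50))
A = Q @ Q.T + 50.0 * np.eye(50)
b = rng.standard_normal(50)
x_star, iters = bb_gradient(A, b, np.zeros(50))
print(iters, np.linalg.norm(A @ x_star - b))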

Cited by 126 publications (157 citation statements). References 21 publications (37 reference statements).
“…In other words, the BB stepsize will approximate all the reciprocals of eigenvalues during the iteration process. Similar observations have been presented in [23]. Thus, for quadratic problem (1) we may consider the following two variants of (18), which…”
Section: (27)
mentioning confidence: 89%
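The quoted observation, that on a quadratic the BB stepsizes tend to visit the reciprocals of the Hessian eigenvalues, is easy to probe numerically. A small sketch follows (diagonal Hessian so the eigenvalues are explicit; the starting point and the 40-iteration budget are arbitrary choices):

```python
import numpy as np

# Diagonal Hessian so the eigenvalues are explicit.
eigs = np.array([1.0, 5.0, 25.0, 100.0])
A, b = np.diag(eigs), np.zeros(4)
x = np.ones(4)

g = A @ x - b
alpha = 1.0 / np.linalg.norm(g)
steps = []
for _ in range(40):
    if np.linalg.norm(g) < 1e-12:
        break
    x_new = x - alpha * g
    g_new = A @ x_new - b
    s, y = x_new - x, g_new - g
    alpha = (s @ s) / (s @ y)          # reciprocal Rayleigh quotient of A,
    steps.append(alpha)                # hence always in [1/100, 1/1]
    x, g = x_new, g_new

# Sorting the recorded stepsizes shows values spread over [1/100, 1], many of
# them near the reciprocals 1/100, 1/25, 1/5 and 1, in line with the quote.
print(np.sort(steps))
```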
“…Two variants of gradient projection methods combined with BB stepsizes are also proposed. Our numerical comparisons with the DY (6), ABBmin2 [23] and SDC (7) methods on minimizing quadratic functions indicate that the proposed strategies and methods are very effective. Moreover, our numerical comparisons with the spectral projected gradient (SPG) method [3,4] on solving bound constrained optimization problems from the CUTEst collection [26] also strongly suggest the potential benefits of extending the strategies and methods in the paper to more general large-scale bound constrained optimization.…”
Section: Introduction
mentioning confidence: 93%
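This excerpt pairs BB stepsizes with gradient projection for bound constraints. As a rough illustration of the building blocks only, not the cited variants (which add stepsize safeguards and nonmonotone line searches; the function names here are hypothetical), one projected BB step on a box could look like:

```python
import numpy as np

def project_box(x, lo, hi):
    """Euclidean projection onto the box {x : lo <= x <= hi}."""
    return np.clip(x, lo, hi)

def projected_bb_step(A, b, x, x_prev, g_prev, lo, hi):
    """One projected-gradient step with a BB1 stepsize for
    min 0.5*x'Ax - b'x subject to lo <= x <= hi.
    Sketch only: no line search or stepsize safeguards."""
    g = A @ x - b
    s, y = x - x_prev, g - g_prev
    # Fall back to a norm-based stepsize if the BB quotient is unusable.
    alpha = (s @ s) / (s @ y) if s @ y > 0 else 1.0 / np.linalg.norm(g)
    return project_box(x - alpha * g, lo, hi), g
```

Chaining such steps, carrying (x_prev, g_prev) forward from the previous iterate, gives a bare-bones projected BB iteration.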