2017
DOI: 10.1007/s11075-017-0365-2

An efficient gradient method with approximate optimal stepsize for large-scale unconstrained optimization

Cited by 30 publications (18 citation statements)
References 24 publications
“…Liu et al. [19] proposed the GM_AOS 1, GM_AOS 2, and GM_AOS 3 algorithms, with GM_AOS 2 performing slightly better than the others. When the quadratic model is considered, the algorithm developed in [18] is identical to the GM_AOS 1 algorithm. In a certain sense, our algorithm can be viewed as an extension of the SCG algorithm [5] and a modification of the DY algorithm [8].…”
Section: Numerical Results
Mentioning confidence: 99%
“…Recently, Liu et al. [18,19] introduced approximate optimal stepsizes (α_k^{AOS}) for the gradient method. They constructed a quadratic approximation model of…”
Section: Introduction
Mentioning confidence: 99%
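
For context, the one-dimensional quadratic approximation model typically used in this line of work has the form below; this is a sketch of the standard construction, and the particular Hessian approximation B_k chosen in [18,19] may differ:

\[
\varphi_k(\alpha) \;=\; f(x_k) \;-\; \alpha\,\|g_k\|^2 \;+\; \tfrac{1}{2}\,\alpha^2\, g_k^{\top} B_k\, g_k,
\qquad
\alpha_k^{AOS} \;=\; \arg\min_{\alpha>0}\varphi_k(\alpha) \;=\; \frac{\|g_k\|^2}{g_k^{\top} B_k g_k}
\quad\text{(when } g_k^{\top} B_k g_k > 0\text{)},
\]

where g_k = ∇f(x_k) and B_k approximates the Hessian of f at x_k.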
“…In [53], a new concept of approximate optimal stepsize for the gradient method is introduced and used to interpret the BB method, and an efficient gradient method with approximate optimal stepsize for unconstrained optimization is presented. The next definition is introduced in [53]. The approximate optimal stepsize differs from the exact steepest descent stepsize, whose computation would incur an expensive computational cost.…”
Section: Algorithm 125 (Barzilai-Borwein Gradient Method, i.e., BB)
Mentioning confidence: 99%
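
The interpretation of the BB method mentioned above can be made concrete: under the common scalar Hessian approximation, the approximate optimal stepsize of the quadratic model reduces exactly to the first BB stepsize. This derivation is a sketch consistent with the quotation, not text from [53]:

\[
B_k \;=\; \frac{s_{k-1}^{\top} y_{k-1}}{s_{k-1}^{\top} s_{k-1}}\, I
\quad\Longrightarrow\quad
\alpha_k^{AOS} \;=\; \frac{\|g_k\|^2}{g_k^{\top} B_k g_k}
\;=\; \frac{s_{k-1}^{\top} s_{k-1}}{s_{k-1}^{\top} y_{k-1}}
\;=\; \alpha_k^{BB1},
\]

with s_{k-1} = x_k − x_{k−1} and y_{k-1} = g_k − g_{k−1}.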
“…We see from the definition of the approximately optimal stepsize that the numerical performance of a gradient method with approximately optimal stepsizes depends heavily on the approximation model φ_k(α). Several gradient methods with approximately optimal stepsizes [31–34] were later proposed for unconstrained optimization, and the numerical results in [31–34] suggest that these methods are surprisingly efficient.…”
Section: Introduction
Mentioning confidence: 99%
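
Since the whole scheme hinges on the choice of φ_k(α), a minimal sketch helps show where the model enters. The code below is an illustrative implementation, not the algorithm of [31–34]; the function names (gm_aos, quadratic_model_stepsize) are hypothetical, and the stepsize callback is the only piece that changes when a different approximation model is used.

```python
import numpy as np

def gm_aos(grad, x0, model_stepsize, tol=1e-6, max_iter=1000):
    """Gradient method x_{k+1} = x_k - alpha_k * g_k, where alpha_k is
    the (approximate) minimizer of a pluggable model phi_k(alpha)."""
    x, x_prev, g_prev = np.asarray(x0, float), None, None
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        alpha = model_stepsize(x, g, x_prev, g_prev)
        x_prev, g_prev = x, g
        x = x - alpha * g
    return x

def quadratic_model_stepsize(x, g, x_prev, g_prev):
    """Approximately optimal stepsize from a quadratic model with the
    scalar Hessian approximation B_k = (s'y / s's) I, i.e. the BB1 step."""
    if x_prev is None:
        return 1e-3                       # conservative first step
    s, y = x - x_prev, g - g_prev
    sy = s @ y
    if sy <= 1e-12 * (s @ s):             # safeguard for non-convex regions
        return 1e-3
    return (s @ s) / sy                   # alpha_k^{BB1}
```

Swapping quadratic_model_stepsize for a conic- or tensor-model variant changes only this callback, which is exactly the dependence on φ_k(α) that the quotation highlights.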
“…Among the gradient methods with approximately optimal stepsizes [31–34], the one based on the conic model [31] has attracted some attention [35] owing to its good numerical performance. In this paper, we present an improved gradient method with approximately optimal stepsize based on the conic model for unconstrained optimization.…”
Section: Introduction
Mentioning confidence: 99%
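
For reference, a representative one-dimensional conic approximation along −g_k looks as follows; the horizon parameter b_k here is illustrative, and the exact model in [31] may differ in its parameterization:

\[
\varphi_k(\alpha) \;=\; f(x_k) \;-\; \frac{\alpha\,\|g_k\|^2}{1 - b_k\alpha}
\;+\; \frac{\alpha^2\, g_k^{\top} B_k g_k}{2\,(1 - b_k\alpha)^2},
\qquad 1 - b_k\alpha > 0,
\]

which reduces to the quadratic model when b_k = 0; its minimizer over α > 0 serves as the approximately optimal stepsize.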