2015
DOI: 10.1155/2015/283679
A Transformation of Accelerated Double Step Size Method for Unconstrained Optimization

Abstract: A reduction of the original double step size iteration to a single step length scheme is derived under a proposed condition that relates the two step lengths in the accelerated double step size gradient descent scheme. The proposed transformation is numerically tested. The obtained results confirm substantial progress over the single step size accelerated gradient descent method defined in the classical way with respect to all analyzed characteristics: number of iterations, CPU time, and number of function evaluations.
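The abstract does not spell out the condition itself; as a purely illustrative sketch (the iteration form x+ = x − (α/γ + β)·g and the coupling function psi below are assumptions, not the paper's derivation), tying the second step length β to the first one collapses the update into a single effective step length along the negative gradient:

```python
import numpy as np

def double_step_update(x, g, gamma, alpha, beta):
    """Generic double step size update: x+ = x - (alpha/gamma + beta) * g.
    This particular iteration form is an assumption for illustration."""
    return x - (alpha / gamma + beta) * g

def transformed_update(x, g, gamma, alpha, psi):
    """Once beta is tied to alpha by some relation beta = psi(alpha),
    the update uses a single effective step length theta along -g."""
    theta = alpha / gamma + psi(alpha)
    return x - theta * g

# Example: with a hypothetical coupling beta = alpha**2, both updates agree.
x, g = np.array([1.0, 2.0]), np.array([0.5, -0.5])
assert np.allclose(double_step_update(x, g, 2.0, 0.1, 0.01),
                   transformed_update(x, g, 2.0, 0.1, lambda a: a * a))
```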

Cited by 14 publications (22 citation statements). References 8 publications.
“…A common way to determine this parameter is through the features of the second-order Taylor series taken on the appropriate scheme (6). Acceleration parameters computed in this way are applied in the methods described in [1][2][3][4][5]. According to the iteration form (6), we can conclude that the accelerated gradient methods are of the quasi-Newton type, in which the approximation of the Hessian, i.e., of its inverse, is obtained by the scalar matrix γ_k I, where I is the appropriate identity matrix and γ_k = γ(x_k, x_{k−1}) is the matching acceleration parameter.…”
Section: Preliminaries: Accelerated Gradient Methods and Hybrid Iterations
confidence: 99%
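In code, this quasi-Newton reading is trivial to state: with B_k = γ_k·I as the Hessian approximation, applying B_k⁻¹ to the gradient is just a scalar division. A minimal sketch, assuming the common iteration x_{k+1} = x_k − t_k·γ_k⁻¹·g_k (the function name is illustrative):

```python
import numpy as np

def accelerated_gd_step(x, g, gamma, t):
    """One step of x+ = x - t * (1/gamma) * g. The Hessian approximation
    B = gamma * I is never stored: B^{-1} g reduces to g / gamma."""
    return x - (t / gamma) * g
```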
“…where η > 0 is a constant. The methods presented in [4,6,9,10] can be classified as methods of quasi-Newton type with accelerated approximation of the Hessian inverse, equipped with the line search technique. As in [9], we refer to these methods simply as accelerated gradient descent algorithms with line search.…”
Section: Some Conjugate Gradient Methods Calculate the Vector Direction
confidence: 99%
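The line search referenced above is typically an Armijo-type backtracking rule; the exact variant used in [4,6,9,10] may differ, so the following is a generic sketch:

```python
import numpy as np

def armijo_backtracking(f, x, g, d, sigma=1e-4, rho=0.5, t0=1.0):
    """Shrink t until the sufficient decrease condition
    f(x + t*d) <= f(x) + sigma * t * g^T d holds along direction d."""
    t, fx, slope = t0, f(x), g.dot(d)
    while f(x + t * d) > fx + sigma * t * slope:
        t *= rho
    return t

# Example on f(x) = ||x||^2, stepping along the negative gradient.
f = lambda x: x.dot(x)
x = np.array([1.0, -2.0])
g = 2.0 * x
t = armijo_backtracking(f, x, g, -g)
```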
“…This so-called acceleration parameter is calculated from the second-order Taylor series of the relevant iteration at two successive points. So, unlike the model (1.5), which defines the vector direction of conjugate gradient methods, in accelerated gradient descent methods [4,6,9,10] the vector direction is the product of the negative gradient vector and a derived acceleration parameter.…”
Section: Some Conjugate Gradient Methods Calculate the Vector Direction
confidence: 99%
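Concretely, writing the second-order Taylor model f(x_{k+1}) ≈ f(x_k) + g_k^T d_k + (γ/2)·‖d_k‖² at two successive points and solving for the scalar curvature γ gives the acceleration parameter. A sketch under that assumption (the methods cited here add their own safeguards and scaling):

```python
import numpy as np

def acceleration_parameter(f_new, f_old, g_old, d):
    """Solve f_new ~= f_old + g_old^T d + 0.5 * gamma * ||d||^2 for gamma,
    where d = x_new - x_old is the step actually taken."""
    gamma = 2.0 * (f_new - f_old - g_old.dot(d)) / d.dot(d)
    # Non-positive gamma means the quadratic model shows no positive
    # curvature along d; a common safeguard is resetting gamma to 1.
    return gamma if gamma > 0.0 else 1.0
```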
“…There are several iterative methods, each defined in a specific way, that are relevant for this work. Some of them are presented in the articles (Andrei, 2006), (Stanimirović et al., 2010), (Petrović et al., 2014), (Stanimirović et al.
Section: Theoretical Part
confidence: 99%