2019
DOI: 10.5937/univtho9-18174
Comparative performance analysis of some accelerated and hybrid accelerated gradient models

Abstract: We analyze the performance profiles of several accelerated and hybrid accelerated gradient methods. All compared methods are at least linearly convergent and exhibit satisfactory numerical characteristics with respect to the tested metrics: number of iterations, CPU time, and number of function evaluations. Among the chosen set of methods, we show numerically which one is the most efficient and which is the most effective. From this we draw a conclusion about which type of method is preferable with respect to the analyzed metrics.
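The comparison described in the abstract is typically visualized with a performance profile in the sense of Dolan and Moré: for each solver one plots the fraction of test problems it solves within a factor tau of the best solver's cost. A minimal sketch follows; the solver names and cost values are hypothetical, not the paper's data.

```python
def performance_profile(metric):
    """Dolan-More performance profile.
    metric: dict solver -> list of per-problem costs (e.g. CPU times or
    iteration counts); float('inf') marks a failure on that problem."""
    n = len(next(iter(metric.values())))
    # Best (smallest) cost achieved on each problem by any solver.
    best = [min(metric[s][p] for s in metric) for p in range(n)]
    # Performance ratios r_{p,s} = t_{p,s} / min_s t_{p,s}.
    ratios = {s: [metric[s][p] / best[p] for p in range(n)] for s in metric}

    def rho(solver, tau):
        """Fraction of problems 'solver' handles within a factor tau of the best."""
        return sum(1 for r in ratios[solver] if r <= tau) / n

    return rho

# Hypothetical data: costs of two solvers "A" and "B" on three problems.
rho = performance_profile({"A": [1.0, 2.0, 4.0],
                           "B": [2.0, 2.0, 2.0]})
```

Here rho(s, 1) measures efficiency (how often solver s is the single best), while rho(s, tau) for large tau measures robustness (how many problems it solves at all), which matches the paper's distinction between "most efficient" and "most effective".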

Cited by 3 publications (4 citation statements)
References 5 publications
“…The usual way of computing this very important element is through Taylor's expansion applied to the posed scheme. This way of determining the accelerated parameter has been confirmed as a good choice [13]. We highlight here three accelerated-parameter expressions used in the efficient accelerated double step size model, i.e.…”
Section: Algorithm (mentioning)
confidence: 69%
“…The derived function is then extended to construct the unconstrained optimization function. Based on the above discussion, it is evident that parabolic relations exist between the regression parameters u0, u1, u2 of the regression function (20) and the data x_j and values y_j. min…”
Section: Numerical Experiments (mentioning)
confidence: 94%
“…Besides the CG method, the class of accelerated gradient descent schemes of quasi-Newton type also contains very efficient and robust methods that can be considered for solving optimization problems. Highlights of the accelerated parameters can be seen in other studies [18][19][20][21][22]. However, in this paper we restrict the discussion to the CG method.…”
Section: Introduction (mentioning)
confidence: 99%
“…Here are some expressions of the accelerated factors defined in the accelerated gradient models mentioned above. These accelerated parameters are also listed in [6]:…”
Section: Accelerated Double Direction and Double Step Size Methods Overview (mentioning)
confidence: 99%