2021
DOI: 10.1186/s13660-021-02675-y

On convergence and complexity analysis of an accelerated forward–backward algorithm with linesearch technique for convex minimization problems and applications to data prediction and classification

Abstract: In this work, we introduce a new accelerated algorithm using a linesearch technique for solving convex minimization problems in the form of a sum of two lower semicontinuous convex functions. Weak convergence of the proposed algorithm is established without assuming Lipschitz continuity of the gradient of the objective function. Moreover, the complexity of this algorithm is also analyzed. Some numerical experiments in machine learning, namely regression and classification problems, are also discussed. Furt…
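The abstract describes a forward–backward scheme whose stepsize is chosen by a linesearch rather than from a global Lipschitz constant of the gradient. Below is a minimal Python sketch of that general technique, not the authors' algorithm; the backtracking test, the parameters δ and θ, and the LASSO example are assumptions for illustration.

```python
import numpy as np

# One forward-backward step with a backtracking linesearch on the stepsize,
# so no global Lipschitz constant of grad_f is required.  delta in (0, 1/2)
# and theta in (0, 1) are the usual linesearch parameters (assumed here);
# prox_g is the proximal operator of the nonsmooth term g.

def fb_linesearch_step(x, grad_f, prox_g, lam0=1.0, delta=0.4, theta=0.5):
    gx = grad_f(x)
    lam = lam0
    while True:
        y = prox_g(x - lam * gx, lam)  # forward-backward trial point
        # Accept lam once a local Lipschitz-type test holds at (x, y).
        if lam * np.linalg.norm(grad_f(y) - gx) <= delta * np.linalg.norm(y - x):
            return y, lam
        lam *= theta  # shrink the stepsize and retry

# Illustration: LASSO, f(x) = 0.5*||Ax - b||^2 and g(x) = mu*||x||_1.
rng = np.random.default_rng(0)
A, b, mu = rng.standard_normal((20, 50)), rng.standard_normal(20), 0.1
grad_f = lambda x: A.T @ (A @ x - b)
prox_g = lambda v, lam: np.sign(v) * np.maximum(np.abs(v) - lam * mu, 0.0)  # soft threshold

x = np.zeros(50)
for _ in range(100):
    x, _ = fb_linesearch_step(x, grad_f, prox_g)
```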

Cited by 6 publications (4 citation statements). References 26 publications (39 reference statements).
“…where N is the number of iterations at which we stop, and σ_n = σ_n for algorithm (6). Sigmoid is set as the activation function with M = 160 hidden nodes, and four evaluation metrics for each algorithm are shown in Table 4.…”
Section: Application to Data Classification Problem (citation type: mentioning; confidence: 99%)
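The excerpt above refers to a classifier with a sigmoid activation, M = 160 hidden nodes, and four evaluation metrics. A hedged sketch of such a setup follows; the excerpt does not name the metrics, so accuracy, precision, recall, and F1 are assumed, and all helper names are hypothetical.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def random_hidden_layer(X, M=160, seed=0):
    """Random-feature hidden layer with M sigmoid nodes: H = sigmoid(X W + b)."""
    rng = np.random.default_rng(seed)
    W = rng.standard_normal((X.shape[1], M))
    b = rng.standard_normal(M)
    return sigmoid(X @ W + b)

def four_metrics(y_true, y_pred):
    """Accuracy, precision, recall, F1 for binary labels in {0, 1} (assumed metrics)."""
    tp = np.sum((y_true == 1) & (y_pred == 1))
    tn = np.sum((y_true == 0) & (y_pred == 0))
    fp = np.sum((y_true == 0) & (y_pred == 1))
    fn = np.sum((y_true == 1) & (y_pred == 0))
    acc = (tp + tn) / max(tp + tn + fp + fn, 1)
    prec = tp / max(tp + fp, 1)
    rec = tp / max(tp + fn, 1)
    f1 = 2 * prec * rec / max(prec + rec, 1e-12)
    return acc, prec, rec, f1

# The output weights of such a network can then be fitted by solving a convex
# problem of the form f + g, e.g. with the forward-backward sketch above.
```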
“…The weak convergence of the algorithm {ω_n} generated by (6) was proved under conditions on the extrapolation factor (8) and the stepsize parameter λ.…”
Section: Introduction (citation type: mentioning; confidence: 99%)
“…For instance, we could utilize an inertial step, first introduced by Polyak [13], to solve smooth convex minimization problems. Since then, several works have included an inertial step in their algorithms to accelerate convergence; see [14–19] for examples.…”
Section: Introduction (citation type: mentioning; confidence: 99%)
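The inertial step attributed to Polyak [13] extrapolates along the difference of two consecutive iterates before the base update is applied. A minimal sketch, assuming a generic inertial parameter beta_n (concrete choices vary across [14–19]):

```python
# Polyak-style inertial (extrapolation) step:
#   y_n = x_n + beta_n * (x_n - x_{n-1})
# The extrapolated point y_n is then fed to the base update, e.g. the
# forward-backward step sketched earlier:
#   x_{n+1} = prox_{lam g}(y_n - lam * grad_f(y_n))

def inertial_step(x_curr, x_prev, beta_n):
    return x_curr + beta_n * (x_curr - x_prev)
```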