2020
DOI: 10.1137/19m130769x
Contracting Proximal Methods for Smooth Convex Optimization

Cited by 19 publications (18 citation statements)
References 14 publications
“…This step is optional. Moreover, looking at algorithm (6), one may think that we are forgetting the points x_{k+1} when the function value is increasing:…”
Section: Conceptual Contracting-Point Method I
mentioning, confidence: 99%
“…Motivation: In recent years, we can see an increasing interest in new frameworks for the derivation and justification of different methods for Convex Optimization, provided with a worst-case complexity analysis (see, for example, [3,4,6,11,14,15,18,20-22]). It appears that the accelerated proximal tensor methods [2,20] can be naturally explained through the framework of high-order proximal-point schemes [21], which require the solution of a nontrivial auxiliary problem at every iteration.…”
Section: Introduction
mentioning, confidence: 99%
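To make the high-order proximal-point framework mentioned in the excerpt above concrete, here is a minimal sketch of the auxiliary subproblem such schemes solve at every iteration (the notation is assumed for illustration and not quoted from [21]: p is the order of the scheme, H > 0 a regularization parameter, and the minimization is typically carried out only approximately):

x_{k+1} \approx \arg\min_{x} \left\{ f(x) + \frac{H}{p+1}\,\|x - x_k\|^{p+1} \right\}

For p = 1 this reduces to the classical proximal-point step; for p >= 2 the subproblem is no longer trivial and is usually tackled with tensor methods, which is the difficulty the excerpt alludes to.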
“…In [3], it is proved that the convergence rate of the objective function values is o(k^{-2}) and that the iterates {x_k} converge to a minimizer of this problem when α > 3. Recently, Doikov and Nesterov [18] presented a new accelerated algorithm for solving such problems, in which they used high-order tensor methods to solve the inner subproblems and gave a complexity estimate under some assumptions. For the case of a Lipschitz continuously differentiable but nonconvex loss function, Wen, Chen and Pong [35] proved that the iterates and objective function values generated by the proximal gradient algorithm with extrapolation are R-linearly convergent under an error bound condition.…”
Section: Introduction
mentioning, confidence: 99%
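The proximal gradient algorithm with extrapolation discussed in this excerpt can be illustrated with a short sketch. The Python code below is a generic FISTA-type iteration with inertial weight (k-1)/(k+α-1) for α > 3, applied to a toy l1-regularized least-squares problem; the problem data, parameter values, and function names are illustrative assumptions and are not taken from [3], [18], or [35].

import numpy as np

def soft_threshold(v, tau):
    # Proximal operator of tau * ||.||_1 (componentwise soft-thresholding).
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def prox_grad_extrapolation(A, b, lam, alpha=4.0, n_iter=500):
    # Minimize 0.5 * ||A x - b||^2 + lam * ||x||_1 with an inertial
    # (extrapolated) proximal gradient iteration, weight (k-1)/(k+alpha-1).
    L = np.linalg.norm(A, 2) ** 2         # Lipschitz constant of the smooth gradient
    step = 1.0 / L
    x = np.zeros(A.shape[1])
    x_prev = x.copy()
    for k in range(1, n_iter + 1):
        beta = (k - 1) / (k + alpha - 1)  # extrapolation weight, alpha > 3
        y = x + beta * (x - x_prev)       # extrapolated point
        grad = A.T @ (A @ y - b)          # gradient of the smooth part at y
        x_prev, x = x, soft_threshold(y - step * grad, step * lam)
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.standard_normal((40, 100))
    x_true = np.zeros(100)
    x_true[:5] = 1.0
    b = A @ x_true
    x_hat = prox_grad_extrapolation(A, b, lam=0.1)
    obj = 0.5 * np.linalg.norm(A @ x_hat - b) ** 2 + 0.1 * np.abs(x_hat).sum()
    print("final objective:", obj)

Under convexity and α > 3, iterations of this form attain the o(k^{-2}) rate in function values mentioned above; the R-linear convergence result of Wen, Chen and Pong concerns the nonconvex setting under an error bound condition.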