2022
DOI: 10.48550/arxiv.2203.02204
Preprint

Sharper Bounds for Proximal Gradient Algorithms with Errors

Abstract: We analyse the convergence of the proximal gradient algorithm for convex composite problems in the presence of gradient and proximal computational inaccuracies. We derive new, tighter deterministic and probabilistic bounds, which we use to verify simulated (MPC) and synthetic (LASSO) optimization problems solved on a reduced-precision machine in combination with an inaccurate proximal operator. We also show how the probabilistic bounds are more robust for algorithm verification and more accurate for applicati…
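The setting the abstract describes can be illustrated with a minimal sketch of proximal gradient descent for LASSO in which both the gradient and the proximal step are perturbed. This is not the paper's implementation; the error model (bounded Gaussian perturbations) and all names below are illustrative assumptions.

```python
import numpy as np

def soft_threshold(v, tau):
    # Proximal operator of tau * ||.||_1 (soft-thresholding).
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def inexact_prox_grad_lasso(A, b, lam, n_iter=500,
                            grad_err=1e-6, prox_err=1e-6, seed=0):
    """Proximal gradient for min_x 0.5*||Ax - b||^2 + lam*||x||_1,
    with small random perturbations injected into the gradient and
    the prox output to model inexact computations."""
    rng = np.random.default_rng(seed)
    m, n = A.shape
    x = np.zeros(n)
    L = np.linalg.norm(A, 2) ** 2      # Lipschitz constant of the gradient
    step = 1.0 / L
    for _ in range(n_iter):
        g = A.T @ (A @ x - b)                          # exact gradient
        g = g + grad_err * rng.standard_normal(n)      # inexact gradient
        x = soft_threshold(x - step * g, step * lam)   # prox step
        x = x + prox_err * rng.standard_normal(n)      # inexact prox output
    return x
```

With small error magnitudes the iterates still approach the LASSO solution, and the residual suboptimality is driven by the injected errors — the quantity the paper's bounds characterize.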

Cited by 2 publications (10 citation statements)
References 24 publications
“…Although our global convergence is guaranteed by the inexact PG step, existing analyses for inexact PG that utilize the geometry of the iterates, such as those in [15,47,24,21], are not applicable, because the additional MF phase can move the iterates arbitrarily in the level set. Therefore, another contribution of this work is developing new proof techniques for obtaining a global convergence guarantee for general nonmonotone inexact PG combined with other optimization steps.…”
Section: Discussion
confidence: 99%
“…This feature, combined with the MF phase, makes the analysis difficult. Existing analyses for inexact PG [15,47,24,21] utilize telescoping sums of inequalities of the form…”
Section: Convergence of Inexact Methods
confidence: 99%
“…Contributions. We improve upon our previous work [22,24] and establish convergence bounds on the objective function values of approximate proximal-gradient descent (AxPGD), approximate accelerated proximal-gradient descent (AxAPGD) and generalized proximal ADMM (AxWLM-ADMM) schemes. We consider approximate gradient and approximate proximal computations with errors that exhibit light-tailed, rare extreme-event behaviour, and we quantify their contribution to the final residual suboptimality in the objective function value.…”
Section: Applications
confidence: 92%
“…In our previous work [24], we derived sharper probabilistic bounds that do not depend on the iteration counter k, which makes them more realistic assertions for testing convergence. The bounds we derived are given below for the basic approximate proximal-gradient descent (2.3)…”
Section: Related Work
confidence: 99%