2021
DOI: 10.48550/arxiv.2105.13271
Preprint

OpReg-Boost: Learning to Accelerate Online Algorithms with Operator Regression

Abstract: This paper presents a new regularization approach, termed OpReg-Boost, to boost the convergence and lessen the asymptotic error of online optimization and learning algorithms. In particular, the paper considers online algorithms for optimization problems with a time-varying (weakly) convex composite cost. For a given online algorithm, OpReg-Boost learns the closest algorithmic map that yields linear convergence; to this end, the learning procedure hinges on the concept of operator regression. We show how to fo…
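The abstract is truncated above, but the core mechanism it describes — taking a given algorithmic map T and regressing it onto a nearby map that is contractive, so that iterating the learned map converges linearly — can be sketched in a few lines. The snippet below is only an illustrative simplification, not the paper's operator-regression solver: it fits an affine map to sampled pairs (x, T(x)) and clips its singular values to a chosen contraction factor. The function name fit_contractive_map, the factor zeta, and the toy quadratic problem are all assumptions made for this example.

```python
import numpy as np

def fit_contractive_map(X, TX, zeta=0.95):
    """Fit an affine map x -> A x + b to samples (x_i, T(x_i)), then clip the
    singular values of A to zeta < 1 so the fitted map is a contraction.
    (Illustrative stand-in for operator regression, not the paper's solver.)"""
    n, d = X.shape
    X_aug = np.hstack([X, np.ones((n, 1))])          # augmented design matrix
    W, *_ = np.linalg.lstsq(X_aug, TX, rcond=None)   # W has shape (d+1, d)
    A, b = W[:d].T, W[d]
    U, s, Vt = np.linalg.svd(A)
    A = U @ np.diag(np.minimum(s, zeta)) @ Vt        # enforce ||A||_2 <= zeta
    return A, b

# Toy example: a quadratic cost 0.5 x'Qx whose gradient step uses too large a step size.
rng = np.random.default_rng(0)
d = 5
Q = rng.standard_normal((d, d))
Q = Q.T @ Q + 0.1 * np.eye(d)
step = 2.2 / np.linalg.eigvalsh(Q).max()             # step > 2/L, so plain gradient descent diverges
T = lambda x: x - step * (Q @ x)                     # original (non-contractive) algorithmic map

X = rng.standard_normal((200, d))                    # points at which T is queried
A, b = fit_contractive_map(X, np.array([T(x) for x in X]))

x = rng.standard_normal(d)
for _ in range(200):                                 # iterate the learned contractive map
    x = A @ x + b
print("gradient norm at learned fixed point:", np.linalg.norm(Q @ x))
```

In this toy run the original gradient map diverges because the step size exceeds 2/L, while the learned contraction still converges to the same fixed point (the minimizer of the quadratic). The paper's actual method targets composite, time-varying costs and enforces contractiveness through operator regression rather than the spectral-norm projection used here.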

Cited by 1 publication (1 citation statement)
References 42 publications (73 reference statements)
“…(Li and Malik, 2016; Finn et al., 2017; Wichrowska et al., 2017; Andrychowicz et al., 2016; Metz et al., 2019, 2021; Gregor and LeCun, 2010), focuses on learning better solutions to parameter learning problems that arise for machine learning tasks. Bastianello et al. (2021) approximates the fixed-point iteration with the closest contractive fixed-point iteration.…”
Section: Related Work
Mentioning, confidence: 99%