Robust Accelerated Gradient Methods for Smooth Strongly Convex Functions
2020 · DOI: 10.1137/19m1244925

Cited by 36 publications (41 citation statements) · References 15 publications

“…We formalize this problem in Section 4 and propose an adapted H₂-performance measure. It has already been recognized in different settings [26, 2] that H₂-performance is a measure for the robustness properties of optimization algorithms.…”
Section: Problem Formulation
Citation type: mentioning
confidence: 99%
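
As a concrete illustration of the H₂ view, the following is a minimal sketch of our own (not code from [26] or [2]; the system matrices and the names h2_norm_squared, alpha, lam are hypothetical). For a linear recursion x_{k+1} = A x_k + B w_k with white-noise input w_k and performance output z_k = C x_k, the squared H₂ norm equals trace(C P C^T), where P solves the discrete Lyapunov equation P = A P A^T + B B^T; this is exactly the steady-state variance of z_k, which is what makes H₂ a robustness measure for noisy algorithms.

    import numpy as np
    from scipy.linalg import solve_discrete_lyapunov

    def h2_norm_squared(A, B, C):
        # Solve P = A P A^T + B B^T, then return trace(C P C^T):
        # the steady-state variance of z_k = C x_k under unit white noise.
        P = solve_discrete_lyapunov(A, B @ B.T)
        return float(np.trace(C @ P @ C.T))

    # Hypothetical example: gradient descent on f(x) = (lam/2) x^2 with
    # additive gradient noise, i.e., x_{k+1} = (1 - alpha*lam) x_k - alpha*w_k.
    alpha, lam = 0.1, 1.0
    A = np.array([[1.0 - alpha * lam]])
    B = np.array([[-alpha]])  # noise enters through the gradient, scaled by the stepsize
    C = np.array([[1.0]])     # performance output: the iterate error itself
    print(h2_norm_squared(A, B, C))  # matches alpha**2 / (1 - (1 - alpha*lam)**2)
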
“…Only recently, several works relying on this systems-theoretic view on optimization algorithms have been published [3, 17, 9, 38, 34, 23, 24, 10], partly also providing different approaches for convergence rate analysis using techniques from robust control. However, convergence rates are only one side of the coin; in several applications, e.g., in a data-based setting, robustness with respect to various kinds of disturbances is also a key issue, and similar analysis tools have been developed for it [2, 26]. In addition, the problem of designing algorithms specifically tailored to classes of structured optimization problems has only been touched upon so far [8, 11, 18].…”
Section: Introduction
Citation type: mentioning
confidence: 99%
“…Remark 1: The performance measure J in (6d) quantifies the steady-state variance of the iterates of first-order algorithms. Robustness of noisy algorithms can also be evaluated using alternative performance measures, e.g., the mean value of the error in the objective function [40], …”
Section: A. Influence of the Eigenvalues of the Hessian Matrix
Citation type: mentioning
confidence: 99%
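
To make the steady-state-variance reading of J concrete, here is a minimal simulation sketch of our own (not taken from [40]; the scalar quadratic and the values of alpha, lam, sigma are hypothetical). For noisy gradient descent on f(x) = (lam/2) x^2, the iterate variance settles at alpha^2 * sigma^2 / (1 - (1 - alpha*lam)^2), and an empirical estimate over many parallel runs reproduces that value.

    import numpy as np

    rng = np.random.default_rng(0)
    alpha, lam, sigma = 0.1, 1.0, 1.0
    rho = 1.0 - alpha * lam        # contraction factor of the noise-free iteration

    x = np.zeros(10_000)           # many independent runs, vectorized
    for _ in range(2_000):         # iterate long enough to reach stationarity
        noise = sigma * rng.standard_normal(x.shape)
        x = x - alpha * (lam * x + noise)   # noisy gradient step

    print(x.var())                               # empirical steady-state variance
    print(alpha**2 * sigma**2 / (1 - rho**2))    # closed-form value, ≈ 0.0526 here
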
“…This work builds on our recent conference papers [38], [39]. In a concurrent work [40], a similar approach was used to analyze the robustness of gradient descent and Nesterov's accelerated method. Therein, robustness was defined as […] goes to zero.…”
Section: Introduction
Citation type: mentioning
confidence: 99%
“…This allows SAPD to enjoy fast convergence with robust performance in the presence of stochastic gradient noise. Achieving systematic trade-offs between the rate and robustness has previously been studied in [2] in the context of accelerated methods for smooth strongly convex minimization problems. To our knowledge, our work is the first one that can trade off ρ with J in a systematic fashion in the context of primal-dual algorithms for solving the SP problem in (1.1).…”
Citation type: mentioning
confidence: 99%
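
The rate-versus-robustness tension that [2] makes systematic is already visible in the simplest setting. The sketch below is our own illustration (plain gradient descent on a scalar quadratic, not SAPD; all numbers hypothetical): shrinking the stepsize alpha drives the linear rate rho toward 1 (slower convergence) while the steady-state variance J shrinks (better robustness), so no single stepsize optimizes both.

    # Gradient descent on f(x) = (lam/2) x^2 with noisy gradients:
    # rate rho(alpha) = |1 - alpha*lam|, noise amplification
    # J(alpha) = alpha^2 / (1 - rho^2) (unit noise variance).
    lam = 1.0
    for alpha in (1.0, 0.5, 0.1, 0.01):
        rho = abs(1.0 - alpha * lam)
        J = alpha**2 / (1.0 - rho**2)
        print(f"alpha={alpha:5.2f}  rho={rho:4.2f}  J={J:.4f}")
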