2015
DOI: 10.1016/j.jco.2014.08.003
On lower complexity bounds for large-scale smooth convex optimization

Abstract: We derive lower bounds on the black-box oracle complexity of large-scale smooth convex minimization problems, with emphasis on minimizing convex functions with Hölder continuous gradient (with a given exponent and constant) over high-dimensional ‖·‖_p-balls, 1 ≤ p ≤ ∞. Our bounds turn out to be tight (up to factors logarithmic in the design dimension), and can be viewed as a substantial extension of the existing lower complexity bounds for large-scale convex minimization covering the nonsmooth case and…
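For context, the smoothness class the abstract refers to can be stated as follows. This is a standard formulation of Hölder continuity of the gradient, sketched in illustrative notation (the symbols L, κ, q here are not necessarily the paper's own):

```latex
% Hölder continuity of the gradient over the unit \|\cdot\|_p-ball
% (illustrative notation: L is the Hölder constant, \kappa the exponent,
%  and q the conjugate exponent of p)
\[
  \|\nabla f(x) - \nabla f(y)\|_{q} \;\le\; L\,\|x - y\|_{p}^{\kappa},
  \qquad \frac{1}{p} + \frac{1}{q} = 1,\quad 0 < \kappa \le 1 .
\]
% \kappa = 1 recovers the usual smooth (Lipschitz-gradient) case, while
% \kappa \to 0 approaches the nonsmooth (bounded-subgradient) setting,
% which is how the result extends the classical nonsmooth lower bounds.
```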






Cited by 46 publications (74 citation statements)
References 14 publications
“…When Q is an ℓ_p ball, with 1 ≤ p ≤ 2, we show that this complexity bound matches lower complexity bounds from Guzmán and Nemirovski (2015) for the large-scale regime (more precisely, n = Ω(1/ε²)).…”
supporting
confidence: 64%
“…Before doing this, it is worth mentioning that this optimality only holds for large-scale problems, namely where the dimension n is larger than the number of iterations T: if one can afford a superlinear (in dimension) number of iterations, methods such as the center of gravity or the ellipsoid method can achieve better complexity estimates (Nemirovskiǐ and Yudin, 1979). It was proved in Guzmán and Nemirovski (2015) that the class of problems (16), where f is convex and has an L_p-Lipschitz continuous gradient w.r.t. ‖·‖_p, satisfies the following lower bound on minimax risk:…”
Section: Choosing the Prox
mentioning
confidence: 99%
“…On the other hand, establishing lower bounds on the minimax risk requires a more involved analysis, as the bound needs to hold for any first-order method. Several approaches appear in the literature for establishing lower bounds, including resisting oracles [9], construction of a "worst-case" function [5,10], and reduction to statistical problems [1,11,13].…”
Section: Introduction
mentioning
confidence: 99%