2014
DOI: 10.1007/s11590-014-0795-x
Restricted strong convexity and its applications to convergence analysis of gradient-type methods in convex optimization

Abstract: A great deal of interest in solving large-scale convex optimization problems has recently turned to the gradient method and its variants. To ensure linear convergence rates, current theory regularly assumes that the objective functions are strongly convex. This paper goes beyond this traditional wisdom by studying a strictly weaker concept than strong convexity, named restricted strong convexity, which was recently proposed and is satisfied by a much broader class of functions. Utilizing the restricte…
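For orientation, one common formulation of the two conditions from the restricted-strong-convexity literature is sketched below; the moduli μ, ν and the projection operator P onto the solution set X* are our notation for illustration, not necessarily the paper's:

```latex
% Strong convexity (modulus \mu > 0): holds for all x, y,
\langle \nabla f(x) - \nabla f(y),\, x - y \rangle \ge \mu \, \|x - y\|^2 .

% Restricted strong convexity (modulus \nu > 0): required only relative to
% the solution set X^* of \min_x f(x), with \bar{x} = P_{X^*}(x):
\langle \nabla f(x) - \nabla f(\bar{x}),\, x - \bar{x} \rangle \ge \nu \, \|x - \bar{x}\|^2 .
```

The second inequality constrains f only along directions toward the solution set, which is why functions with flat directions (and hence no strong convexity) can still satisfy it.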

Cited by 36 publications (37 citation statements)
References 15 publications
“…Those sorts of relaxations were further exploited in [6,7] (relaxation of the strong convexity requirement, with motivational examples). We leave further investigations in that direction for future research.…”
Section: Upper Bounds On the Global Convergence Rates
confidence: 99%
“…Other related research trends feature linear convergence rates for the (proximal) gradient method under weaker assumptions than strong convexity, and linear convergence rates under inexact first-order information [5]. Among others, restricted strong convexity-type results are presented in [6,7], and convergence under the Polyak-Lojasiewicz condition were very recently presented in [8].…”
Section: Introduction
confidence: 99%
“…We finally utilize the following lemma to obtain an alternative form of the restricted strong convexity condition [69].…”
Section: Proof Of the Regularity Condition
confidence: 99%
“…In the next theorem, we show that (2.5) is also a sufficient condition for linear convergence (with σ = 0) of the stochastic gradient method for the class of restricted strongly convex function f . Restricted strong convexity is much weaker than strong convexity, some examples and properties of restricted strongly convex functions can be found in [17]. Note that if f is a strongly convex function, A is a linear mapping, then the composite function f • A is restricted strongly convex.…”
Section: )
confidence: 99%
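The last quoted statement notes that f ∘ A is restricted strongly convex when f is strongly convex and A is linear, even though f ∘ A need not be strongly convex. A minimal numerical sketch of this (our own illustration, with f the squared norm and a deliberately rank-deficient A, so the Hessian AᵀA is singular) shows plain gradient descent still decreasing the objective at a steady geometric rate:

```python
import numpy as np

rng = np.random.default_rng(0)
# Rank-deficient A: g(x) = 0.5*||Ax - b||^2 is convex but NOT strongly
# convex (its Hessian A^T A is singular since rank(A) = r < n), yet it
# is restricted strongly convex, so gradient descent converges linearly.
m, n, r = 20, 30, 10
A = rng.standard_normal((m, r)) @ rng.standard_normal((r, n))
b = A @ rng.standard_normal(n)        # consistent system: min value is 0

L = np.linalg.eigvalsh(A.T @ A).max() # Lipschitz constant of the gradient
x = np.zeros(n)
vals = []
for _ in range(200):
    grad = A.T @ (A @ x - b)
    x -= grad / L                     # constant step 1/L
    vals.append(0.5 * np.dot(A @ x - b, A @ x - b))

# Ratio of successive objective values over the early iterations:
# each is strictly below 1, consistent with a linear (geometric) rate.
ratios = [vals[k + 1] / vals[k] for k in range(10)]
print(min(ratios), max(ratios))
```

The point of the construction is that the null space of A leaves flat directions in the objective, ruling out strong convexity, while convergence of the function values remains geometric as the restricted-strong-convexity theory predicts.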