2013
DOI: 10.1137/110833786

An Accelerated Hybrid Proximal Extragradient Method for Convex Optimization and Its Implications to Second-Order Methods

Abstract: This paper presents an accelerated variant of the hybrid proximal extragradient (HPE) method for convex optimization, referred to as the accelerated HPE (A-HPE) framework. Iteration-complexity results are established for the A-HPE framework, as well as a special version of it, where a large stepsize condition is imposed. Two specific implementations of the A-HPE framework are described in the context of a structured convex optimization problem whose objective function consists of the sum of a smooth convex func…
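The structured problem referred to above is a composite objective f(x) = g(x) + h(x), with g smooth (Lipschitz-continuous gradient) and h nonsmooth but with an easy proximal map. As a rough orientation only, and not a rendering of the A-HPE framework itself (which solves its proximal subproblems inexactly under a relative-error, large-stepsize condition), the sketch below runs a standard FISTA-style accelerated proximal-gradient iteration on an assumed least-squares-plus-ℓ1 instance; the matrix A, vector b, weight lam, and iteration count are illustrative choices, not anything taken from the paper.

```python
import numpy as np

# Illustrative composite problem: f(x) = g(x) + h(x), with
#   g(x) = 0.5 * ||A x - b||^2   (smooth, gradient Lipschitz with L = ||A||_2^2)
#   h(x) = lam * ||x||_1         (nonsmooth, cheap proximal map)
# This is a plain FISTA-style accelerated proximal-gradient sketch,
# NOT the A-HPE framework, which allows inexact proximal subproblems.

rng = np.random.default_rng(0)
A = rng.standard_normal((40, 100))
b = rng.standard_normal(40)
lam = 0.1
L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of grad g

def grad_g(x):
    # Gradient of g(x) = 0.5 * ||A x - b||^2.
    return A.T @ (A @ x - b)

def prox_h(x, t):
    # Proximal map of t * lam * ||.||_1 (soft-thresholding).
    return np.sign(x) * np.maximum(np.abs(x) - t * lam, 0.0)

x = np.zeros(100)
y = x.copy()
t_prev = 1.0
for _ in range(200):
    x_new = prox_h(y - grad_g(y) / L, 1.0 / L)      # forward-backward step at y
    t = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t_prev ** 2))
    y = x_new + ((t_prev - 1.0) / t) * (x_new - x)  # Nesterov extrapolation
    x, t_prev = x_new, t

obj = 0.5 * np.linalg.norm(A @ x - b) ** 2 + lam * np.abs(x).sum()
print("objective after 200 iterations:", obj)
```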

Cited by 111 publications (182 citation statements)
References 19 publications
“…For methods using higher-order derivatives, our lower bounds [10] for finding stationary points of non-convex functions are ε^{-(p+1)/p} → ε^{-1} as the order p of smoothness grows. However, similar to Appendix A.1, the results [4,16] can show that for convex functions with Lipschitz Hessian, a second-order method achieves the strictly better rate ε^{-6/7} log(1/ε).…”
Section: Commentary On Our Results (supporting)
confidence: 61%
“…Paper [52] (Lemma 1) shows that criterion (2.13) is more general than both (2.3), (2.12) and, actually, it is the combination of those error criteria. We also note that (again from Lemma 1 in [52]) the error criterion proposed in [38,55] for the approximate hybrid extragradient-proximal point algorithm corresponds to a relative version of (2.13).…”
Section: Comparison With Other Kinds Of Approximation (mentioning)
confidence: 91%
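For orientation only: the numbered criteria (2.3), (2.12), and (2.13) belong to the citing paper and are not reproduced here, but the relative-error condition commonly attached to hybrid extragradient-proximal methods of the kind the excerpt mentions can be written as follows, where σ ∈ [0, 1) is the relative tolerance, λ_k the proximal stepsize, and T^ε the ε-enlargement of the maximal monotone operator T.

```latex
% Common relative-error (HPE-type) acceptance test for an inexact
% proximal step at x_{k-1} with stepsize \lambda_k -- a sketch of the
% standard form, not the citing paper's criterion (2.13):
\[
  v_k \in T^{\varepsilon_k}(y_k), \qquad
  \|\lambda_k v_k + y_k - x_{k-1}\|^2 + 2\lambda_k \varepsilon_k
    \;\le\; \sigma^2\,\|y_k - x_{k-1}\|^2,
  \qquad
  x_k = x_{k-1} - \lambda_k v_k .
\]
```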
“…In [31,52] the classical proximal point algorithm is treated (f = 0 in (1.1)). Paper [38] considers inexact accelerated hybrid extragradient-proximal methods, but actually the framework is shown to include only the case of the exact accelerated forward-backward algorithm. In [22], convergence rates for an accelerated projected-subgradient method are proved.…”
Section: Main Contributions (mentioning)
confidence: 99%
“…Several other generalizations or variations of the standard notion of subdifferential have been considered in the literature, e.g., the Clarke subdifferential [41, pp. 25-27], [42], the Fréchet and Hadamard subdifferentials [105], the G-subdifferential [76], the H-subdifferential [82], Mordukhovich's subdifferential [85,107], Plastria's lower subdifferential [96], the quasi-subdifferential [61], the Q-subdifferential [83], the Φ-subdifferential [92], the star-subdifferential [94], the ε-subdifferential [84], generalizations of the subgradient inequality such as the notion of invexity [14,64], or other notions related to convexity such as approximate convexity [47,87]. For a survey on some of these concepts see [20].…”
Section: Remark 1 Geometric Interpretations (mentioning)
confidence: 99%
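For context, the variants listed in the excerpt relax or generalize the classical convex subdifferential, which is defined through the subgradient inequality:

```latex
% Classical (convex-analysis) subdifferential of f at x:
\[
  \partial f(x) \;=\; \bigl\{\, g \;:\; f(y) \,\ge\, f(x) + \langle g,\; y - x \rangle
  \ \text{ for all } y \,\bigr\}.
\]
```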