2010
DOI: 10.1137/070711712

Incremental Subgradients for Constrained Convex Optimization: A Unified Framework and New Methods

Abstract: We present a unifying framework for nonsmooth convex minimization bringing together ε-subgradient algorithms and methods for the convex feasibility problem. This development is a natural step for ε-subgradient methods in the direction of constrained optimization since the Euclidean projection frequently required in such methods is replaced by an approximate projection, which is often easier to compute. The developments are applied to incremental subgradient methods, resulting in new algorithms suitable to large-scale…
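The central algorithmic idea in the abstract, replacing the exact Euclidean projection with an approximate projection, can be sketched concretely. The following is a minimal, hypothetical Python illustration, not the authors' method: it assumes the feasible set is an intersection of halfspaces and applies a single relaxed projection onto one violated constraint per iteration in place of an exact projection onto the whole set; the names approx_project, incremental_subgradient, f_grads, and halfspaces are all illustrative.

```python
import numpy as np

def approx_project(x, a, b):
    """One relaxed projection of x onto the halfspace {y : a @ y <= b}.
    Stands in for the paper's approximate projection, which replaces an
    exact Euclidean projection onto the full feasible set."""
    violation = a @ x - b
    if violation <= 0:
        return x                          # this constraint already holds
    return x - (violation / (a @ a)) * a  # closest point in the halfspace

def incremental_subgradient(f_grads, halfspaces, x0, steps):
    """Illustrative incremental subgradient loop: one subgradient step per
    component function, followed by an approximate projection onto a single,
    cyclically chosen constraint instead of the whole feasible set."""
    x = np.asarray(x0, dtype=float).copy()
    for k, alpha in enumerate(steps):      # typically diminishing step sizes
        for g in f_grads:                  # one pass over the components
            x = x - alpha * np.asarray(g(x), dtype=float)
        a, b = halfspaces[k % len(halfspaces)]
        x = approx_project(x, a, b)
    return x

# Toy usage: minimize |x1| + |x2| over {x : x1 + x2 >= 1},
# written as the single halfspace -x1 - x2 <= -1.
grads = [lambda x: np.sign(x)]
constraints = [(np.array([-1.0, -1.0]), -1.0)]
x = incremental_subgradient(grads, constraints, np.array([3.0, -2.0]),
                            steps=[1.0 / (k + 1) for k in range(200)])
```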

Cited by 37 publications (62 citation statements)
References 46 publications (53 reference statements)
“…For example, in Ref. 37 it is assumed (just as in the theory presented in our Ref. 29) that all the constraints can be satisfied simultaneously.…”
Section: Introduction (mentioning, confidence: 99%)
“…- For convex optimization problems and under a bounded subgradient assumption, condition (H1) with p = 1 and (H2) are satisfied for the subgradient-type methods, including the standard subgradient method [43], the approximate subgradient method [25], the primal-dual subgradient method [34], the incremental subgradient method [33], the conditional subgradient method [28] and a unified framework of subgradient methods [37]; - For quasi-convex optimization problems and under the assumption of Hölder condition of order p, conditions (H1) and (H2) are satisfied for several types of subgradient methods, such as the standard subgradient method [24], the inexact subgradient method [19], the primal-dual subgradient method [20] and the conditional subgradient method [21].…”
Section: A Unified Framework for Subgradient Methods (mentioning, confidence: 99%)
“…They are Lagrangian relaxation and subgradient optimization [2], [3], [5], [11], Lagrangian relaxation and surrogate subgradient optimization [6], [7], [16], and branch-and-cut [4], [12] and [14].…”
Section: Literature Review (mentioning, confidence: 99%)
“…It has been shown that the gradient method converges with constant step size for differentiable problems under the Lipschitz condition [5] and that the subgradient method converges for the non-differentiable programming problems with diminishing and dynamic step size [11]. Perhaps one of the most recent and exhaustive surveys on the subgradient methods for convex optimization is [2].…”
Section: Literature Review (mentioning, confidence: 99%)
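As a point of reference for the convergence claim in the quote above, the classical analysis of the subgradient method (a standard textbook bound, not taken from the cited works themselves) gives, for convex f with subgradient norms bounded by G and step sizes α_i > 0,

$$\min_{0 \le i \le k} f(x_i) - f^{\star} \;\le\; \frac{\|x_0 - x^{\star}\|^2 + G^2 \sum_{i=0}^{k} \alpha_i^2}{2 \sum_{i=0}^{k} \alpha_i},$$

so the best objective value found so far converges to the optimal value whenever the step sizes are diminishing in the usual sense, i.e. $\sum_i \alpha_i = \infty$ and $\sum_i \alpha_i^2 < \infty$ (for example $\alpha_i = c/(i+1)$).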