2003
DOI: 10.1590/s0101-82052003000100003

On the convergence properties of the projected gradient method for convex optimization

Abstract: When applied to an unconstrained minimization problem with a convex objective, the steepest descent method has stronger convergence properties than in the nonconvex case: the whole sequence converges to an optimal solution under the sole hypothesis of existence of minimizers (i.e. without assuming, e.g., boundedness of the level sets). In this paper we look at the projected gradient method for constrained convex minimization. Convergence of the whole sequence to a minimizer assuming only existence of so…
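The iteration discussed in the abstract, x^{k+1} = P_C(x^k − t ∇f(x^k)), can be sketched as follows. This is a minimal illustration with a fixed stepsize, not the paper's algorithm verbatim; the quadratic objective, the box constraint set, and all names here are assumptions chosen for the example.

```python
import numpy as np

def project_box(x, lo, hi):
    # Euclidean projection onto the box {y : lo <= y <= hi}
    return np.clip(x, lo, hi)

def projected_gradient(grad_f, project, x0, step=0.1, iters=500):
    # Iterate x_{k+1} = P_C(x_k - step * grad_f(x_k))
    x = x0
    for _ in range(iters):
        x = project(x - step * grad_f(x))
    return x

# Example: minimize f(x) = ||x - c||^2 / 2 over the box [0, 1]^2,
# where c = (2, -1); the unconstrained minimizer c is infeasible.
c = np.array([2.0, -1.0])
x_star = projected_gradient(lambda x: x - c,
                            lambda x: project_box(x, 0.0, 1.0),
                            x0=np.zeros(2))
# For this f, the constrained minimizer is the projection of c
# onto the box, i.e. (1, 0).
```

With a convex objective and a fixed stepsize below 2/L (here L = 1), the whole sequence converges to a constrained minimizer whenever one exists, which is the property the paper studies.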

Cited by 91 publications (62 citation statements)
References 13 publications
“…The projected gradient method (cf. [39][40][41] and the references therein) for solving (52) is defined as follows:…”
Section: Finite Termination of the Projected Gradient Methods
Mentioning (confidence: 99%)
“…Next we discuss the extent to which vectors y_k and x_{k+1} satisfying (12)-(14) can be determined through an explicit procedure.…”
Section: Relaxed Projection Methods
Mentioning (confidence: 99%)
“…P_C(x) = argmin_{y ∈ C} ||x − y||. See [2,3,14] for convergence properties of this method for the case in which f is convex, which are related to results in this article. An immediate extension of the method (1)-(2) to VIP(T, C) for the case in which T is point-to-set is the iterative procedure given by…”
Section: Introduction
Mentioning (confidence: 99%)
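The projection operator P_C(x) = argmin_{y ∈ C} ||x − y|| used in the statement above has a closed form for simple sets. A small illustration for the concrete case of a Euclidean ball (the choice of C and the helper name are assumptions for the example):

```python
import numpy as np

def project_ball(x, radius=1.0):
    # P_C(x) = argmin_{y in C} ||x - y|| for C = {y : ||y|| <= radius}:
    # points inside the ball are fixed; points outside are scaled
    # back radially to the boundary.
    norm = np.linalg.norm(x)
    return x if norm <= radius else (radius / norm) * x

x = np.array([3.0, 4.0])   # ||x|| = 5, outside the unit ball
p = project_ball(x)        # -> [0.6, 0.8], on the unit sphere
```

The radial scaling is exactly the minimizer of ||x − y|| over the ball, which is why projection onto balls (like boxes) is cheap enough to use inside each iteration of the method.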
“…For differentiable functions the projected subgradient method coincides with the projected gradient method, and different rules are used to choose the stepsizes in order to ensure functional decrease at each iteration; see [13]. Its extension to the generalized convex case is studied in [7,14,21].…”
Section: The Projected Subgradient Methods
Mentioning (confidence: 99%)
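One common family of stepsize rules of the kind this statement mentions is Armijo-type backtracking along the projection arc, which shrinks the stepsize until the projected point yields sufficient decrease. A sketch under that assumption (the function names, constants, and termination guard are not from the source):

```python
import numpy as np

def armijo_projected_step(f, grad_f, project, x,
                          beta=0.5, sigma=1e-4, t0=1.0, max_halvings=50):
    # Backtracking along the projection arc: try x(t) = P_C(x - t*grad_f(x))
    # and halve t until the Armijo-type sufficient-decrease test holds.
    g = grad_f(x)
    t = t0
    for _ in range(max_halvings):
        x_new = project(x - t * g)
        # Accept when f(x_new) <= f(x) + sigma * <g, x_new - x>
        if f(x_new) <= f(x) + sigma * g.dot(x_new - x):
            return x_new
        t *= beta
    return x  # no decrease found: x is (numerically) stationary
```

For example, minimizing f(x) = ||x||^2 / 2 over the interval [1, 2] from x = 2, the first trial stepsize already projects to the constrained minimizer 1 and passes the test, so the function value decreases in a single step.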