1998
DOI: 10.1007/bf01584842

On the projected subgradient method for nonsmooth convex optimization in a Hilbert space

Abstract: We consider the method for constrained convex optimization in a Hilbert space, consisting of a step in the direction opposite to an ε_k-subgradient of the objective at a current iterate, followed by an orthogonal projection onto the feasible set. The normalized stepsizes α_k are exogenously given, satisfying ∑_{k=0}^∞ α_k = ∞, ∑_{k=0}^∞ α_k² < ∞, and ε_k is chosen so that ε_k → 0. We prove that the sequence generated in this way is weakly convergent to a minimizer if the problem has solutions, and is unbounded …
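The iteration described in the abstract can be sketched in a few lines of code. The snippet below is a minimal illustration, not the paper's algorithm: it uses exact subgradients instead of ε_k-subgradients, a simple normalization by the subgradient norm (the paper's normalization may differ), and a toy problem and function names chosen here for illustration.

```python
import numpy as np

def projected_subgradient(subgrad, project, x0, stepsize, n_iter=500):
    """Sketch of a projected subgradient iteration with exogenous stepsizes.

    subgrad(x)  : returns a subgradient of the objective at x (here exact;
                  the paper allows eps_k-subgradients with eps_k -> 0)
    project(x)  : orthogonal projection onto the feasible set C
    stepsize(k) : exogenous stepsizes with sum a_k = inf and sum a_k^2 < inf
    """
    x = np.asarray(x0, dtype=float)
    for k in range(n_iter):
        g = subgrad(x)
        # Normalize the step (assumed normalization, not taken from the paper).
        denom = max(np.linalg.norm(g), 1e-12)
        # Move opposite the subgradient, then project back onto the feasible set.
        x = project(x - stepsize(k) * g / denom)
    return x

# Hypothetical instance: minimize f(x) = ||x - c||_1 over the Euclidean unit ball.
c = np.array([2.0, -1.0])
subgrad = lambda x: np.sign(x - c)                    # a subgradient of the l1 objective
project = lambda x: x / max(1.0, np.linalg.norm(x))   # projection onto the unit ball
stepsize = lambda k: 1.0 / (k + 1)                    # divergent but square-summable
print(projected_subgradient(subgrad, project, np.zeros(2), stepsize))
```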

Cited by 107 publications (127 citation statements)
References 10 publications
“…In this section, we consider a parallel generalized gradient-type projection method (GGPM) for solving the problem of minimizing an additive parametric objective function (2). The type of parallelization proposed here is primarily motivated by incremental gradient methods, particularly neural network training (Ref.…”
Section: Convergence Analysis
confidence: 99%
“…In the analysis of such methods, it is typically assumed that the ε-subgradients become asymptotically exact (Ref. 2). In this paper, we do not assume convexity and assume that the error terms are merely bounded.…”
confidence: 99%
“…Subgradients do not give rise to descent directions, so that Armijo searches are not ensured to succeed, and therefore exogenous stepsizes seem to be the only available alternative. This is the case analyzed in [1]. We will not be concerned with option (iii) in the sequel.…”
Section: The Projected Gradient Methods
confidence: 99%
“…When f is convex, the stronger results for the unconstrained case, with β_k's given by (4)-(5), have also been extended to the projected gradient method under option (iii): it has been proved in [1] that in such a case, the whole sequence {x_k} converges to a solution of problem (1)-(2) under the sole assumption of existence of solutions. On the other hand, the current situation is rather worse for options (i) and (ii): as far as we know, neither existence nor uniqueness of cluster points for options (i) and (ii) has been proved, assuming only convexity of f.…”
Section: The Projected Gradient Methods
confidence: 99%
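Conditions (4)-(5) referenced in the excerpt above are not reproduced on this page; assuming they coincide with the divergent-series and square-summable stepsize conditions stated in the abstract, a standard concrete choice satisfying them is the harmonic sequence:

```latex
% Assumed form of conditions (4)-(5); the conditions themselves are those in the abstract,
% the labels (4)-(5) belong to the citing paper.
\sum_{k=0}^{\infty} \beta_k = \infty, \qquad
\sum_{k=0}^{\infty} \beta_k^2 < \infty,
\qquad \text{e.g. } \beta_k = \tfrac{1}{k+1},
\ \text{for which } \sum_{k} \beta_k^2 = \tfrac{\pi^2}{6}.
```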