2015
DOI: 10.1088/0266-5611/31/9/095008

New convergence results for the scaled gradient projection method

Abstract: The aim of this paper is to deepen the convergence analysis of the scaled gradient projection (SGP) method, proposed by Bonettini et al. in a recent paper for constrained smooth optimization. The main feature of SGP is the presence of a variable scaling matrix multiplying the gradient, which may change at each iteration. In the last few years, extensive numerical experimentation has shown that SGP, equipped with a suitable choice of the scaling matrix, is a very effective tool for solving large scale v…
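To make the scheme concrete, here is a minimal Python sketch of the SGP iteration x^{k+1} = x^k + λ_k (y^k − x^k) with y^k = P_Ω(x^k − α_k D_k ∇f(x^k)). All names are illustrative; the fixed step size, the unit relaxation parameter, the heuristic scaling rule, and the Euclidean projection are simplifying assumptions, whereas the actual method selects α_k by Barzilai-Borwein-type rules, λ_k by an Armijo line search, and projects in the norm induced by D_k^{-1}.

```python
import numpy as np

def sgp_sketch(grad_f, project, x0, n_iter=100, alpha=0.1, mu=1e5):
    """Minimal sketch of the scaled gradient projection (SGP) iteration.

    Simplifications vs. the actual SGP method: a fixed step size alpha
    instead of Barzilai-Borwein rules, unit relaxation lam instead of
    an Armijo line search, a heuristic diagonal scaling, and a plain
    Euclidean projection instead of the projection in the D_k^{-1} norm.
    """
    x = x0.copy()
    for k in range(n_iter):
        g = grad_f(x)
        # Diagonal scaling D_k with eigenvalues clipped to [1/mu, mu],
        # as the SGP convergence theory requires (the scaling rule
        # itself is an illustrative choice, not the paper's).
        d = np.clip(np.abs(x) / np.maximum(np.abs(g), 1e-12), 1.0 / mu, mu)
        y = project(x - alpha * d * g)  # scaled gradient step, then projection
        lam = 1.0                       # SGP picks lam by a line search
        x = x + lam * (y - x)
    return x

# Usage sketch: minimize 0.5*||x - b||^2 over the box [0, 1]^n.
b = np.array([1.5, -0.3, 0.4])
sol = sgp_sketch(grad_f=lambda x: x - b,
                 project=lambda x: np.clip(x, 0.0, 1.0),
                 x0=np.full(3, 0.5))
```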

Cited by 53 publications (61 citation statements). References 53 publications.

“…, which still guarantees the stationarity of the limit points [8]. The aim of this paper is to prove the convergence of this modified SGP scheme if the (nonconvex) objective function Ψ satisfies the Kurdyka-Łojasiewicz (KL) property [15,14], which holds true for most of the functions commonly used in inverse problems, such as ℓp norms, the Kullback-Leibler divergence, and indicator functions of box plus equality constraints.…”
Section: The Scaled Gradient Projection Method (citation type: mentioning, confidence: 99%)
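For context, the KL property invoked in this statement admits a standard formulation (paraphrased from the general literature, not quoted from the paper): Ψ satisfies the KL property at a critical point x* if there exist η > 0, a neighborhood U of x*, and a continuous concave φ: [0, η) → [0, +∞), with φ(0) = 0, φ continuously differentiable on (0, η) and φ' > 0, such that

$$\varphi'\bigl(\Psi(x) - \Psi(x^*)\bigr)\,\operatorname{dist}\bigl(0, \partial\Psi(x)\bigr) \;\ge\; 1$$

for every x ∈ U with Ψ(x*) < Ψ(x) < Ψ(x*) + η. Informally, Ψ becomes sharp around its critical points after reparametrization by φ, which is what drives convergence of the whole sequence of iterates rather than just its limit points.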
“…[4]. Convergence of the sequence to a minimum point of (1) has recently been proved for convex objective functions by choosing suitable adaptive bounds for the eigenvalues of the scaling matrices [8]. In the following, we will consider a modified version of SGP in which, at each iteration k ∈ N, we compute…”
Section: The Scaled Gradient Projection Method (citation type: mentioning, confidence: 99%)
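The adaptive bounds mentioned in this quotation can be pictured as confining the eigenvalues of the scaling matrices D_k to an interval [1/μ_k, μ_k] that shrinks toward 1 fast enough. A hedged sketch; the decay rule below is an illustrative assumption, not necessarily the paper's exact condition:

```python
import numpy as np

def clip_scaling(d_raw, k, zeta0=1e3, p=1.1):
    """Force a candidate diagonal scaling into [1/mu_k, mu_k].

    Illustrative rule (an assumption, not necessarily the paper's
    exact one): mu_k = sqrt(1 + zeta_k) with zeta_k = zeta0 / k**p,
    so the sequence (zeta_k) is summable (p > 1) and mu_k -> 1.
    Under this kind of summable control on the scaling matrices,
    convergence of the iterates is proved for convex objectives.
    """
    zeta_k = zeta0 / float(k) ** p   # k is the iteration index, k >= 1
    mu_k = np.sqrt(1.0 + zeta_k)
    return np.clip(d_raw, 1.0 / mu_k, mu_k)
```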
“…In general, problem (2) does not have a closed-form solution on account of the inequality constraints, even for simple regularizations, hence an iterative solver must be used. Several solution approaches are available, based on projected gradient strategies [39,40], ADMM [41], primal-dual schemes [42], or interior point techniques [43]. Standard interior point methods require inverting several n × n linear systems, which leads to high computational complexity for large scale problems.…”
Section: Interior Point Approaches (citation type: mentioning, confidence: 99%)
“…The projected quasi-Newton (PQN) algorithm [65,64] is perhaps the most elegant and logical extension of quasi-Newton methods, but it involves solving a subproblem at each iteration or needs to be restricted to a diagonal metric in the implementation [13,12]. PQN uses the SPG algorithm [8] for the subproblems, and finds that this is an efficient trade-off whenever the cost function (which is not involved in the sub-iteration) is significantly more expensive to evaluate than projecting onto the constraints.…”
Section: Relation To Prior Work (citation type: mentioning, confidence: 99%)
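For reference, a minimal sketch of the SPG idea this last statement relies on: a spectral (Barzilai-Borwein) step length combined with projection onto the feasible set. This is simplified relative to the SPG of Birgin, Martínez, and Raydán, replacing their nonmonotone line search with plain monotone Armijo backtracking; all parameter values are illustrative.

```python
import numpy as np

def spg_sketch(f, grad_f, project, x0, n_iter=200, tol=1e-8):
    """Minimal spectral projected gradient (SPG) sketch.

    Simplified: monotone Armijo backtracking instead of SPG's
    nonmonotone line search. `project` is the (cheap) projection onto
    the feasible set, assumed far cheaper than evaluating f.
    """
    x = x0.copy()
    g = grad_f(x)
    alpha = 1.0                            # initial spectral step length
    for _ in range(n_iter):
        d = project(x - alpha * g) - x     # projected gradient direction
        if np.linalg.norm(d) < tol:
            break                          # approximate stationarity
        lam, fx = 1.0, f(x)
        while f(x + lam * d) > fx + 1e-4 * lam * g.dot(d) and lam > 1e-12:
            lam *= 0.5                     # Armijo backtracking
        x_new = x + lam * d
        g_new = grad_f(x_new)
        s, y = x_new - x, g_new - g
        # BB1 spectral step length, kept within safeguard bounds
        alpha = np.clip(s.dot(s) / max(s.dot(y), 1e-12), 1e-10, 1e10)
        x, g = x_new, g_new
    return x
```

The design point the quotation makes is visible in the sketch: f and grad_f appear once per outer iteration, while project is called inside the cheap direction computation, so SPG pays off exactly when projection is much cheaper than function evaluation.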