2016
DOI: 10.1088/1742-6596/756/1/012001

On the constrained minimization of smooth Kurdyka–Łojasiewicz functions with the scaled gradient projection method

Abstract: The scaled gradient projection (SGP) method is a first-order optimization method for the constrained minimization of smooth functions. It exploits a scaling matrix that multiplies the gradient and a variable steplength parameter to improve the convergence of the scheme. For a general nonconvex function, the limit points of the sequence generated by SGP have been proved to be stationary, while in the convex case, and under some restrictions on the choice of the scaling matrix, the sequence itself…
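
Schematically, each SGP iteration projects a scaled gradient step onto the feasible set and then moves along the resulting feasible direction. The following is a minimal Python sketch of a generic SGP iteration, not the implementation analyzed in the paper: it assumes projection onto the nonnegative orthant (componentwise clipping), user-supplied diagonal scaling and steplength rules (the names `sgp`, `scaling`, and `steplength` are illustrative), and an Armijo backtracking line search along the feasible direction.

```python
import numpy as np

def sgp(f, grad_f, x0, scaling, steplength, max_iter=200, tol=1e-8,
        beta=1e-4, theta=0.5):
    """Minimal scaled gradient projection (SGP) sketch for min f(x) s.t. x >= 0.

    `scaling(x)` returns the diagonal of a positive scaling matrix D_k and
    `steplength(x, g, d)` returns the steplength alpha_k; both are
    illustrative placeholders, not the specific rules studied in the paper.
    """
    x = np.maximum(x0, 0.0)                      # start from a feasible point
    for _ in range(max_iter):
        g = grad_f(x)
        d = scaling(x)                           # diagonal of D_k (entries > 0)
        alpha = steplength(x, g, d)              # variable steplength alpha_k
        y = np.maximum(x - alpha * d * g, 0.0)   # project the scaled step
        p = y - x                                # feasible descent direction
        if np.linalg.norm(p) < tol:
            break
        lam, fx, slope = 1.0, f(x), g @ p        # Armijo backtracking
        while f(x + lam * p) > fx + beta * lam * slope:
            lam *= theta
        x = x + lam * p
    return x

# Toy usage: convex quadratic under nonnegativity constraints.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, -2.0])
f = lambda x: 0.5 * x @ A @ x - b @ x
grad_f = lambda x: A @ x - b
identity_scaling = lambda x: np.ones_like(x)     # reduces SGP to plain GP
fixed_steplength = lambda x, g, d: 0.1
print(sgp(f, grad_f, np.array([1.0, 1.0]), identity_scaling, fixed_steplength))
# expected result close to [1/3, 0]
```

With the identity scaling and a fixed steplength, the sketch reduces to ordinary gradient projection; the paper's convergence analysis concerns bounded, variable choices of the scaling matrix and steplength.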

Cited by 3 publications (4 citation statements)
References 18 publications (28 reference statements)
“…It is important to emphasize that even for nonsmooth, nonconvex optimization there is a vast amount of recent publications, ranging from forward-backward, respectively proximal-type, schemes [8,9,10,49,50], over linearized proximal schemes [365,47,366,298], to inertial methods [299,309], primal-dual algorithms [361,267,279,34], scaled gradient projection methods [310], nonsmooth Gauß-Newton extensions [149,300] and nonlinear eigenproblems [206,59,32,51,261,31]. We focus mainly on recent generalizations of the proximal gradient method and the linearized Bregman iteration for nonconvex functionals E in the following.…”
Section: Nonconvex Optimization (mentioning)
confidence: 99%
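
For context, the forward-backward (proximal gradient) scheme named in this statement alternates a gradient step on the smooth part of the objective with a proximal step on the nonsmooth part. Below is a minimal sketch for L1-regularized least squares, where the proximal map is soft-thresholding; the data and step size are illustrative and are not taken from any of the cited works.

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1 (componentwise soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def proximal_gradient(A, b, lam, step, n_iter=500):
    """Forward-backward splitting for min 0.5*||Ax - b||^2 + lam*||x||_1."""
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - b)                         # forward (gradient) step
        x = soft_threshold(x - step * grad, step * lam)  # backward (prox) step
    return x

# Illustrative sparse-recovery data; step <= 1/||A||_2^2 ensures convergence
# in the convex case.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 50))
x_true = np.zeros(50)
x_true[:3] = [2.0, -1.5, 1.0]
b = A @ x_true
step = 1.0 / np.linalg.norm(A, 2) ** 2
print(proximal_gradient(A, b, lam=0.1, step=step)[:5])  # approx [2, -1.5, 1, 0, 0]
```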
“…2017), linearized proximal schemes (Xu and Yin 2013, Bolte, Sabach and Teboulle 2014, Xu and Yin 2017, Nikolova and Tan 2017), inertial methods (Ochs, Chen, Brox and Pock 2014, Pock and Sabach 2016), primal–dual algorithms (Valkonen 2014, Li and Pong 2015, Moeller, Benning, Schönlieb and Cremers 2015, Benning, Knoll, Schönlieb and Valkonen 2015), scaled gradient projection methods (Prato et al. 2016), non-smooth Gauss–Newton extensions (Drusvyatskiy, Ioffe and Lewis 2016, Ochs, Fadili and Brox 2017), and nonlinear eigenproblems (Hein and Bühler 2010, Bresson, Laurent, Uminsky and Brecht 2012, Benning, Gilboa and Schönlieb 2016, Boţ and Csetnek 2017, Laurent, von Brecht, Bresson and Szlam 2016, Benning, Gilboa, Grah and Schönlieb 2017c). Here we focus mainly on recent generalizations of the proximal gradient method and the linearized Bregman iteration for non-convex functionals E; a treatment of all the algorithms mentioned above would be a subject for a survey paper in its own right.…”
Section: Advanced Issues (mentioning)
confidence: 99%
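
The other method this statement singles out, the linearized Bregman iteration, alternates an explicit step on the constraint residual with a scaled soft-thresholding. The sketch below shows the classical convex variant for finding sparse solutions of Au = b; the parameters and data are illustrative, and the step condition in the docstring is the standard one for this convex setting, not a result of the works cited above.

```python
import numpy as np

def linearized_bregman(A, b, mu, delta, n_iter=2000):
    """Classical linearized Bregman sketch for sparse solutions of Au = b.

    In the convex setting this iteration solves
        min mu*||u||_1 + (1/(2*delta))*||u||_2^2  s.t.  Au = b,
    provided delta * ||A||_2^2 < 2.
    """
    u = np.zeros(A.shape[1])
    v = np.zeros(A.shape[1])
    for _ in range(n_iter):
        v += A.T @ (b - A @ u)                                    # residual step
        u = delta * np.sign(v) * np.maximum(np.abs(v) - mu, 0.0)  # scaled shrinkage
    return u

# Illustrative data, normalized so that delta = 1 satisfies the step condition.
rng = np.random.default_rng(1)
A = rng.standard_normal((20, 50))
A /= np.linalg.norm(A, 2)
u_true = np.zeros(50)
u_true[:3] = [2.0, -1.5, 1.0]
b = A @ u_true
print(linearized_bregman(A, b, mu=5.0, delta=1.0)[:5])  # first five recovered entries
```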
“…Throughout the last decade, however, there has been an increasing interest in first-order methods for nonconvex and nonsmooth objectives. These methods range from forward-backward, respectively, proximal-type, schemes [2,3,4,18,19], over linearized proximal schemes [80,16,81,61], to inertial methods [63,68], primal-dual algorithms [78,52,57,12], scaled gradient projection methods [69], and nonsmooth Gauß-Newton extensions [35,64].…”
(mentioning)
confidence: 99%
“…Throughout the last decade, however, there has been an increasing interest in first-order methods for non-convex and non-smooth objectives. These methods range from forward-backward, respectively proximal-type, schemes [2,3,4,17,18], over linearised proximal schemes [74,15,75,57], to inertial methods [59,64], primal-dual algorithms [73,48,53,12], scaled gradient projection methods [65] and non-smooth Gauß-Newton extensions [33,60].…”
(mentioning)
confidence: 99%