2017
DOI: 10.1088/1361-6420/aa5bfd

On the convergence of a linesearch based proximal-gradient method for nonconvex optimization

Abstract: We consider a variable metric linesearch based proximal gradient method for the minimization of the sum of a smooth, possibly nonconvex function plus a convex, possibly nonsmooth term. We prove convergence of this iterative algorithm to a critical point if the objective function satisfies the Kurdyka-Łojasiewicz property at each point of its domain, under the assumption that a limit point exists. The proposed method is applied to a wide collection of image processing problems and our numerical tests show that …
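The paper gives the precise variable-metric scheme; as a rough illustration of the forward-backward-plus-linesearch structure the abstract describes, here is a minimal Euclidean-metric sketch in Python. All names, parameter values, and the choice of an l1 example term are ours, not the authors'; this is a generic template, not their implementation.

```python
import numpy as np

def soft_threshold(z, t):
    """Prox of t*||.||_1, used here only as an example convex nonsmooth term."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def prox_grad_linesearch(grad_f, f_smooth, g_nonsmooth, prox_g, x0,
                         step=1.0, beta=1e-4, delta=0.5,
                         max_iter=500, tol=1e-8):
    """Forward-backward point y = prox(x - step*grad f(x)), followed by an
    Armijo-type backtracking linesearch along the direction d = y - x."""
    x = x0.copy()
    for _ in range(max_iter):
        g = grad_f(x)
        y = prox_g(x - step * g, step)      # forward-backward point
        d = y - x
        if np.linalg.norm(d) < tol:         # d = 0 at a stationary point
            break
        # Sufficient-decrease measure; it is <= 0 by construction of y.
        h = g @ d + g_nonsmooth(y) - g_nonsmooth(x)
        F_x = f_smooth(x) + g_nonsmooth(x)
        # Accept the largest alpha = delta**m with
        # F(x + alpha*d) <= F(x) + beta*alpha*h.
        alpha = 1.0
        while (f_smooth(x + alpha * d) + g_nonsmooth(x + alpha * d)
               > F_x + beta * alpha * h and alpha > 1e-12):
            alpha *= delta
        x = x + alpha * d
    return x

# Example run on 0.5*||A x - b||^2 + lam*||x||_1 (a convex instance,
# used only to exercise the code).
rng = np.random.default_rng(0)
A, b, lam = rng.standard_normal((40, 100)), rng.standard_normal(40), 0.1
x_hat = prox_grad_linesearch(
    grad_f=lambda x: A.T @ (A @ x - b),
    f_smooth=lambda x: 0.5 * np.sum((A @ x - b) ** 2),
    g_nonsmooth=lambda x: lam * np.sum(np.abs(x)),
    prox_g=lambda z, t: soft_threshold(z, lam * t),
    x0=np.zeros(100))
```

Because the measure h is strictly negative away from stationarity and the composite objective decreases along d for small steps, the backtracking loop always terminates, even when the initial prox step size is too large.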

Cited by 53 publications (65 citation statements)
References 57 publications
“…The condition (3) is studied in [2, Section 4.1], which applies a primal-dual approach to (2) to satisfy it. In this connection, note that if we have access to a lower bound Q_LB ≤ Q* (obtained by finding a feasible point for the dual of (2), or other means), then any d satisfying Q(d) ≤ (1 − η)Q_LB also satisfies (3).…”
mentioning
confidence: 99%
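The quoted implication is easy to operationalize once a lower bound is available. Assuming condition (3) has the form Q(d) ≤ (1 − η)Q*, which is the form under which the stated implication holds (Q_LB ≤ Q* and 1 − η > 0 give (1 − η)Q_LB ≤ (1 − η)Q*), a minimal sketch of the computable surrogate test follows; all names are hypothetical stand-ins for the objects in the quote.

```python
def inexact_step_ok(Q, d, Q_LB, eta=0.1):
    """Computable surrogate for condition (3): test Q(d) <= (1 - eta)*Q_LB.
    Since Q_LB <= Q* and 1 - eta > 0, passing this test implies
    Q(d) <= (1 - eta)*Q*, i.e. condition (3) in its assumed form.
    Q, d, Q_LB, and eta are hypothetical names, not the source's API."""
    return Q(d) <= (1.0 - eta) * Q_LB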
“…In practical situations, we need not enforce (3) explicitly for some chosen value of η. In fact, we do not necessarily require η to be known, or (3) to be checked at all. Rather, we can take advantage of the convergence rates of whatever solver is applied to (2) to ensure that (3) holds for some value of η ∈ (0, 1), possibly unknown.…”
mentioning
confidence: 99%
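One plausible way to make this concrete (our sketch, not the authors'): assume the inner solver applied to (2) is linearly convergent with factor ρ ∈ (0, 1), that d = 0 is feasible with model value Q(0) = 0, and that (3) has the form Q(d) ≤ (1 − η)Q*. Then after k inner iterations started at d^0 = 0,

```latex
Q(d^k) - Q^\star \le \rho^k \bigl( Q(0) - Q^\star \bigr) = -\rho^k Q^\star
\quad\Longrightarrow\quad
Q(d^k) \le (1 - \rho^k)\, Q^\star ,
```

which is (3) with η = ρ^k ∈ (0, 1), a value that never needs to be known or checked explicitly.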
“…also. It follows from the continuity of the operations in the right-hand sides of algorithm (14) that (u†, v†) satisfies the equations (16), which characterize the minimizers of problem (4). One can then replace (û, v̂) by (u†, v†) in inequality (27) to obtain…”
Section: Lemma
mentioning
confidence: 99%
“…Proof: As f is strongly convex, problem (4) is guaranteed to have a (unique) solution û. Hence, there exists v̂ such that equations (16) are satisfied. We start from inequality (24) derived in the proof of Theorem 1.…”
Section: Lemma
mentioning
confidence: 99%
“…In addition, the convergence of Algorithm 1 can be asserted whenever the objective function J satisfies the Kurdyka-Łojasiewicz (KL) property [31,32] at each point of its domain. More precisely, as shown in a number of recent papers [33,34,35], one can prove the convergence of a sequence {φ^(n)}_{n∈ℕ} to a limit point (if any exists) which is stationary for J if the following three conditions are satisfied:…”
Section: Solve the Linear System R
mentioning
confidence: 99%
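The quoted passage truncates before listing the three conditions. In the abstract KL convergence framework such results typically rely on (e.g. Attouch, Bolte, and Svaiter), the standard trio reads as follows; the notation and the constants a, b > 0 are ours, and the specific form used in the citing paper may differ.

```latex
% H1 (sufficient decrease):
J(\varphi^{(n+1)}) + a\,\|\varphi^{(n+1)} - \varphi^{(n)}\|^2 \le J(\varphi^{(n)});
% H2 (relative error): for some w^{(n+1)} \in \partial J(\varphi^{(n+1)}),
\|w^{(n+1)}\| \le b\,\|\varphi^{(n+1)} - \varphi^{(n)}\|;
% H3 (continuity): there is a subsequence with
\varphi^{(n_j)} \to \bar{\varphi}
\quad\text{and}\quad
J(\varphi^{(n_j)}) \to J(\bar{\varphi}).
```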