2016
DOI: 10.1007/s13675-015-0045-8

An inertial forward–backward algorithm for the minimization of the sum of two nonconvex functions

Abstract: We propose a forward–backward proximal-type algorithm with inertial/memory effects for minimizing the sum of a nonsmooth function with a smooth one in the nonconvex setting. The sequence of iterates generated by the algorithm converges to a critical point of the objective function provided an appropriate regularization of the objective satisfies the Kurdyka–Łojasiewicz inequality, which is for instance fulfilled for semi-algebraic functions. We illustrate the theoretical results by considering two nu…

Cited by 129 publications (53 citation statements)
References 36 publications
“…This can provide better approximation of the local Hessian at each step x_k, which typically leads to improved convergence rates. There are several such metric rules proposed for both convex (Chouzenoux et al., 2014; Salzo, 2016; Lee et al., 2014) and nonconvex (Bonettini et al., 2016; Boţ et al., 2016) problems. However, despite the theoretical convergence guarantees, most such proposed rules fall short in practical cases.…”
Section: Related Work
confidence: 99%
“…which reads for any n ≥ 0 as y_{n+1} ∈ prox_{μ^{-1} G}( y_n − μ^{-1} ∇H(y_n) ), and is nothing else than the proximal-gradient method. An inertial version of the proximal-gradient method for solving (2.3) in the fully nonconvex setting has been considered in [12].…”
Section: The Algorithm
confidence: 99%
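The proximal-gradient iteration quoted above, with an added inertial (memory) term, can be sketched in a few lines. The following is a minimal illustration, not the paper's exact scheme: the constant step size `alpha`, the inertial weight `beta`, and the demo problem are all assumptions chosen for demonstration (the convergence analysis ties these parameters to the Lipschitz constant of the gradient of the smooth part).

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal map of t * ||.||_1 (componentwise shrinkage)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def inertial_prox_grad(grad_g, prox_h, x0, alpha, beta, iters=500):
    """Inertial forward-backward sketch for min_x g(x) + h(x):
        y_n     = x_n + beta * (x_n - x_{n-1})            # inertial step
        x_{n+1} = prox_{alpha h}( y_n - alpha * grad_g(x_n) )
    alpha (step size) and beta (inertial weight) are assumed fixed here;
    the cited analysis couples them to the Lipschitz constant of grad g."""
    x_prev = x0.copy()
    x = x0.copy()
    for _ in range(iters):
        y = x + beta * (x - x_prev)                       # extrapolation
        x_prev, x = x, prox_h(y - alpha * grad_g(x), alpha)
    return x

# Tiny demo: g(x) = 0.5 * ||x - b||^2 (smooth), h(x) = ||x||_1 (nonsmooth),
# whose minimizer is the soft-thresholding of b at level 1.
b = np.array([3.0, 0.1, -2.0])
x_star = inertial_prox_grad(grad_g=lambda x: x - b,
                            prox_h=lambda v, a: soft_threshold(v, a * 1.0),
                            x0=np.zeros(3), alpha=0.5, beta=0.3)
```

For this strongly convex demo the iterates settle at soft_threshold(b, 1) = [2, 0, -1]; in the nonconvex setting the same loop only targets a critical point, which is exactly the regime the Kurdyka–Łojasiewicz assumption addresses.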
“…Motivated by [11] and [13], we divide the proof into three main steps, which are listed in the following three subsections, respectively.…”
Section: The Convergence of the Algorithm
confidence: 99%