2016
DOI: 10.1137/15m1019325

Variable Metric Inexact Line-Search-Based Methods for Nonsmooth Optimization

Abstract: We develop a new proximal-gradient method for minimizing the sum of a differentiable, possibly nonconvex, function plus a convex, possibly nondifferentiable, function. The key features of the proposed method are the definition of a suitable descent direction, based on the proximal operator associated with the convex part of the objective function, and an Armijo-like rule to determine the step size along this direction, ensuring the sufficient decrease of the objective function. In this frame, we especially address…
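The abstract's two key ingredients, a descent direction obtained from the proximal operator of the convex term and an Armijo-like backtracking rule on the full objective, can be illustrated with a short sketch. The Python below is a generic proximal-gradient step under assumed inputs (prox_g, grad_f, the toy LASSO data), not the paper's VMILA algorithm itself: it uses the exact Euclidean proximal operator and no variable metric.

```python
import numpy as np

def prox_grad_armijo_step(x, f, grad_f, g, prox_g,
                          alpha=1.0, sigma=1e-4, delta=0.5,
                          max_backtracks=50):
    """One step of a generic proximal-gradient scheme with an
    Armijo-like backtracking rule (illustrative sketch only)."""
    grad = grad_f(x)
    # Descent direction: forward-backward point minus current iterate,
    # built from the proximal operator of the convex part g.
    y = prox_g(x - alpha * grad, alpha)
    d = y - x
    # First-order model of the decrease of F = f + g along d.
    h = grad @ d + g(y) - g(x)
    F = lambda z: f(z) + g(z)
    lam = 1.0
    for _ in range(max_backtracks):
        # Armijo-like sufficient-decrease test on the full objective.
        if F(x + lam * d) <= F(x) + sigma * lam * h:
            break
        lam *= delta   # shrink the step and retry
    return x + lam * d

# Toy LASSO-type example (illustrative data):
# f(x) = 0.5*||Ax - b||^2 (smooth), g(x) = mu*||x||_1 (convex, nonsmooth)
A = np.array([[1.0, 0.0], [0.0, 2.0]])
b = np.array([1.0, 1.0])
mu = 0.1
f = lambda x: 0.5 * np.sum((A @ x - b) ** 2)
grad_f = lambda x: A.T @ (A @ x - b)
g = lambda x: mu * np.sum(np.abs(x))
prox_g = lambda z, a: np.sign(z) * np.maximum(np.abs(z) - a * mu, 0.0)  # soft-threshold
x = prox_grad_armijo_step(np.zeros(2), f, grad_f, g, prox_g)
```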

Cited by 85 publications (147 citation statements)
References 31 publications
“…Throughout the entire section, {x (k) } k∈N will denote the sequence generated by SGP. The following lemma (whose proof follows from [9…”
Section: Preliminary Results (mentioning)
confidence: 99%
“…The parameter m in Algorithm 1 is typically a small value (m = 3, 4, 5), in order to avoid a significant computational cost in the calculation of the steplengths α. Problem (20) is addressed, at each iteration of ILA, by means of the algorithm FISTA [43], which is stopped by using criterion (22) with η = 10⁻⁶. This value represents a good balance between convergence speed and computational time per iteration [14]. Concerning the nonlinear CG methods equipped with the strong Wolfe conditions, we use the same parameters as in [41] and we initialize the related backtracking procedure as suggested in [24, p. 59].…”
Section: Comparison With State-of-the-art Methods (mentioning)
confidence: 99%
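The excerpt says the subproblem (20) is solved by FISTA, stopped with criterion (22) at η = 10⁻⁶. Neither (20) nor (22) is reproduced here, so the sketch below shows a standard FISTA loop in which a generic relative step-change test with the same tolerance stands in for criterion (22); grad_q, prox_r, and the Lipschitz constant L are assumed inputs.

```python
import numpy as np

def fista(grad_q, prox_r, x0, L, eta=1e-6, max_iter=500):
    """FISTA for min q(x) + r(x), with q smooth (L-Lipschitz gradient)
    and r convex; illustrative inner solver with a generic stopping test."""
    x, y, t = x0.copy(), x0.copy(), 1.0
    for _ in range(max_iter):
        # Forward-backward step at the extrapolated point y.
        x_new = prox_r(y - grad_q(y) / L, 1.0 / L)
        t_new = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        # Nesterov-style extrapolation.
        y = x_new + ((t - 1.0) / t_new) * (x_new - x)
        # Stand-in for criterion (22): relative change of the iterates.
        done = np.linalg.norm(x_new - x) <= eta * max(np.linalg.norm(x_new), 1.0)
        x, t = x_new, t_new
        if done:
            break
    return x
```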
“…Recall that the Lipschitz continuity of ∇J implies that there is α_min > 0 such that the linesearch parameter α_n ≥ α_min (see [14, Proposition 4.2] for a proof). Then…”
Section: Endif Endwhile (mentioning)
confidence: 99%
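The bound quoted here is the standard consequence of the descent lemma; the LaTeX sketch below gives the generic argument, with Armijo parameter σ ∈ (0, 1), backtracking factor δ ∈ (0, 1), and initial step α₀ as assumed constants, rather than the exact statement of [14, Proposition 4.2].

```latex
% Generic argument (assumed constants; not the exact form of [14, Prop. 4.2]).
% The descent lemma for an L-Lipschitz gradient gives
\[
  J(x_n + \alpha d_n) \le J(x_n) + \alpha \nabla J(x_n)^{\top} d_n
    + \tfrac{L}{2}\,\alpha^2 \lVert d_n \rVert^2 ,
\]
% so the Armijo condition
% J(x_n + \alpha d_n) \le J(x_n) + \sigma \alpha \nabla J(x_n)^{\top} d_n
% already holds whenever
\[
  \alpha \le \frac{2(1-\sigma)\,\lvert \nabla J(x_n)^{\top} d_n \rvert}
                  {L \,\lVert d_n \rVert^2} .
\]
% Backtracking by the factor \delta from \alpha_0 therefore terminates with
\[
  \alpha_n \;\ge\; \alpha_{\min}
    := \min\!\Bigl\{ \alpha_0,\;
       \frac{2\delta(1-\sigma)\,\lvert \nabla J(x_n)^{\top} d_n \rvert}
            {L \,\lVert d_n \rVert^2} \Bigr\} \;>\; 0 ,
\]
% provided the ratio |∇J(x_n)ᵀ d_n| / ‖d_n‖² stays bounded away from zero
% along the iterates.
```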
“…However, our approach differs in the objective function, which considers shearlets instead of wavelets, and in the aim, since our goal is to compare the different regularization terms proposed, in order to identify the model which best provides the desired features of the image to be reconstructed. Moreover, for the solution of the minimization problems we exploit a very recently proposed algorithm belonging to the class of proximal-gradient techniques, the variable metric inexact line-search algorithm (VMILA) [6], in place of a (split) augmented Lagrangian. VMILA is a proximal-gradient method which enables the inexact computation of the proximal point defining the descent direction and guarantees the sufficient decrease of the objective function by means of an Armijo-like backtracking procedure.…”
Section: Introduction (mentioning)
confidence: 99%