2022
DOI: 10.1016/j.apnum.2022.07.008
Nonmonotone diagonally scaled limited-memory BFGS methods with application to compressive sensing based on a penalty model

Cited by 4 publications (5 citation statements)
References 61 publications

“…For standard L-BFGS the use of a diagonal seed matrix other than a scaled identity has been studied in [37,24,51,49,42,32,12,16,35,8], but convergence is usually shown for strongly convex objectives or not at all, except in [35], where global convergence is proved for the non-convex case in the sense that $\liminf_{k\to\infty} \|\nabla J(x_k)\| = 0$. In contrast, we have $\lim_{k\to\infty} \|\nabla J(x_k)\| = 0$ in that case and a linear rate of convergence, cf.…”
Section: Related Work
confidence: 99%
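
The distinction drawn in this citing passage is where diagonal information enters standard L-BFGS: the seed (initial) matrix, usually a scaled identity $\gamma_k I$, is replaced by a general positive diagonal. A minimal sketch of the resulting two-loop recursion is given below, assuming NumPy; the names (two_loop_diagonal_seed, d0) are illustrative and not taken from the paper or the works it cites.

import numpy as np

def two_loop_diagonal_seed(grad, s_list, y_list, d0):
    # L-BFGS two-loop recursion in which the seed inverse Hessian H_0
    # is the diagonal matrix diag(d0) instead of a scaled identity.
    #   grad   : current gradient g_k, shape (n,)
    #   s_list : stored steps s_i = x_{i+1} - x_i, oldest first
    #   y_list : stored gradient differences y_i = g_{i+1} - g_i
    #   d0     : positive diagonal of H_0 (scaled identity: d0 = gamma * ones(n))
    # Returns the quasi-Newton direction -H_k g_k.
    rhos = [1.0 / np.dot(y, s) for s, y in zip(s_list, y_list)]
    q = grad.copy()
    alphas = []
    for s, y, rho in reversed(list(zip(s_list, y_list, rhos))):  # newest to oldest
        a = rho * np.dot(s, q)
        alphas.append(a)
        q -= a * y
    r = d0 * q                                                   # apply the diagonal seed
    for (s, y, rho), a in zip(zip(s_list, y_list, rhos), reversed(alphas)):
        b = rho * np.dot(y, r)                                   # oldest to newest
        r += (a - b) * s
    return -r
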
“…4) The step size $\alpha_k$ is, for all $k$, computed by Armijo with backtracking (7) or according to the Wolfe–Powell conditions (8). In the first case, we suppose in addition that there is $\delta > 0$ such that $J$ or $\nabla J$ is uniformly continuous in $\Omega_\delta$.…”
Section: Global Convergence of Algorithm Rose
confidence: 99%
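
The quoted assumption 4) concerns the step-size rule. For reference, a plain Armijo backtracking search of the kind alluded to (not necessarily the exact rule (7) of the citing paper, whose constants are not reproduced in the excerpt) can be sketched as follows; the parameter names c and tau are illustrative.

import numpy as np

def armijo_backtracking(J, grad_x, x, d, alpha0=1.0, c=1e-4, tau=0.5, max_trials=50):
    # Shrink the trial step until the Armijo (sufficient decrease) condition
    #   J(x + alpha*d) <= J(x) + c * alpha * grad_x^T d
    # holds; d is assumed to be a descent direction (grad_x^T d < 0).
    fx, slope = J(x), grad_x.dot(d)
    alpha = alpha0
    for _ in range(max_trials):
        if J(x + alpha * d) <= fx + c * alpha * slope:
            return alpha
        alpha *= tau
    return alpha  # smallest trial step if the trial budget is exhausted

# Example: one Armijo step for J(x) = ||x||^2 along the steepest-descent direction.
x = np.array([1.0, -2.0])
g = 2.0 * x
alpha = armijo_backtracking(lambda z: z.dot(z), g, x, -g)
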
“…where $t > 0$ is the Dai–Liao parameter [19]. In another attempt, Babaie-Kafaki et al. [11] suggested the following penalized version of (2.5): min…”
Section: A Diagonal ADMM-Based Quasi-Newton Update
confidence: 99%
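
For context, the Dai–Liao parameter $t$ comes from the extended conjugacy condition of Dai and Liao. A standard statement in the usual notation (which may differ in minor details from the citing paper's (2.5), not reproduced in the excerpt) is
\[
  d_k^{\top} y_{k-1} \;=\; -t\, g_k^{\top} s_{k-1}, \qquad t > 0,
  \qquad s_{k-1} = x_k - x_{k-1}, \quad y_{k-1} = g_k - g_{k-1},
\]
and setting $t = 0$ recovers the classical conjugacy condition $d_k^{\top} y_{k-1} = 0$.
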
“…It can be observed that MMLBFGS and MLMBFGS generate descent directions regardless of the line search. Furthermore, we include in our comparisons the diagonal QN methods proposed in [6], [11] and [26], referred to here as DQNBN1, DQNBN2 and DQNMSG, respectively. For DQNADMM, we set $\rho = 10$ and $\mu_0 = 1$ in (2.9).…”
Section: Numerical Experiments
confidence: 99%