2020
DOI: 10.4310/cms.2020.v18.n1.a10

An efficient and globally convergent algorithm for $\ell_{p,q} - \ell_r$ model in group sparse optimization

Abstract: Group sparsity combines the underlying sparsity of the data with its group structure. We develop a proximally linearized algorithm, InISSAPL, for the non-Lipschitz group sparse $\ell_{p,q}$-$\ell_r$ optimization problem. The algorithm gives a unified framework for all the parameters $p \ge 1$, $0 < q < 1$, $1 \le r \le \infty$, which makes it applicable to different kinds of measurement noise. In particular, it includes the combination of the non-smooth $\ell_{1,q}$ regularization term and the non-smooth $\ell_1$/$\ell_\infty$ fidelity term as special cases. It allows an i…
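For context, based on the title and the parameter ranges stated in the abstract, the $\ell_{p,q}$-$\ell_r$ model plausibly has the following form (the placement of the regularization parameter $\lambda$ and the group partition $\{\mathcal{G}_i\}$ are assumptions, not taken from the paper):

$$\min_{u \in \mathbb{R}^n} \; \|u\|_{p,q}^q + \lambda \,\|Au - b\|_r, \qquad \|u\|_{p,q} := \Big(\sum_{i=1}^m \|u_{\mathcal{G}_i}\|_p^q\Big)^{1/q},$$

with $p \ge 1$, $0 < q < 1$, $1 \le r \le \infty$, where $u_{\mathcal{G}_i}$ denotes the subvector of $u$ on group $\mathcal{G}_i$; the cases $r \in \{1, \infty\}$ give the non-smooth fidelity terms mentioned above.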

Cited by 11 publications (18 citation statements); references 37 publications.

Citation statements:
“…Theorem 3.1 naturally motivates an iterative support shrinkage procedure, like [30,44,45,47] for different signal and image recovery problems. Given u k , it computes u k+1 by solving the following constrained problem:…”
Section: The Algorithm (mentioning, confidence: 99%)
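The constrained subproblem itself is elided in the quotation above. The following is a minimal sketch of a generic iterative support shrinkage outer loop, under the assumption that each iterate is obtained from a subproblem restricted to the current support (a plain restricted least-squares stand-in, not the cited subsolver):

```python
import numpy as np

def iterative_support_shrinkage(A, b, u0, tol=1e-8, max_iter=50):
    """Generic support-shrinkage loop: supp(u^{k+1}) stays inside supp(u^k)."""
    u = u0.copy()
    for _ in range(max_iter):
        support = np.flatnonzero(np.abs(u) > tol)  # current support of u^k
        if support.size == 0:
            return u
        # Stand-in subproblem: least squares restricted to the support.
        coef, *_ = np.linalg.lstsq(A[:, support], b, rcond=None)
        u_next = np.zeros_like(u)
        u_next[support] = coef
        u_next[np.abs(u_next) <= tol] = 0.0  # drop near-zero entries (shrink)
        if np.linalg.norm(u_next - u) <= tol * max(1.0, np.linalg.norm(u)):
            return u_next
        u = u_next
    return u
```

The key invariant is that the support can only shrink from one outer iteration to the next, which is what the quoted Theorem 3.1-style results exploit.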
“…We mention that this support shrinking strategy was recently derived and used with the iteratively reweighted ℓ 1 [10,24] or least squares [18,34] algorithmic structure, for different signal and image processing problems [23,38,53,55,57,60].…”
Section: Motivation and Algorithm (mentioning, confidence: 99%)
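As a reminder of the reweighting structure referred to here, a textbook iteratively reweighted least squares (IRLS) weight update for an $\ell_q$ penalty with $0 < q < 1$ looks like the following (a generic sketch with an assumed smoothing parameter eps, not the cited algorithms):

```python
import numpy as np

def irls_weights(u, q=0.5, eps=1e-6):
    # w_i = (u_i^2 + eps)^(q/2 - 1), so that w_i * u_i^2 ~ |u_i|^q for
    # small eps; small entries receive large weights and are driven
    # toward zero, which is what shrinks the support across iterations.
    return (u ** 2 + eps) ** (q / 2.0 - 1.0)
```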
“…In [55,57], the authors proposed a linearized proximal technique with an exact subsolver for each outer iteration. Inspired by [3,38,53,58,60] for different minimization problems, we propose the following inexact iterative support shrinking algorithm with linearization and projection, which is expected to solve the inner subproblem inexactly and terminate it by a subgradient error criterion. Please note that we recall the indicator function of a set U as…”
Section: Motivation and Algorithm (mentioning, confidence: 99%)
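A rough sketch of the two ingredients named in this quotation, the indicator function of a set $U$ and an inner loop terminated by a subgradient-type error criterion, might look as follows (the function names, the residual used as the error measure, and the tolerance schedule are all assumptions):

```python
import numpy as np

def indicator(u, in_U):
    """Indicator function of a set U: 0 on U, +infinity outside."""
    return 0.0 if in_U(u) else np.inf

def inexact_inner_solve(step, residual, u, tol=1e-4, max_inner=200):
    """Iterate an inner update until a subgradient-type criterion holds.

    step:     one linearized proximal update u -> u_next (assumed given),
    residual: maps u to an error measure standing in for the distance of
              zero to the subdifferential of the inner objective at u.
    """
    for _ in range(max_inner):
        u = step(u)
        if residual(u) <= tol:  # inexact termination criterion
            break
    return u
```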
“…The data fidelity of the form $\|Ax - b\|_1$ has been shown to be more robust when the noise is not normal but heavy-tailed or heterogeneous, see e.g., Lu (2014); Wang (2013); Xiu et al (2018). Besides, the data fidelity of the form $\|Ax - b\|_\infty$ is known to be well suited to uniformly distributed noise and quantization error, see e.g., Wen et al (2018); Xue et al (2019); Zhang and Wei (2015). Therefore, a natural question is whether one can design a more flexible and robust reconstruction model, together with an efficient algorithm, capable of dealing with all three types of noise mentioned above?…”
Section: Introduction (mentioning, confidence: 99%)
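To make the noise/fidelity pairing in this quotation concrete: $\ell_2$ fits Gaussian noise, $\ell_1$ heavy-tailed or impulsive noise, and $\ell_\infty$ uniformly bounded noise such as quantization error. A small illustrative comparison on synthetic data (not from any of the cited papers):

```python
import numpy as np

def fidelity(A, x, b, r):
    """Data-fidelity term ||Ax - b||_r for r in {1, 2, np.inf}."""
    return np.linalg.norm(A @ x - b, ord=r)

rng = np.random.default_rng(0)
A = rng.standard_normal((200, 20))
x = rng.standard_normal(20)
noise = {
    "uniform (quantization-like)": rng.uniform(-0.1, 0.1, size=200),
    "heavy-tailed (Student t, df=1)": 0.1 * rng.standard_t(df=1, size=200),
}
for name, e in noise.items():
    b = A @ x + e
    vals = {r: round(fidelity(A, x, b, r), 3) for r in (1, 2, np.inf)}
    print(name, vals)
```

Under heavy-tailed noise a few large residuals dominate the $\ell_2$ and $\ell_\infty$ values while $\ell_1$ grows moderately, which is the robustness property the quotation refers to; under bounded uniform noise the $\ell_\infty$ value stays small, matching its suitability for quantization error.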