2014 · DOI: 10.1007/s10915-014-9930-1

Computing Sparse Representation in a Highly Coherent Dictionary Based on Difference of $L_1$ and $L_2$

Cited by 152 publications (104 citation statements) · References 21 publications

“…The former is recently proposed in [22,40] as an alternative to $L_1$ for CS, and the latter is often used in statistics and machine learning [33,41]. Numerical simulations show that $L_1$ minimization often fails when MSF < 1, in which case we demonstrate that both $L_{1-2}$ and $CL_1$ outperform the classical $L_1$ method.…”
Section: Our Contributions (mentioning)
confidence: 73%
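
For context, the $L_{1-2}$ model referenced in this statement is the constrained difference-of-norms formulation named in the paper's title. A minimal sketch of the standard compressed-sensing setup (the sensing matrix $A$ and measurement vector $b$ follow the usual notation and are not quoted from the citing paper):

$$
\min_{x \in \mathbb{R}^n} \; \|x\|_1 - \|x\|_2 \quad \text{subject to} \quad Ax = b.
$$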
“…This method gives better results than the classical $L_1$ approaches in the RIP regime and/or incoherent scenario, but it does not work so well for highly coherent CS, as observed in [21,22,40].…”
Section: P via IRLS (mentioning)
confidence: 94%
“…However, since the $\ell_0$-regularized problem is computationally NP-hard, its $\ell_1$-norm relaxed version is usually considered in practice. Recently, $\ell_{1-2}$ regularization has been proposed (Esser et al., 2013; Lou et al., 2014; Yin et al., 2014), and has been shown to provide a sparser result than the widely used $\ell_1$-norm regularization.…”
Section: Methods (mentioning)
confidence: 99%
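
As a point of reference, the unconstrained $\ell_{1-2}$-regularized problem alluded to here is typically written as follows (a sketch assuming a least-squares data-fidelity term and regularization weight $\lambda$, not a formula quoted from the citing paper):

$$
\min_{x} \; \tfrac{1}{2}\|Ax - b\|_2^2 + \lambda \left( \|x\|_1 - \|x\|_2 \right).
$$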
“…which has shown potential in image processing and compressive sensing reconstruction (Lou et al., 2014; Yin et al., 2015) in terms of sparsity and fast convergence. It promotes sparsity of an image, and achieves the smallest value when only one voxel in the image is non-zero.…”
Section: Methods (mentioning)
confidence: 99%
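
To make the sparsity-promoting property described in this statement concrete, below is a small, hypothetical NumPy sketch (illustrative only, not code from any of the cited works). It shows that $\|x\|_1 - \|x\|_2$ equals zero for a vector with a single non-zero voxel and grows with the number of equal-magnitude non-zeros.

```python
import numpy as np

def l1_minus_l2(x):
    """Compute ||x||_1 - ||x||_2, the penalty discussed in the quoted statement."""
    return np.linalg.norm(x, 1) - np.linalg.norm(x, 2)

# A vector with exactly one non-zero voxel attains the minimum value 0.
one_voxel = np.zeros(100)
one_voxel[7] = 3.0
print(l1_minus_l2(one_voxel))  # -> 0.0

# With k equal-magnitude non-zeros the penalty equals k - sqrt(k), which grows
# with k, so minimizing it favors vectors with fewer non-zero entries.
for k in (1, 2, 5, 20):
    x = np.zeros(100)
    x[:k] = 1.0
    print(k, l1_minus_l2(x))
```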