2015
DOI: 10.1109/tsp.2015.2441032
Iterative Reweighted $\ell_2/\ell_1$ Recovery Algorithms for Compressed Sensing of Block Sparse Signals

Abstract: In many applications of compressed sensing the signal is block sparse, i.e., the non-zero elements of the sparse signal are clustered in blocks. Here, we propose a family of iterative algorithms for the recovery of block sparse signals. These algorithms, referred to as iterative reweighted $\ell_2/\ell_1$ minimization algorithms (IR-$\ell_2/\ell_1$), solve a weighted $\ell_2/\ell_1$ minimization in each iteration. Our simulation and analytical results on the recovery of both ideally and approximately block sparse signals show that the propose…
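The abstract is truncated before the algorithmic details, but the general scheme it names is the standard reweighting pattern: solve a weighted $\ell_2/\ell_1$ problem, update the weights from the recovered block norms, and repeat. Below is a minimal sketch of that pattern, not the paper's exact algorithm; the block partition, the weight-update rule $w_i = 1/(\|x_{B_i}\|_2 + \epsilon)$, and the use of cvxpy for the inner convex problem are all assumptions for illustration.

```python
import numpy as np
import cvxpy as cp

def ir_l2l1(A, y, blocks, n_iter=5, eps=1e-3):
    """Illustrative reweighted l2/l1 recovery of x from y = A x.
    `blocks` is a list of index arrays partitioning range(A.shape[1])."""
    w = np.ones(len(blocks))                 # start from unweighted l2/l1
    x_val = np.zeros(A.shape[1])
    for _ in range(n_iter):
        x = cp.Variable(A.shape[1])
        # Weighted l2/l1 objective: sum_i w_i * ||x_{B_i}||_2
        obj = cp.Minimize(sum(w[i] * cp.norm(x[b], 2)
                              for i, b in enumerate(blocks)))
        cp.Problem(obj, [A @ x == y]).solve()
        x_val = x.value
        # Assumed weight update: blocks with small l2 norm get larger
        # weights next round, driving them toward exactly zero.
        w = 1.0 / (np.array([np.linalg.norm(x_val[b]) for b in blocks]) + eps)
    return x_val
```

The reweighting step is what distinguishes the iterative scheme from a single $\ell_2/\ell_1$ solve: each pass sharpens the penalty on blocks that already look inactive, mimicking a non-convex block-sparsity objective with a sequence of convex problems.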

Cited by 21 publications (1 citation statement)
References 49 publications
“…IRLS is a practical framework for the optimization of non-smooth, possibly non-convex, high-dimensional objectives that minimizes quadratic models which majorize these objectives. Due to its ease of implementation and favorable data-efficiency, it has been widely used in compressed sensing [GR97, CY08, DDFG10, LXY13, FPRW16, KMVS21], robust statistics [HW77, AH15, MGJK19], computer vision [CG17, LC22, SWL22], low-rank matrix recovery and completion [FRW11, MF12, KS18, KMV21], and in inverse problems involving group sparsity [CHHL14, ZB15, CHLH18]. Recently, it has been shown [LK23] that dictionary learning techniques can be incorporated into IRLS schemes for sparse and low-rank recovery to allow the learning of a sparsifying dictionary while recovering the solution.…”
mentioning
confidence: 99%
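To make the majorization idea in the statement above concrete, here is a minimal IRLS sketch for $\ell_1$ minimization subject to $Ax = y$, of the classical kind surveyed in the cited works (e.g., [DDFG10]). The smoothing schedule, starting point, and stopping after a fixed iteration count are illustrative assumptions, not a prescription from any of the cited papers.

```python
import numpy as np

def irls_l1(A, y, n_iter=50, eps=1.0):
    """Illustrative IRLS for min ||x||_1 s.t. A x = y (A full row rank)."""
    x = A.T @ np.linalg.solve(A @ A.T, y)    # minimum-norm starting point
    for _ in range(n_iter):
        # Quadratic model majorizing a smoothed l1 at the current iterate:
        # sum_i x_i^2 / (|x_i| + eps). Minimizing it under A x = y is a
        # weighted least-squares problem with the closed-form step below.
        w_inv = np.abs(x) + eps              # diagonal of W^{-1}
        AWi = A * w_inv                      # A @ diag(w_inv), via broadcasting
        x = w_inv * (A.T @ np.linalg.solve(AWi @ A.T, y))
        eps = max(eps / 10.0, 1e-8)          # anneal the smoothing parameter
    return x
```

Each iteration replaces the non-smooth $\ell_1$ objective with a quadratic upper bound that touches it at the current iterate, so every weighted least-squares solve is guaranteed not to increase the smoothed objective.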