2009
DOI: 10.1109/tit.2009.2016006
Subspace Pursuit for Compressive Sensing Signal Reconstruction

Abstract: We propose a new method for the reconstruction of sparse signals with and without noisy perturbations, termed the subspace pursuit algorithm. The algorithm has two important characteristics: low computational complexity, comparable to that of orthogonal matching pursuit techniques when applied to very sparse signals, and reconstruction accuracy of the same order as that of LP optimization methods. The presented analysis shows that in the noiseless setting, the proposed algorithm can exactly reconstruct ar…
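The abstract describes an iterative support-refinement scheme: keep a candidate support of size K, expand it with the K columns most correlated with the current residual, and prune back to the K largest least-squares coefficients. A minimal sketch of that iteration, assuming a generic sensing matrix `Phi` and sparsity level `K` (variable names are illustrative, not from the paper):

```python
import numpy as np

def subspace_pursuit(Phi, y, K, max_iter=20):
    """Sketch of the subspace pursuit (SP) iteration for y = Phi @ x, x K-sparse.

    Phi : (m, n) sensing matrix, y : (m,) measurements, K : sparsity level.
    """
    n = Phi.shape[1]
    # Initialization: the K columns most correlated with y.
    T = np.argsort(np.abs(Phi.T @ y))[-K:]
    y_r = y - Phi[:, T] @ np.linalg.lstsq(Phi[:, T], y, rcond=None)[0]
    for _ in range(max_iter):
        # Expand the support with K new candidates chosen from the residual.
        cand = np.argsort(np.abs(Phi.T @ y_r))[-K:]
        T_big = np.union1d(T, cand)
        x_p = np.linalg.lstsq(Phi[:, T_big], y, rcond=None)[0]
        # Prune back to the K largest least-squares coefficients.
        T_new = T_big[np.argsort(np.abs(x_p))[-K:]]
        y_r_new = y - Phi[:, T_new] @ np.linalg.lstsq(Phi[:, T_new], y, rcond=None)[0]
        # Stop once the residual norm no longer decreases.
        if np.linalg.norm(y_r_new) >= np.linalg.norm(y_r):
            break
        T, y_r = T_new, y_r_new
    x = np.zeros(n)
    x[T] = np.linalg.lstsq(Phi[:, T], y, rcond=None)[0]
    return x
```

In the noiseless setting with a well-conditioned Gaussian matrix and small K, this sketch typically recovers the sparse signal exactly, consistent with the abstract's claim; the per-iteration cost is dominated by the correlations and small least-squares solves, which is the source of the OMP-like complexity.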

Cited by 2,094 publications (1,645 citation statements)
References 21 publications
“…exact recovery of sparse signals and the Lebesgue-type inequalities for these algorithms: the regularized orthogonal matching pursuit (see [12]), the compressive sampling matching pursuit (CoSaMP; see [11]), and the subspace pursuit (SP; see [1]). The OMP is simpler than the CoSaMP and the SP; however, at the time of invention of the CoSaMP and the SP these algorithms provided exact recovery of sparse signals and the Lebesgue-type inequalities for dictionaries satisfying the restricted isometry property (RIP) (see [11] and [1]). The corresponding results for the OMP were not known at that time.…”

Section: Discussion
Confidence: 99%
“…Note that many greedy algorithms have these properties (e.g. [12,19,20]), but relaxation techniques such as ℓ1-minimization [5] or the Dantzig selector [21] are not guaranteed to give a k-sparse result.…”

Section: RIP Dictionaries
Confidence: 99%
“…In the sparse coding stage, a large number of ℓ1-minimization and greedy algorithms (e.g. BP [1], OMP [3] and SP [4]) have been designed to solve the same question: fix the dictionary D and update the coefficients X by min_X ‖Y − DX‖²_F. However, algorithms for the dictionary update pursue different targets.…”

Section: Dictionary Learning and the Framework of SimCO
Confidence: 99%
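The sparse coding step quoted above — fix the dictionary D and update each column of X under a sparsity constraint — is typically solved column-by-column with a greedy pursuit. A minimal sketch using OMP as the per-column solver (the function names and test dimensions are illustrative assumptions, not taken from the cited papers):

```python
import numpy as np

def omp(D, y, K):
    """Greedy OMP sketch: pick K atoms of D to approximate y."""
    idx = []
    r = y.copy()
    for _ in range(K):
        # Select the atom most correlated with the current residual.
        idx.append(int(np.argmax(np.abs(D.T @ r))))
        # Re-fit all selected atoms jointly and update the residual.
        coef, *_ = np.linalg.lstsq(D[:, idx], y, rcond=None)
        r = y - D[:, idx] @ coef
    x = np.zeros(D.shape[1])
    x[idx] = coef
    return x

def sparse_code(D, Y, K):
    """Fix dictionary D and update X column-by-column: min_X ||Y - D X||_F^2
    subject to each column of X being K-sparse."""
    return np.column_stack([omp(D, Y[:, j], K) for j in range(Y.shape[1])])
```

Swapping `omp` for a subspace pursuit or basis pursuit solver changes only the per-column routine; the dictionary-update stage, which the quoted passage contrasts, is a separate step.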
“…Algorithms developed for sparse coding include Basis Pursuit (BP) [1], Matching Pursuit (MP) [2], Orthogonal Matching Pursuit (OMP) [3], Subspace Pursuit (SP) [4], Gradient Pursuit (GP) [5], etc.…”

Section: Introduction
Confidence: 99%