2009
DOI: 10.1016/j.acha.2008.07.002
CoSaMP: Iterative signal recovery from incomplete and inaccurate samples

Abstract: Compressive sampling offers a new paradigm for acquiring signals that are compressible with respect to an orthonormal basis. The major algorithmic challenge in compressive sampling is to approximate a compressible signal from noisy samples. This paper describes a new iterative recovery algorithm called CoSaMP that delivers the same guarantees as the best optimization-based approaches. Moreover, this algorithm offers rigorous bounds on computational cost and storage. It is likely to be extremely efficient for practical problems…
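
The abstract describes CoSaMP as an iterative greedy recovery routine: form a signal proxy from the current residual, merge its largest components with the current support, solve a least-squares problem on the merged support, and prune back to the target sparsity. The sketch below is a minimal NumPy reading of that loop, assuming a dense sampling matrix Phi, noisy samples u, and a known target sparsity s; the function name, iteration cap, and stopping rule are illustrative choices, not the paper's exact specification.

import numpy as np

def cosamp(Phi, u, s, max_iter=50, tol=1e-10):
    # Minimal sketch of a CoSaMP-style iteration (illustrative, not the
    # paper's pseudocode). Phi: (m, n) sampling matrix, u: (m,) noisy
    # samples, s: target sparsity of the approximation.
    m, n = Phi.shape
    a = np.zeros(n)                                   # current s-sparse approximation
    for _ in range(max_iter):
        v = u - Phi @ a                               # residual of the current samples
        y = Phi.T @ v                                 # signal proxy
        omega = np.argsort(np.abs(y))[-2 * s:]        # 2s largest proxy components
        T = np.union1d(omega, np.flatnonzero(a))      # merge with the current support
        b = np.zeros(n)
        b[T], *_ = np.linalg.lstsq(Phi[:, T], u, rcond=None)  # least-squares fit on T
        keep = np.argsort(np.abs(b))[-s:]             # prune back to the s largest entries
        a = np.zeros(n)
        a[keep] = b[keep]
        if np.linalg.norm(u - Phi @ a) <= tol:        # stop once the samples are explained
            break
    return a

Each pass costs a few matrix-vector multiplies with Phi and its transpose plus one small least-squares solve, which is the kind of per-iteration work the abstract's efficiency claim refers to.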

Cited by 3,758 publications (3,182 citation statements)
References 30 publications (60 reference statements)
“…of exact recovery of sparse signals and the Lebesgue-type inequalities for these algorithms: the regularized orthogonal matching pursuit (see [12]), the compressive sampling matching pursuit (CoSaMP; see [11]), and the subspace pursuit (SP; see [1]). The OMP is simpler than CoSaMP and SP; however, at the time of the invention of CoSaMP and SP, these algorithms provided exact recovery of sparse signals and Lebesgue-type inequalities for dictionaries satisfying the restricted isometry property (RIP) (see [11] and [1]). The corresponding results for the OMP were not known at that time.…”
Section: Discussion (mentioning)
confidence: 99%
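
The excerpt above compares OMP with CoSaMP and SP in terms of dictionaries satisfying the restricted isometry property (RIP). For reference, the standard order-s RIP condition on a sampling matrix \Phi (a textbook definition, not a quotation from the cited works) reads

\[
  (1 - \delta_s)\,\|x\|_2^2 \;\le\; \|\Phi x\|_2^2 \;\le\; (1 + \delta_s)\,\|x\|_2^2
  \quad \text{for every } s\text{-sparse } x,
\]

where \delta_s is the smallest constant for which the two-sided bound holds. The recovery guarantees cited for CoSaMP and SP in [11] and [1] are stated under bounds on such constants.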
“…Then the following experiments use the same settings, where the model hyperparameters for the priors in CluSS are set as […]. Then several experiments widely considered in the CS literature are implemented via CluSS, and comparisons are made to the state-of-the-art CS algorithms, respectively Basis Pursuit (BP) [31], CoSaMP [32], Block-CoSaMP [10], the (K, S)-sparse recovery algorithm via Dynamic Programming (Block-DP) [12], and Bayesian Compressive Sensing (BCS) [21]. Unless otherwise specified, the sensing matrix Φ is constructed randomly as in the seminal work [2], i.e., its entries are drawn independently from a Gaussian distribution…”
Section: Methods (mentioning)
confidence: 99%
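
The excerpt constructs the sensing matrix Φ with independent Gaussian entries, which pairs naturally with the greedy recovery sketch given after the abstract. The snippet below illustrates that setup end to end; the problem sizes, noise level, and the 1/sqrt(m) scaling are illustrative assumptions rather than values taken from the cited experiments, and cosamp refers to the sketch above.

import numpy as np

rng = np.random.default_rng(0)
m, n, s = 64, 256, 8                                  # illustrative sizes, not from the excerpt
Phi = rng.standard_normal((m, n)) / np.sqrt(m)        # i.i.d. Gaussian sensing matrix

x = np.zeros(n)                                       # synthetic s-sparse test signal
support = rng.choice(n, size=s, replace=False)
x[support] = rng.standard_normal(s)

u = Phi @ x + 1e-4 * rng.standard_normal(m)           # noisy compressive samples
x_hat = cosamp(Phi, u, s)                             # greedy recovery (sketch above)
print(np.linalg.norm(x - x_hat) / np.linalg.norm(x))  # relative reconstruction error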
“…Thus it is an alternative choice in highly dynamic environments where it is difficult, if not impossible, to obtain a sufficient number of snapshots for fixed DOAs. Representative compressive sensing algorithms include convex relaxation algorithms [6,7] and greedy algorithms [8,9]. It has been proved that compressive sensing algorithms can accurately recover the sparse signal under certain conditions.…”
Section: Introduction (mentioning)
confidence: 99%
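
The excerpt groups sparse-recovery methods into convex relaxation and greedy algorithms. The convex-relaxation route typically solves a basis-pursuit-type program, shown here in its standard noise-aware form (a generic formulation, not necessarily the exact one used in [6,7]):

\[
  \hat{x} \;=\; \arg\min_{x} \|x\|_1
  \quad \text{subject to} \quad \|\Phi x - u\|_2 \le \varepsilon ,
\]

whereas greedy methods such as OMP and CoSaMP instead build the support of x iteratively from correlations with the residual.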