2009
DOI: 10.1109/tit.2009.2032726
Necessary and Sufficient Conditions for Sparsity Pattern Recovery

Abstract: The problem of detecting the sparsity pattern of a k-sparse vector in R^n from m random noisy measurements is of interest in many areas such as system identification, denoising, pattern recognition, and compressed sensing. This paper addresses the scaling of the number of measurements m, with signal dimension n and sparsity level k, for asymptotically reliable detection. We show that a necessary condition for perfect recovery at any given SNR, for all algorithms regardless of complexity, is m = Ω(k log(n −…
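The scaling in the abstract can be illustrated numerically: with m on the order of k log(n − k) Gaussian measurements and nonzero magnitudes bounded away from zero, a simple greedy decoder typically recovers the sparsity pattern exactly. The sketch below uses Orthogonal Matching Pursuit as the decoder; the constant 4 in the measurement count, the dimensions, and the noise level are illustrative assumptions, not values from the paper.

```python
import numpy as np

def omp_support(A, y, k):
    """Estimate the support of a k-sparse vector via Orthogonal Matching Pursuit."""
    residual = y.copy()
    support = []
    for _ in range(k):
        # pick the column most correlated with the current residual
        j = int(np.argmax(np.abs(A.T @ residual)))
        support.append(j)
        # least-squares re-fit on the chosen columns, then update the residual
        As = A[:, support]
        coef, *_ = np.linalg.lstsq(As, y, rcond=None)
        residual = y - As @ coef
    return set(support)

rng = np.random.default_rng(0)
n, k = 256, 5
m = int(4 * k * np.log(n - k))            # measurements scaled as ~ k log(n - k)
true_support = set(rng.choice(n, size=k, replace=False).tolist())
x = np.zeros(n)
x[list(true_support)] = 1.0 + rng.random(k)  # magnitudes bounded away from zero
A = rng.standard_normal((m, n)) / np.sqrt(m)
y = A @ x + 0.01 * rng.standard_normal(m)    # small measurement noise
print(omp_support(A, y, k) == true_support)
```

At a lower SNR or with m well below k log(n − k), the same decoder starts missing small coefficients, which is the regime the paper's necessary condition addresses.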

Cited by 183 publications (243 citation statements) | References 28 publications
“…Sufficient and necessary conditions are developed for different sparsity levels. Using the same decoder, Fletcher et al [19] further improved the necessary condition in certain settings. Wang et al [20] also presented a set of necessary conditions for exact support recovery.…”
Section: Introduction
confidence: 99%
“…Recovery of the sparsity pattern with vanishing error probability is studied in a number of recent works such as [1], [2], [14], [27], [39], [40]. When k = Σ_{i=1}^{n} b_i, the number of nonzero coefficients in v, is known beforehand and their magnitude is bounded away from zero, exact support recovery requires that the number of measurements grow as k log n [14], [40].…”
Section: B. Existing Results
confidence: 99%
“…When k = Σ_{i=1}^{n} b_i, the number of nonzero coefficients in v, is known beforehand and their magnitude is bounded away from zero, exact support recovery requires that the number of measurements grow as k log n [14], [40]. If the support recovery error rate is allowed to be non-vanishing, fewer measurements are necessary.…”
Section: B. Existing Results
confidence: 99%
“…The model (11) for RD-MUD has a similar form to the observation model in the compressed sensing literature [13], [22], except that the noise in the RD-MUD front-end output is colored. Hence, to recover b, we can combine ideas developed in the context of compressed sensing and MUD.…”
Section: B. RD-MUD Detection
confidence: 99%
“…While such sparsity has been exploited in various detection settings, there is still a gap in applying these ideas to the multiuser setting we consider here. Most existing work on exploiting compressed sensing [11], [12] for signal detection assumes discrete signals and then applies compressed sensing via matrix multiplication [8], [13], [14], [15], [16]. In contrast, in multiuser detection the received signal is continuous.…”
Section: Introduction: Multiuser Detection (MUD)
confidence: 99%