2010 IEEE International Conference on Acoustics, Speech and Signal Processing 2010
DOI: 10.1109/icassp.2010.5495919
Coherence-based near-oracle performance guarantees for sparse estimation under Gaussian noise

Abstract: We consider the problem of estimating a deterministic sparse vector x_0 from underdetermined measurements Ax_0 + w, where w represents white Gaussian noise and A is a given deterministic dictionary. We analyze the performance of three sparse estimation algorithms: basis pursuit denoising, orthogonal matching pursuit, and thresholding. These approaches are shown to achieve near-oracle performance with high probability, assuming that x_0 is sufficiently sparse. Our results are non-asymptotic and are based only o…
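As a concrete illustration of the setup in the abstract, here is a minimal NumPy sketch of the simplest of the three analyzed schemes, thresholding: correlate the dictionary atoms with the noisy measurements, keep the k largest correlations, and least-squares fit on that support. The function name and interface are illustrative, not taken from the paper.

```python
import numpy as np

def thresholding_estimate(A, y, k):
    """Estimate a k-sparse x0 from y = A @ x0 + w by simple thresholding:
    pick the k atoms most correlated with y, then solve least squares
    restricted to that support (a sketch, not the paper's exact formulation)."""
    corr = np.abs(A.T @ y)            # correlation of each atom with y
    support = np.argsort(corr)[-k:]   # indices of the k largest correlations
    x_hat = np.zeros(A.shape[1])
    sol, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
    x_hat[support] = sol
    return x_hat
```

The paper's guarantees say this estimate is close to the oracle solution with high probability when x_0 is sparse enough relative to the dictionary's coherence.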

Cited by 6 publications (11 citation statements); references 9 publications.
“…Most algorithms provide an acceptable bound on the error between y and s [17][18][19][20][21][22][23][24][25][26]. The error bound is derived from the noise characteristics, such as bounded noise, Gaussian noise, finite-variance noise, etc.…”
Section: Introduction (mentioning)
confidence: 99%
“…ROMP [20,26] and compressive sampling matching pursuit (CoSaMP) [24,26] have the same stability guarantees as the ℓ1-minimization method while providing the speed of a greedy algorithm. In [25], the authors used the mutual coherence of the matrix to analyze the performance of BPDN, OMP, and iterative hard thresholding (IHT) when y was corrupted by Gaussian noise. The equivalent cost function in BPDN was solved through IHT in [27].…”
Section: Introduction (mentioning)
confidence: 99%
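The mutual coherence that [25] (the paper under discussion) uses in its analysis is the largest absolute normalized inner product between distinct dictionary columns. A small sketch, assuming a NumPy array whose columns are the atoms:

```python
import numpy as np

def mutual_coherence(A):
    """Mutual coherence mu(A): the largest absolute normalized inner
    product between two distinct columns of the dictionary A."""
    An = A / np.linalg.norm(A, axis=0)   # normalize each column
    G = np.abs(An.T @ An)                # absolute Gram matrix
    np.fill_diagonal(G, 0.0)             # exclude self inner products
    return G.max()
```

Coherence-based guarantees of the kind proved in [25] require the sparsity level to be small relative to 1/mu(A); for an orthonormal basis mu(A) = 0, while overcomplete dictionaries have strictly positive coherence.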
“…The final step is the reconstruction. There are two major reconstruction approaches: ℓ1-minimization [5][6][7][8] and greedy algorithms [10][11][12][13][14][31]. Convex optimization is applied in the reconstruction by the ℓ1-minimization approach.…”
Section: Compressed Sensing (mentioning)
confidence: 99%
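As a minimal sketch of the greedy family, here is orthogonal matching pursuit, one of the three algorithms analyzed in the cited paper (illustrative code, not drawn from any of the cited implementations):

```python
import numpy as np

def omp(A, y, k):
    """Orthogonal matching pursuit: greedily add the atom most correlated
    with the current residual, then re-fit all selected atoms by least
    squares before updating the residual."""
    support, r = [], y.copy()
    for _ in range(k):
        j = int(np.argmax(np.abs(A.T @ r)))   # best-matching atom
        if j not in support:
            support.append(j)
        sol, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        r = y - A[:, support] @ sol           # orthogonalized residual
    x_hat = np.zeros(A.shape[1])
    x_hat[support] = sol
    return x_hat
```

The least-squares re-fit at every step is what distinguishes OMP from plain matching pursuit and from the one-shot thresholding scheme.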
“…The error bound is set based on the noise characteristics, such as bounded noise, Gaussian noise, finite-variance noise, etc. [5][6][7][8][9][10][11][12][13][14]. The ℓ0 norm in Equation (4) is relaxed to the ℓ1 norm in reconstruction by Basis Pursuit Denoising (BPDN), whereas it is replaced by heuristic rules in reconstruction by greedy algorithms.…”
Section: Introduction (mentioning)
confidence: 99%
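The ℓ1 relaxation mentioned above (the BPDN / lasso objective) can be solved by many convex methods; one bare-bones illustration is iterative soft-thresholding (ISTA) applied to 0.5·||y − Ax||² + λ·||x||₁. The step size 1/L and the iteration count here are arbitrary choices for the sketch:

```python
import numpy as np

def ista(A, y, lam, n_iter=500):
    """Iterative soft-thresholding for the BPDN/lasso objective
    0.5*||y - A x||^2 + lam*||x||_1."""
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        g = A.T @ (A @ x - y)              # gradient of the smooth term
        z = x - g / L                      # gradient step
        x = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # soft-threshold
    return x
```

The soft-threshold step is exactly where the ℓ1 penalty produces sparsity: coefficients with magnitude below λ/L are set to zero at each iteration.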
“…The bound on the probability of error is obtained by utilizing the oracle estimator, which knows the locations of the nonzero elements of the sparse signal s [116]. It has been pointed out in [9,117] that when the nonzero locations of the signal s are not provided and the estimation is performed under Gaussian noise, there is always a log m gap between the performance of the oracle estimator and that of any practical estimator, and no recovery algorithm can outperform the oracle estimator. Based on this property, and utilizing the Bayesian approach for detection, we propose a theoretical bound on the probability of error in CS, where detection is based on the signal reconstructed from linear measurements and estimation is performed under white Gaussian noise.…”
Section: Introduction (mentioning)
confidence: 99%
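The oracle estimator described above has a simple closed form: ordinary least squares restricted to the true support, which no practical algorithm actually knows. A sketch of this benchmark, against which the coherence-based guarantees are measured (up to the unavoidable log m factor under Gaussian noise):

```python
import numpy as np

def oracle_estimate(A, y, support):
    """Oracle estimator: least squares restricted to the true support of
    the sparse signal, which is unavailable to practical algorithms."""
    x_hat = np.zeros(A.shape[1])
    sol, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
    x_hat[support] = sol
    return x_hat
```

Because it projects the noise onto only the true k-dimensional subspace, the oracle's risk scales with k rather than with the ambient dimension, which is what makes it the natural performance benchmark.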