2016
DOI: 10.1016/j.acha.2016.03.002

Hard thresholding pursuit algorithms: Number of iterations

Abstract: The Hard Thresholding Pursuit algorithm for sparse recovery is revisited using a new theoretical analysis. The main result states that all sparse vectors can be exactly recovered from compressive linear measurements in a number of iterations at most proportional to the sparsity level as soon as the measurement matrix obeys a certain restricted isometry condition. The recovery is also robust to measurement error. The same conclusions are derived for a variation of Hard Thresholding Pursuit, called Graded Hard Thresholding Pursuit […]
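For orientation, the sketch below is a minimal NumPy rendering of the Hard Thresholding Pursuit iteration analyzed in the paper: alternate a hard-thresholded gradient step with a least-squares fit on the selected support. The parameter names, the unit step size, the stopping rule, and the iteration cap are illustrative choices, not taken from the paper; the paper's contribution is the bound on the number of such iterations (proportional to the sparsity level under a restricted isometry condition).

import numpy as np

def hard_thresholding_pursuit(A, y, s, max_iter=None):
    # Illustrative sketch of HTP: recover an s-sparse x from y ≈ A x.
    # max_iter defaults to a small multiple of s, echoing the O(s) iteration bound.
    m, N = A.shape
    if max_iter is None:
        max_iter = 3 * s
    x = np.zeros(N)
    support = np.array([], dtype=int)
    for _ in range(max_iter):
        # Gradient step, then keep the s largest entries in magnitude.
        u = x + A.T @ (y - A @ x)
        new_support = np.sort(np.argsort(np.abs(u))[-s:])
        # Least-squares fit restricted to the selected support.
        x = np.zeros(N)
        x[new_support] = np.linalg.lstsq(A[:, new_support], y, rcond=None)[0]
        # Standard stopping rule: the selected support did not change.
        if np.array_equal(new_support, support):
            break
        support = new_support
    return x

# Example: recovery of a 5-sparse vector from 80 Gaussian measurements.
rng = np.random.default_rng(0)
m, N, s = 80, 200, 5
A = rng.standard_normal((m, N)) / np.sqrt(m)
x_true = np.zeros(N)
x_true[rng.choice(N, s, replace=False)] = rng.standard_normal(s)
print(np.linalg.norm(hard_thresholding_pursuit(A, A @ x_true, s) - x_true))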

Cited by 62 publications (69 citation statements); references 14 publications (26 reference statements).
“…To relax the problem (16), an immediate idea is to replace the binary constraint w ∈ {0,1}^n with the simple restriction w ∈ [0,1]^n. In other words, we replace the feasible set W^(k) of (16) with the polytope P defined in (11). From the proof of Lemma 2.2, we see that P is the convex hull, i.e., the tightest convex relaxation of W^(k).…”
Section: Optimal K-thresholding Algorithms and Their Relaxations (mentioning)
confidence: 99%
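As a concrete illustration of the relaxation described in this excerpt, the sketch below replaces the binary selector w ∈ {0,1}^n (exactly k ones) by the box constraint w ∈ [0,1]^n with sum(w) = k and solves the relaxed selection with SciPy. The quadratic objective ||y − A(w ∘ u)||² and all names here are assumptions made for illustration; the precise problems (11) and (16) and Lemma 2.2 are defined in the citing paper and are not reproduced here.

import numpy as np
from scipy.optimize import minimize

def relaxed_k_selection(A, y, u, k):
    # Hypothetical relaxed k-thresholding step: optimize weights w over the
    # polytope {w in [0,1]^n : sum(w) = k} instead of the binary set W^(k).
    n = u.size
    objective = lambda w: np.sum((y - A @ (w * u)) ** 2)
    w0 = np.full(n, k / n)  # feasible starting point in the interior
    result = minimize(
        objective, w0, method="SLSQP",
        bounds=[(0.0, 1.0)] * n,
        constraints=[{"type": "eq", "fun": lambda w: np.sum(w) - k}],
    )
    return result.x  # fractional weights; thresholding them suggests a support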
“…The problems (1) and (2) are NP-hard in general [46]. The plausible algorithms for such problems can be briefly categorized into the following classes: (i) convex optimization methods (e.g., ℓ1-minimization [19], reweighted ℓ1-minimization [16,31,56], and dual-density-based reweighted ℓ1-minimization [53,54,55]); (ii) heuristic methods (such as matching pursuit [43], orthogonal matching pursuit [44,52], compressive sampling matching pursuit [47], and subspace pursuit [20]); (iii) thresholding methods (e.g., soft thresholding [21,22,24], hard thresholding [6,7,8,30], graded hard thresholding pursuits [10,11], and 'firm' thresholding [51]); (iv) integer programming methods [4].…”
Section: Introduction (mentioning)
confidence: 99%
“…Here γ = sgn(B_{βi,j}^T (I − B_{βi,I} (B_{βi,I}^T B_{βi,I})^{−1} B_{βi,I}^T) y_{βi}), unless j was contained in the support of the solution at the previous step. In this case, we take γ with the opposite sign to the numerator of the first case in (6).…”
Section: Multi-parameter Regularization (mentioning)
confidence: 99%
“…Indeed, provided with the signal's support, the signal entries can be easily recovered at the optimal statistical rate [3]. Therefore, support recovery has been a topic of active and fruitful research in recent years [4,5,6]. One typically considers linear observation model problems of the form (1) Au…”
(mentioning)
confidence: 99%