2022
DOI: 10.1016/j.acha.2021.10.002
Sparse signal recovery from phaseless measurements via hard thresholding pursuit

Cited by 9 publications (22 citation statements)
References 38 publications
“…Actually, the sample complexity of this approach is governed by the gap between the diagonal entries [EY]_jj for j ∈ S and j ∈ S^c; as in (11), this finally leads to the sample complexity m = Ω(s² log n) for spectral initialization [25].…”
Section: Spectral Initialization For Sparse Phase Retrieval
mentioning
confidence: 99%
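The quoted passage refers to a diagonal-based spectral initializer whose sample complexity scales as m = Ω(s² log n). The following is a minimal NumPy sketch of that general idea (estimate the support from weighted diagonal scores, then take a leading eigenvector restricted to that support), assuming real Gaussian measurements and squared-magnitude observations y_i = (a_i^T x)²; the function name and the final scaling step are illustrative choices, not taken from the cited papers.

import numpy as np

def sparse_spectral_init(A, y, s):
    # A: (m, n) real sensing matrix, y: (m,) squared-magnitude measurements
    # y_i = (a_i^T x)^2 (assumed model), s: sparsity level.
    m, n = A.shape
    # Diagonal scores: (1/m) * sum_i y_i * A[i, j]^2 estimates [EY]_jj; the gap
    # between scores on S and on S^c is what drives the m = Omega(s^2 log n)
    # requirement mentioned in the quote.
    diag_scores = (y @ (A ** 2)) / m
    support = np.argsort(diag_scores)[-s:]           # estimated support
    A_S = A[:, support]
    Y_S = (A_S * y[:, None]).T @ A_S / m             # (1/m) sum_i y_i a_{i,S} a_{i,S}^T
    _, eigvecs = np.linalg.eigh(Y_S)
    x0 = np.zeros(n)
    x0[support] = eigvecs[:, -1]                     # leading eigenvector on the support
    x0 *= np.sqrt(max(y.mean(), 0.0))                # match signal energy, since E[y] ~ ||x||^2
    return x0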
“…Those nonconvex methods are usually divided into two stages, namely the initialization stage and the local refinement stage. Provided with an initial guess that is sufficiently close to the underlying signal, nonconvex algorithms including SPARTA [50], CoPRAM [25], thresholded/projected Wirtinger flow [14,42], stochastic alternating minimization (SAM) [10], and hard thresholding pursuit (HTP) [11] are guaranteed to converge at least linearly to the ground truth with the near-optimal sample complexity Ω(s log n). Moreover, algorithms like SAM and HTP are guaranteed to recover the underlying signal exactly in only a few iterations, implying that the local refinement stage can be implemented efficiently.…”
Section: Introduction
mentioning
confidence: 99%
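The two-stage structure described in this quote (initialization followed by a hard-thresholding-based local refinement) can be illustrated with a short sketch. The step below is an assumed, simplified HTP-style iteration for real amplitude measurements y = |Ax|; it is not the exact algorithm of any of the cited works, and the step size and the least-squares re-fit against sign-adjusted measurements are illustrative choices.

import numpy as np

def htp_phase_step(A, y, x, s, step=0.95):
    # One illustrative refinement iteration for amplitude measurements y = |A x_true|:
    # estimate signs, take a gradient step, keep the s largest entries, then
    # re-fit by least squares on the selected support.
    m, n = A.shape
    z = A @ x
    b = np.sign(z)                         # current sign (phase) estimates
    grad = A.T @ (z - b * y) / m           # gradient of (1/2m) * ||A x - b*y||^2
    u = x - step * grad
    support = np.argsort(np.abs(u))[-s:]   # hard thresholding: s largest entries
    x_new = np.zeros(n)
    x_new[support] = np.linalg.lstsq(A[:, support], b * y, rcond=None)[0]
    return x_new

The sign-estimate-then-refit structure mirrors the pursuit-style refinement the quote describes; the sketch only reflects that structure, not the convergence guarantees attributed to SAM and HTP.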
“…Let us consider a signal f that is sparse: f = ψx, where ψ is some transform domain and x has only K non-zero entries [20], [21]. The measurement vector can then be obtained as y = φf = φψx = Ax.…”
Section: D((i
mentioning
confidence: 99%
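As a concrete illustration of this measurement model, the NumPy sketch below builds a K-sparse coefficient vector x, a transform ψ (an orthonormal inverse DCT, chosen here only as an example), and a Gaussian measurement matrix φ, and checks that y = φf = φψx = Ax; none of these specific choices come from the cited work.

import numpy as np
from scipy.fft import idct

n, m, K = 256, 80, 5
rng = np.random.default_rng(0)

x = np.zeros(n)                                    # K-sparse coefficient vector
x[rng.choice(n, size=K, replace=False)] = rng.standard_normal(K)

psi = idct(np.eye(n), norm="ortho", axis=0)        # example transform domain, f = psi @ x
f = psi @ x

phi = rng.standard_normal((m, n)) / np.sqrt(m)     # Gaussian measurement matrix
A = phi @ psi                                      # effective sensing matrix
y = phi @ f                                        # y = phi f = phi psi x = A x

assert np.allclose(y, A @ x)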
“…Sparse phase retrieval: Various algorithms have also been devised for sparse phase retrieval [25]–[27]. In particular, there exist several non-convex optimization-based algorithms, including thresholded/projected WF [24], [28], sparse truncated amplitude flow [29], compressive phase retrieval with alternating minimization [30], and sparse phase retrieval by hard thresholding pursuit [31]. All of these approaches are analyzed under the assumption of using a sensing matrix with i.i.d.…”
Section: A. Related Work
mentioning
confidence: 99%
“…Without loss of generality, we assume that |S| ≥ m/2 (otherwise, we have |T| ≥ m/2 and we can derive an upper bound for ‖q + x‖₂ instead of for ‖q − x‖₂). Expanding the squares in (31), we obtain…”
Section: B. Proof of Theorem
mentioning
confidence: 99%