2022
DOI: 10.1109/tsp.2022.3214091
Sample-Efficient Sparse Phase Retrieval via Stochastic Alternating Minimization

Cited by 4 publications (4 citation statements)
References 44 publications
“…Therefore, the sample efficiency of many two-stage algorithms can be improved when combined with the proposed initialization methods. Examples of such algorithms are SPARTA [50], CoPRAM [25], thresholding/projected Wirtinger flow [14,42], SAM [10] and HTP [11], to just name a few. We use the two-stage HTP algorithm [11] as a typical example to illustrate this.…”
Section: Theoretical Guarantee and Sample Complexity
Confidence: 99%
“…Those nonconvex methods are usually divided into two stages, namely, the initialization stage and the local refinement stage. Provided an initial guess that is sufficiently close to the underlying signal, non-convex algorithms including SPARTA [50], CoPRAM [25], thresholding/projected Wirtinger flow [14,42], stochastic alternating minimization (SAM) [10] and hard thresholding pursuit (HTP) [11] are guaranteed to converge at least linearly to the ground truth with the near-optimal sample complexity Ω(s log n). Moreover, algorithms like SAM and HTP are guaranteed to give an exact recovery of the underlying signal in only a few iterations, implying that the local refinement stage can be efficiently implemented.…”
Section: Introduction
Confidence: 99%
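The two-stage scheme described in the statement above (sign/phase estimation alternated with a hard-thresholded refinement, as in SAM or HTP) can be sketched for the noiseless real-valued model y = |Ax| with an s-sparse x. This is an illustrative toy, not the paper's algorithm: the function names are made up, and the initialization stage is replaced by a point assumed to be already close to the signal.

```python
import numpy as np

def sparse_pr_refine(A, y, s, z0, iters=20):
    """Toy HTP-style local refinement for sparse phase retrieval.

    Model: y = |A x| with an s-sparse x (real case, so the "phase"
    is just a sign). z0 is a stand-in for the output of a separate
    initialization stage and is assumed to be close to x.
    """
    m, n = A.shape
    z = z0.copy()
    for _ in range(iters):
        b = np.sign(A @ z) * y                 # step 1: estimate the missing signs
        g = z - (A.T @ (A @ z - b)) / m        # gradient step on the sign-completed LS loss
        support = np.argsort(np.abs(g))[-s:]   # hard threshold: keep the s largest entries
        z = np.zeros(n)
        z[support] = np.linalg.lstsq(A[:, support], b, rcond=None)[0]  # debias on support
    return z

# Tiny demo with m < n, which the sparse setting allows.
rng = np.random.default_rng(0)
n, m, s = 100, 80, 3
A = rng.standard_normal((m, n))
x = np.zeros(n)
supp = rng.choice(n, size=s, replace=False)
x[supp] = rng.choice([-1.0, 1.0], size=s) * (1.0 + rng.random(s))  # entries bounded away from 0
y = np.abs(A @ x)
z0 = x + 0.01 * rng.standard_normal(n)         # stand-in for the initialization stage
z = sparse_pr_refine(A, y, s, z0)
err = min(np.linalg.norm(z - x), np.linalg.norm(z + x))  # recovery is up to a global sign
```

Once the signs and the support are estimated correctly, the least-squares step returns x exactly, which is consistent with the observation above that methods like SAM and HTP recover the signal in only a few iterations.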
“…Next, a series of breakthrough results [9, 13-15] provided provably valid algorithmic procedures for special cases in which the measurement vectors are drawn at random from certain multivariate probability distributions, such as the Gaussian distribution. Phase retrieval problems [9, 13, 14] require the number of observations m to exceed the problem dimension n. For the sparse phase retrieval problem, however, the true signal can be successfully recovered even when the number of measurements m is less than the signal length n. In particular, a recent paper [16] used a random sampling technique to achieve the best empirical sampling complexity; in other words, it requires fewer measurements than state-of-the-art algorithms for sparse phase retrieval. For more on phase retrieval, interested readers may refer to [17-27].…”
Section: Introduction
Confidence: 99%