2021
DOI: 10.48550/arxiv.2102.12643
Preprint
Provable Compressed Sensing with Generative Priors via Langevin Dynamics

Abstract: Deep generative models have emerged as a powerful class of priors for signals in various inverse problems such as compressed sensing, phase retrieval, and super-resolution. Here, we assume the unknown signal lies in the range of a pre-trained generative model. A popular approach to signal recovery is gradient descent in the low-dimensional latent space. While gradient descent has achieved good empirical performance, its theoretical behavior is not well understood. In this paper, we introduce the use of…
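The latent-space recovery approach the abstract describes can be sketched as follows. This is a minimal illustrative toy, not the paper's method: the linear "generator" `G(z) = W z`, the Gaussian measurement matrix, the step size, and the inverse temperature `beta` are all assumptions chosen so the example runs, and the Langevin update (gradient step plus scaled Gaussian noise) stands in for whatever variant the paper analyzes.

```python
# Toy latent-space Langevin dynamics for compressed sensing with a
# generative prior: recover z minimizing 0.5 * ||A G(z) - y||^2,
# where G is a (hypothetical) linear generator G(z) = W z.
import numpy as np

rng = np.random.default_rng(0)

k, n, m = 5, 100, 40                 # latent dim, signal dim, # measurements
W = rng.standard_normal((n, k))      # toy linear "generator"
A = rng.standard_normal((m, n)) / np.sqrt(m)   # Gaussian measurement matrix

z_true = rng.standard_normal(k)
y = A @ W @ z_true                   # noiseless measurements of x = G(z_true)

AW = A @ W

def loss_grad(z):
    """Gradient of 0.5 * ||A G(z) - y||^2 with respect to z."""
    return AW.T @ (AW @ z - y)

eta = 1.0 / np.linalg.norm(AW, 2) ** 2   # step size from the smoothness constant
beta = 1e8                               # inverse temperature (large -> near-deterministic)

z = np.zeros(k)
for _ in range(2000):
    noise = np.sqrt(2.0 * eta / beta) * rng.standard_normal(k)
    z = z - eta * loss_grad(z) + noise   # Langevin update

recovery_error = np.linalg.norm(W @ z - W @ z_true) / np.linalg.norm(W @ z_true)
print(round(recovery_error, 3))
```

With `beta` large the injected noise is negligible and the iterates behave like plain gradient descent, which is why this toy recovers the signal almost exactly; the interest of Langevin-style analyses is precisely that the added noise lets the theory handle non-convex latent objectives where vanilla gradient descent lacks guarantees.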

Cited by 2 publications (2 citation statements)
References 25 publications (49 reference statements)
“…Two further works gave guarantees that require neither random weights nor exact projections, but instead adopt further deterministic assumptions on G, roughly amounting to certain forms of smoothness. Specifically, [46] studied an algorithm based on Alternating Direction Method-of-Multipliers (ADMM), and [100] studied an algorithm based on Langevin dynamics. Algorithms based on Langevin dynamics have also been explored in several other works on inverse problems and beyond, e.g., see [67] for its use in posterior sampling with general probabilistic priors, and [106] for a general nonasymptotic analysis under non-convex objectives.…”
Section: E Further Developments
confidence: 99%
“…Two further works gave guarantees that require neither random weights nor exact projections, but instead adopt further deterministic assumptions on G, roughly amounting to certain forms of smoothness. Specifically, [42] studied an algorithm based on Alternating Direction Method-of-Multipliers (ADMM), and [93] studied an algorithm based on Langevin dynamics.…”
Section: E Further Developments
confidence: 99%