2021
DOI: 10.48550/arxiv.2112.03860
Preprint
Differentiable Gaussianization Layers for Inverse Problems Regularized by Deep Generative Models

Abstract: Generative networks such as normalizing flows can serve as a learning-based prior to augment inverse problems to achieve high-quality results. However, the latent space vector may not remain a typical sample from the desired high-dimensional standard Gaussian distribution when traversing the latent space during an inversion. As a result, it can be challenging to attain a high-fidelity solution, particularly in the presence of noise and inaccurate physics-based models. To address this issue, we propose to re-pa…
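The core issue the abstract describes can be illustrated numerically: samples from a high-dimensional standard Gaussian concentrate on a thin shell of radius about sqrt(d), so a latent vector that drifts off that shell during inversion is no longer a typical sample. The sketch below (plain NumPy, not the paper's Gaussianization layers, which are differentiable network layers) demonstrates the concentration and uses a simple norm re-projection as a stand-in for re-typicalizing the latent; the function name and the rescaling heuristic are illustrative assumptions, not the authors' method.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 1024

# Typical draws from N(0, I_d) concentrate on a shell of radius ~sqrt(d).
z = rng.standard_normal(d)
print(np.linalg.norm(z), np.sqrt(d))  # both close to 32

# During latent-space inversion, unconstrained gradient updates can pull
# z off this typical shell (e.g. shrinking or inflating its norm).
z_drifted = 3.0 * z  # stand-in for a latent after many unconstrained steps

def project_to_typical_shell(z):
    """Crude proxy for re-Gaussianization: rescale z back to radius sqrt(d)."""
    return z * np.sqrt(len(z)) / np.linalg.norm(z)

z_fixed = project_to_typical_shell(z_drifted)
print(np.linalg.norm(z_fixed))  # back to ~32
```

A norm projection only restores one statistic of typicality; the paper's approach additionally addresses the shape of the latent distribution, which a single rescaling cannot.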

Cited by 1 publication
References 37 publications