2020
DOI: 10.48550/arxiv.2002.00107
Preprint
Generative Modeling with Denoising Auto-Encoders and Langevin Sampling

Abstract: We study convergence of a generative modeling method that first estimates the score function of the distribution using Denoising Auto-Encoders (DAE) or Denoising Score Matching (DSM) and then employs Langevin diffusion for sampling. We show that both DAE and DSM provide estimates of the score of the Gaussian-smoothed population density, allowing us to apply the machinery of Empirical Processes. We overcome the challenge of relying only on L2 bounds on the score estimation error and provide finite-sample bound…
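The pipeline the abstract describes — estimate a score function, then run Langevin diffusion with it — can be sketched as follows. This is a minimal illustration of the unadjusted Langevin algorithm, not the paper's analyzed procedure: the function names and the toy closed-form score are assumptions for the example, and in practice `score` would come from a trained DAE/DSM estimator.

```python
import numpy as np

def langevin_sample(score, x0, step=1e-2, n_steps=1000, rng=None):
    """Unadjusted Langevin algorithm:
    x_{k+1} = x_k + step * score(x_k) + sqrt(2 * step) * noise."""
    rng = np.random.default_rng(rng)
    x = np.array(x0, dtype=float)
    for _ in range(n_steps):
        x = x + step * score(x) + np.sqrt(2 * step) * rng.standard_normal(x.shape)
    return x

# Toy example: for a standard Gaussian target, the score is score(x) = -x,
# so the chain should mix to samples with mean ~0 and variance ~1.
samples = np.array([langevin_sample(lambda x: -x, np.zeros(2), rng=i)
                    for i in range(200)])
```

With a learned score the same loop applies unchanged; the paper's point is that DAE/DSM training yields the score of the Gaussian-smoothed density, so the chain targets that smoothed distribution rather than the population density itself.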

Cited by 11 publications (21 citation statements) · References 11 publications
“…Finally, another notable paper close in spirit to our goal is Block et al [2020], which provides a detailed theoretical analysis of a score-matching generative model using Denoising Autoencoders followed by Langevin diffusion. While their work makes generally weaker assumptions and also includes a non-asymptotic analysis of the sampling algorithm, the resulting rates are unsurprisingly cursed by dimension.…”
Section: Related Work
confidence: 93%
“…Such sampling procedures may be difficult in general, particularly for complex energy landscapes, thus we also consider different estimators based on un-normalized measures which avoid the need for sampling. We focus here on approaches based on minimizing Stein discrepancies [Gorham and Mackey, 2015, Liu and Wang, 2016], which have recently been found to be useful in deep generative models [Grathwohl et al, 2020], though we note that alternative approaches may be used, such as score matching [Hyvärinen, 2005, Song and Kingma, 2021, Song and Ermon, 2019, Block et al, 2020].…”
Section: Introduction
confidence: 99%
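The Stein-discrepancy approach this citation refers to can be made concrete with a small sketch: given samples and a (possibly un-normalized) model's score function, a kernelized Stein discrepancy measures how well the samples fit the model without ever normalizing or sampling from it. The 1-D RBF-kernel estimator below is an assumption-laden illustration, not the estimator of any cited paper.

```python
import numpy as np

def ksd_1d(xs, score, h=1.0):
    """V-statistic estimate of the kernelized Stein discrepancy (1-D, RBF kernel
    k(x, y) = exp(-(x - y)^2 / (2h))). Small values mean the samples are
    consistent with the distribution whose score is `score`."""
    x, y = xs[:, None], xs[None, :]
    d = x - y
    k = np.exp(-d**2 / (2 * h))
    dkx = -d / h * k                      # d k / d x
    dky = d / h * k                       # d k / d y
    dkxy = (1 / h - d**2 / h**2) * k      # d^2 k / (d x d y)
    s = score(xs)
    u = s[:, None] * s[None, :] * k + s[:, None] * dky + s[None, :] * dkx + dkxy
    return u.mean()

rng = np.random.default_rng(0)
good = rng.standard_normal(300)        # drawn from the target N(0, 1)
bad = rng.standard_normal(300) + 3.0   # shifted away from the target
score = lambda x: -x                   # score of N(0, 1); no normalizer needed
```

The key property the quotation relies on: `score` only requires the un-normalized density, so the discrepancy can be evaluated (and minimized during training) without the sampling step it describes as difficult.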
“…However, the main difference is that the drift terms of the Langevin SDEs in (Song & Ermon, 2019; Block et al, 2020) are time-invariant, in contrast to the time-varying drift term in our formulation. As shown in Theorem 9, the benefit of the time-varying drift term is essential: the SDE of the Schrödinger Bridge, run on the unit time interval [0, 1], recovers the target distribution at the terminal time.…”
Section: Related Work
confidence: 98%
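The contrast this citation draws — time-invariant versus time-varying drift — is easy to see in a generic Euler–Maruyama simulator, since both cases are just dX_t = b(X_t, t) dt + sqrt(2) dW_t with different drifts b. The specific drifts below are hypothetical toys for illustration; they are not the Schrödinger Bridge drift from the cited work.

```python
import numpy as np

def euler_maruyama(drift, x0, t0=0.0, t1=1.0, n_steps=200, rng=None):
    """Simulate dX_t = drift(X_t, t) dt + sqrt(2) dW_t on [t0, t1]
    with the Euler-Maruyama scheme; returns the terminal state X_{t1}."""
    rng = np.random.default_rng(rng)
    x = np.array(x0, dtype=float)
    h = (t1 - t0) / n_steps
    for k in range(n_steps):
        t = t0 + k * h
        x = x + drift(x, t) * h + np.sqrt(2 * h) * rng.standard_normal(x.shape)
    return x

# Time-invariant Langevin-style drift: ignores t entirely.
langevin_drift = lambda x, t: -x
# Hypothetical time-varying drift that strengthens as t -> 1.
annealed_drift = lambda x, t: -(1 + 4 * t) * x

x_inv = euler_maruyama(langevin_drift, np.zeros(3), rng=0)
x_var = euler_maruyama(annealed_drift, np.zeros(3), rng=0)
```

The point of the quotation is that with a suitable time-varying drift the process can be engineered to hit the target distribution exactly at t = 1, whereas the time-invariant Langevin SDE only approaches its target as the running time grows.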
“…The consistency of the proposed Schrödinger Bridge rests mainly on mild assumptions (such as smoothness and boundedness), without restrictive technical requirements such as the target distribution being log-concave or satisfying a log-Sobolev inequality (Gao et al, 2020; Arbel et al, 2019; Liutkus et al, 2019; Block et al, 2020).…”
Section: Now We Establish the Consistency of Our Schrödinger Bridge
confidence: 99%