2019
DOI: 10.48550/arxiv.1907.11331
Preprint

Improved Bounds for Discretization of Langevin Diffusions: Near-Optimal Rates without Convexity

Abstract: We present an improved analysis of the Euler-Maruyama discretization of the Langevin diffusion. Our analysis does not require global contractivity, and yields polynomial dependence on the time horizon. Compared to existing approaches, we make an additional smoothness assumption, and improve the existing rate from O(η) to O(η²) in terms of the KL divergence. This result matches the correct order for numerical SDEs, without suffering from exponential time dependence. When applied to algorithms for sampling and…
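For context, the following is a minimal sketch of the Euler-Maruyama discretization discussed in the abstract, i.e. the update underlying the Langevin Monte Carlo algorithm. The potential gradient grad_U, the step size eta, and the Gaussian usage example are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def langevin_euler_maruyama(grad_U, x0, eta, n_steps, rng=None):
    """Euler-Maruyama discretization of dX_t = -grad U(X_t) dt + sqrt(2) dB_t.

    One step: x_{k+1} = x_k - eta * grad_U(x_k) + sqrt(2 * eta) * xi_k,
    with xi_k ~ N(0, I). The per-step discretization bias is the quantity
    whose KL rate the abstract improves from O(eta) to O(eta^2) under an
    extra smoothness assumption.
    """
    rng = np.random.default_rng() if rng is None else rng
    x = np.array(x0, dtype=float)
    traj = [x.copy()]
    for _ in range(n_steps):
        noise = rng.standard_normal(x.shape)
        x = x - eta * grad_U(x) + np.sqrt(2.0 * eta) * noise
        traj.append(x.copy())
    return np.asarray(traj)

# Illustrative usage: sample from a standard Gaussian, where U(x) = ||x||^2 / 2.
if __name__ == "__main__":
    samples = langevin_euler_maruyama(grad_U=lambda x: x,
                                      x0=np.zeros(2), eta=0.01, n_steps=5_000)
    print(samples[-1])
```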

Cited by 15 publications (32 citation statements: 5 supporting, 27 mentioning, 0 contrasting). References 17 publications. Citing publications range from 2019 to 2023.
“…The results significantly improve those obtained e.g. in [DM17] or in [Dal17] in terms of (γ_n)_{n≥1} and seem quite consistent with more recent works (by totally different methods) like [MFWB19] (see Remark 2.4 for details).…”
Section: supporting
confidence: 91%
“…[Pel96], [MP96]) and more recently in a series of papers, still in the additive setting, motivated by applications in machine learning (in particular in Bayesian or PAC-Bayesian statistics). Among others, we refer to [DM17, Dal17, MFWB19] and to the references therein.…”
Section: mentioning
confidence: 99%
“…The study of the long-time behavior of Euler-Maruyama schemes of (6) has been the topic of numerous papers in recent years. Among others, we can refer to [DM15, DM17, DMM19, Dal17, DK19, MFWB19] where the authors generally focus on the (Wasserstein, Total Variation, …”
Section: Ergodic Approximation and Gibbs Approximation (mentioning)
confidence: 99%
“…One notable example is the Langevin Monte Carlo algorithm (LMC), which corresponds to Euler-Maruyama discretization of the overdamped Langevin equation. Its study dates back to at least the 90s (Roberts et al, 1996) but keeps on leading to important discoveries, for example, on non-asymptotics and dimension dependence, which are relevant to machine learning (e.g., Dalalyan (2017a,b); Cheng et al (2018a); Durmus and Moulines (2019); Vempala and Wibisono (2019); Dalalyan and Riou-Durand (2020); Li et al (2019); Erdogdu and Hosseinzadeh (2020); Mou et al (2019)). LMC is closely related to SGD too (e.g., Mandt et al (2017)).…”
Section: Introduction (mentioning)
confidence: 99%
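The statement above notes that LMC is closely related to SGD. A hedged side-by-side sketch of the two updates follows; the function names, step size, and random seed are illustrative assumptions, not taken from the cited works.

```python
import numpy as np

rng = np.random.default_rng(0)

def sgd_step(x, grad, eta):
    # Plain (stochastic) gradient descent: follow the negative gradient.
    return x - eta * grad(x)

def lmc_step(x, grad, eta):
    # Langevin Monte Carlo: the same gradient step plus injected Gaussian
    # noise scaled by sqrt(2 * eta), which turns the optimizer into a
    # sampler targeting the density proportional to exp(-U(x)).
    return x - eta * grad(x) + np.sqrt(2.0 * eta) * rng.standard_normal(x.shape)
```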