2018
DOI: 10.48550/arxiv.1809.04618
Preprint

Global Convergence of Stochastic Gradient Hamiltonian Monte Carlo for Non-Convex Stochastic Optimization: Non-Asymptotic Performance Bounds and Momentum-Based Acceleration

Abstract: Stochastic gradient Hamiltonian Monte Carlo (SGHMC) is a variant of stochastic gradient descent with momentum in which a controlled and properly scaled Gaussian noise is added to the stochastic gradients to steer the iterates towards a global minimum. Many works have reported its empirical success for solving stochastic non-convex optimization problems; in particular, it has been observed to outperform overdamped Langevin Monte Carlo-based methods such as stochastic gradient Langevin dynamics (SGLD) in many applica…
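The update described in the abstract — momentum SGD with injected, properly scaled Gaussian noise, i.e. a discretized underdamped Langevin diffusion — can be sketched as follows. This is a minimal illustration, not the paper's implementation; the function name `sghmc` and the parameter names (`eta` for step size, `gamma` for friction, `beta` for inverse temperature) are illustrative assumptions.

```python
import numpy as np

def sghmc(grad_f, x0, eta=1e-2, gamma=2.0, beta=50.0, n_steps=20000, rng=None):
    """Sketch of SGHMC as momentum SGD plus Gaussian noise
    (a discretized underdamped Langevin diffusion).

    grad_f : (stochastic) gradient oracle for the objective f
    eta    : step size; gamma : friction; beta : inverse temperature
    (names and defaults are illustrative, not taken from the paper)
    """
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x0, dtype=float).copy()
    v = np.zeros_like(x)
    # noise scale chosen so the continuous-time limit samples exp(-beta * f)
    noise_scale = np.sqrt(2.0 * gamma * eta / beta)
    for _ in range(n_steps):
        # momentum update: friction + gradient step + injected Gaussian noise
        v += -eta * (gamma * v + grad_f(x)) + noise_scale * rng.standard_normal(x.shape)
        x += eta * v
    return x

# toy non-convex objective: f(x) = (x^2 - 1)^2, global minima at x = +/-1
grad = lambda x: 4.0 * x * (x**2 - 1.0)
x_final = sghmc(grad, x0=[3.0], rng=np.random.default_rng(0))
```

On this double-well toy problem the iterates settle near one of the two global minima at ±1; the injected noise is what allows the dynamics to traverse the non-convex landscape rather than stall, which is the mechanism the abstract refers to.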

Cited by 17 publications (48 citation statements: 3 supporting, 45 mentioning, 0 contrasting)
References 47 publications
“…Our work continues these lines of research, the most similar setting to ours is the recent paper [GGZ18]. We summarize our contributions below:…”
Section: Related Work and Our Contributions (supporting)
confidence: 53%
“…By introducing the auxiliary SDEs (13) and (14), we are able to achieve the rate O(δ^{1/4} + λ^{1/4}); see Theorem 2.8 for the case p = 2. This upper bound is better in the number of iterations and hence improves Lemma 10 of [GGZ18]. Our analysis for the variance of the algorithm is also different.…”
Section: Related Work and Our Contributions (mentioning)
confidence: 64%