On the Variational Posterior of Dirichlet Process Deep Latent Gaussian Mixture Models
Preprint, 2020
DOI: 10.48550/arxiv.2006.08993

Cited by 1 publication (1 citation statement)
References 0 publications
“…deep nonlinear reward functions in IRL), approximate methods may not scale to high-dimensional parameter spaces. MCMC sampling methods have been shown to converge slowly [7, 29], and variational inference algorithms suffer from restrictions on the distribution family of the observable data, as well as various truncation assumptions on the variational distribution needed to yield a finite-dimensional representation [11, 23]. These limitations evidently make approximate Bayesian inference methods inapplicable to DPM models with deep neural networks.…”
Section: Introduction (mentioning citation)
Confidence: 99%
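
The truncation the quoted passage refers to can be made concrete with a short sketch. Below is a minimal, hypothetical illustration (not code from the cited paper) of a truncated stick-breaking construction for Dirichlet process mixture weights, the standard device for giving the variational posterior a finite-dimensional representation; the truncation level T and the Beta parameterization are illustrative assumptions.

```python
# Minimal sketch of a truncated stick-breaking variational posterior for
# Dirichlet process mixture weights. Illustrative only; T and the Beta
# parameters are assumptions, not values from the cited paper.
import numpy as np

def truncated_stick_breaking_weights(a, b, rng=None):
    """Sample mixture weights from a truncated stick-breaking posterior.

    a, b: arrays of shape (T,), variational Beta(a_t, b_t) parameters
          for the stick fractions v_1..v_T.
    Returns weights pi of shape (T,) that sum to 1 because the final
    stick fraction is fixed to 1 (the truncation assumption).
    """
    rng = rng or np.random.default_rng()
    v = rng.beta(a, b)        # stick fractions v_t ~ Beta(a_t, b_t)
    v[-1] = 1.0               # truncate: remaining mass goes to component T
    # pi_t = v_t * prod_{s<t} (1 - v_s)
    remaining = np.concatenate(([1.0], np.cumprod(1.0 - v[:-1])))
    return v * remaining

# Example: T = 10 components with uninformative Beta(1, 1) sticks.
pi = truncated_stick_breaking_weights(np.ones(10), np.ones(10))
assert np.isclose(pi.sum(), 1.0)
```

Fixing v_T = 1 is exactly what makes the representation finite-dimensional: all mass beyond the first T components is collapsed into the last one, which is the restriction the citing authors flag as a limitation of truncated variational inference for DPM models.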