2016
DOI: 10.1287/mksc.2014.0901

Scalable Rejection Sampling for Bayesian Hierarchical Models

Abstract: Bayesian hierarchical modeling is a popular approach to capturing unobserved heterogeneity across individual units. However, standard estimation methods such as Markov chain Monte Carlo (MCMC) can be impracticable for modeling outcomes from a large number of units. We develop a new method to sample from posterior distributions of Bayesian models, without using MCMC. Samples are independent, so they can be collected in parallel, and we do not need to be concerned with issues like chain convergence and autocorrelation…
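To make the sampling scheme the abstract describes concrete, here is a minimal Python sketch of generic rejection sampling. It is not the authors' algorithm (which constructs tractable envelopes for hierarchical posteriors); it only illustrates the property the abstract highlights: every accepted draw is independent, so there is no burn-in, no convergence diagnosis, and batches can be generated in parallel. The target and proposal densities are toy choices.

import numpy as np

def rejection_sample(n_draws, rng):
    # Toy target: unnormalized N(0, 1) density p(x) = exp(-x^2 / 2).
    # Proposal: q = N(0, 2^2). The envelope constant M = 2 * sqrt(2 * pi)
    # satisfies p(x) <= M * q(x), so log[p(x) / (M q(x))] = -3 x^2 / 8.
    draws = []
    while len(draws) < n_draws:
        x = rng.normal(scale=2.0)                        # propose from q
        if np.log(rng.uniform()) < -3.0 * x**2 / 8.0:    # accept with prob p / (M q)
            draws.append(x)
    return np.array(draws)

rng = np.random.default_rng(0)
draws = rejection_sample(10_000, rng)
# Accepted draws are i.i.d. from the target: no burn-in, no autocorrelation,
# and separate calls can run in parallel and simply be concatenated.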

Cited by 9 publications (11 citation statements)
References 41 publications (49 reference statements)

“…Here, the idea is to develop a (quadratic) approximation to the posterior distribution, the mode of which can be derived in closed form. Another method that promises to speed up the computations of MCMC is scalable rejection sampling (Braun and Damien 2015), which relies on tractable stochastic approximations to the posterior distribution (rather than deterministic approximations, as in variational inference). Taken together, these developments make MCMC estimation of hierarchical models on big data increasingly feasible.…”
Section: Volume, Variety, Velocity: Implications for Big Data Analytics (mentioning; confidence: 99%)
“…Barajas et al. (2016) develop new methods that separate the target selection component and the campaign effect of online display ads for millions of users. Braun and Damien (2016) demonstrate how to scale rejection sampling for large hierarchical Bayes models.…”
(mentioning; confidence: 99%)
“…That is, each shard-level HMC algorithm would have a much smaller parameter space to explore. Braun and Damien's (2016) rejection sampler is a direct sampling algorithm that avoids the inherent convergence issues associated with MCMC. The authors propose running parallel instances of the algorithm across multiple cores of a machine, each core with the full data set in memory.…”
Section: Related Literature (mentioning; confidence: 99%)
“…However, for extremely large data sets this approach may overwhelm the limited amount of memory available. We propose combining the approaches of Braun and Damien (2016) and the first stage of the proposed algorithm: we avoid the burn-in period associated with an MCMC-based first stage, and we make more efficient use of a machine's limited amount of memory by partitioning the data into shards.…”
Section: Related Literature (mentioning; confidence: 99%)