2010
DOI: 10.1007/s11222-010-9178-z

Parallel multivariate slice sampling

Abstract: Slice sampling provides an easily implemented method for constructing a Markov chain Monte Carlo (MCMC) algorithm. However, slice sampling has two major drawbacks: (i) it requires repeated evaluation of likelihoods for each update, which can make it impractical when evaluations are expensive or as the number of evaluations grows (geometrically) with the dimension of the slice sampler, and (ii) since it can be challenging to construct multivariate updates, the updates are typically univariate, which often resul…
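For reference, the univariate update that drawback (ii) refers to can be sketched as follows. This is a minimal, illustrative Python implementation of the stepping-out and shrinkage procedure (Neal, 2003), not code from the paper; the function and parameter names are my own:

```python
import math
import random

def slice_sample(logpdf, x0, w=1.0, n_samples=1000, max_steps=50, seed=0):
    """Univariate slice sampler using stepping-out and shrinkage.

    logpdf: log-density (up to an additive constant); x0: starting point;
    w: initial interval width; max_steps: cap on stepping-out expansions.
    """
    rng = random.Random(seed)
    x = x0
    samples = []
    for _ in range(n_samples):
        # Auxiliary variable: a uniform height under the density, in log space.
        log_y = logpdf(x) + math.log(rng.random())
        # Stepping out: randomly position an interval of width w around x,
        # then expand each end until it falls outside the slice.
        left = x - w * rng.random()
        right = left + w
        steps = max_steps
        while steps > 0 and logpdf(left) > log_y:
            left -= w
            steps -= 1
        steps = max_steps
        while steps > 0 and logpdf(right) > log_y:
            right += w
            steps -= 1
        # Shrinkage: sample uniformly from the interval; on rejection,
        # shrink the interval toward the current point and retry.
        while True:
            x_new = left + (right - left) * rng.random()
            if logpdf(x_new) > log_y:
                x = x_new
                break
            if x_new < x:
                left = x_new
            else:
                right = x_new
        samples.append(x)
    return samples
```

Note that each iteration costs several `logpdf` evaluations (stepping out plus shrinkage rejections) — the per-update cost that drawback (i) refers to.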

Cited by 28 publications (32 citation statements)
References 33 publications
“…More recent work combines the use of multiple chains with adaptive MCMC in an attempt to use these multiple sources of information to learn an appropriate proposal distribution (12,13). Sometimes, specific MCMC algorithms are directly amenable to parallelization, such as independent Metropolis−Hastings (14) or slice sampling (15), as indeed are some statistical models via careful reparameterization (16) or implementation on specialist hardware, such as graphics processing units (GPUs) (17,18); however, these approaches are often problem specific and not generally applicable. For problems involving large amounts of data, parallelization may in some cases also be possible by partitioning the data and analyzing each subset using standard MCMC methods simultaneously on multiple machines (19).…”
mentioning
confidence: 99%
“…Nevertheless, parallel computing can be used to accelerate some calculations inside the MCMC steps, for example, those involved in likelihood evaluations (Tibbits et al, 2011 …”
Section: Discussion
mentioning
confidence: 99%
“…According to the results, by using this method, one can generate approximately independent and identically distributed samples at a rate more efficient than other methods that update all dimensions at once. Additionally, Tibbits et al [40] propose an approach to the multivariate slice sampler that naturally lends itself to a parallel implementation. They study approaches for constructing a multivariate slice sampler, and they show how parallel computing can be useful for making MCMC algorithms computationally efficient.…”
Section: Slice Sampler
mentioning
confidence: 99%
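The parallelization idea the excerpt describes can be sketched roughly as follows — a hedged illustration, not the authors' implementation. Candidates for a hyperrectangle slice update are drawn in speculative batches and their log-densities evaluated concurrently; a speculative draw is used only if it still lies inside the hyperrectangle after earlier rejections have shrunk it (a uniform draw conditioned on the smaller rectangle remains uniform there, so the shrinkage procedure stays valid). All names are hypothetical, and a `ThreadPoolExecutor` stands in for real parallel hardware; for genuinely expensive likelihoods a process pool would be the natural substitute:

```python
import math
import random
from concurrent.futures import ThreadPoolExecutor

def mv_slice_step(logpdf, x, widths, rng, pool, batch=4):
    """One multivariate slice update with hyperrectangle shrinkage.

    A batch of candidate points is drawn up front and their log-densities
    evaluated in parallel; each candidate is consumed in order and used only
    if it still lies inside the (possibly shrunk) hyperrectangle.
    """
    d = len(x)
    log_y = logpdf(x) + math.log(rng.random())
    # Randomly position a hyperrectangle of the given widths around x.
    lo = [x[i] - widths[i] * rng.random() for i in range(d)]
    hi = [lo[i] + widths[i] for i in range(d)]
    while True:
        cands = [[lo[i] + (hi[i] - lo[i]) * rng.random() for i in range(d)]
                 for _ in range(batch)]
        logps = list(pool.map(logpdf, cands))  # parallel likelihood evaluations
        for c, lp in zip(cands, logps):
            if not all(lo[i] <= c[i] <= hi[i] for i in range(d)):
                continue  # speculative draw invalidated by earlier shrinkage
            if lp > log_y:
                return c  # candidate lies in the slice: accept it
            # Rejected: shrink each axis of the hyperrectangle toward x.
            for i in range(d):
                if c[i] < x[i]:
                    lo[i] = c[i]
                else:
                    hi[i] = c[i]
```

A usage sketch: with `logpdf = lambda v: -0.5 * (v[0]**2 + v[1]**2)` (a standard bivariate normal) and `pool = ThreadPoolExecutor(max_workers=4)`, repeatedly calling `x = mv_slice_step(logpdf, x, [3.0, 3.0], rng, pool)` produces a Markov chain targeting that density, with up to `batch` likelihood evaluations per round issued concurrently.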
“…They study approaches for constructing a multivariate slice sampler, and they show how parallel computing can be useful for making MCMC algorithms computationally efficient. Tibbits et al [40] examine various implementations of their algorithm in the context of real and simulated data. Moreover, Kalli et al [41] present a more efficient version of the slice sampler for Dirichlet process mixture models described by Walker [37].…”
Section: Slice Sampler
mentioning
confidence: 99%