2014
DOI: 10.1073/pnas.1408184111

A general construction for parallelizing Metropolis−Hastings algorithms

Abstract: Markov chain Monte Carlo methods (MCMC) are essential tools for solving many modern-day statistical and computational problems; however, a major limitation is the inherently sequential nature of these algorithms. In this paper, we propose a natural generalization of the Metropolis−Hastings algorithm that allows for parallelizing a single chain using existing MCMC methods. We do so by proposing multiple points in parallel, then constructing and sampling from a finite-state Markov chain on the proposed points su…
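
The abstract sketches the core idea: propose several points at once, then construct and sample a finite-state Markov chain over the pooled points. The snippet below is a minimal illustrative sketch of that construction, not the paper's exact algorithm: it assumes candidates drawn independently from a Gaussian random-walk kernel around the current state and stationary weights over the pool proportional to π(x_i)·∏_{j≠i} κ(x_j | x_i); the names (`parallel_mh_step`, `log_target`) and the step size are placeholders.

```python
import numpy as np

def parallel_mh_step(x_curr, log_target, n_prop=4, step=0.5, n_draws=4, rng=None):
    """One iteration of a multiple-proposal MH sketch (illustrative only).

    Draws n_prop candidates around the current state, builds a weight for
    every point in the pool {current} ∪ {candidates}, and samples n_draws
    (correlated) states from the resulting finite-state chain. Assumes a
    Gaussian random-walk kernel; log_target returns log π(x).
    """
    rng = np.random.default_rng() if rng is None else rng
    x_curr = np.atleast_1d(np.asarray(x_curr, dtype=float))
    d = x_curr.size

    # Pool = current state plus n_prop proposals drawn around it.
    pool = np.vstack([x_curr, x_curr + step * rng.standard_normal((n_prop, d))])

    # log κ(x_j | x_i) for the Gaussian random-walk kernel (symmetric here).
    diff = pool[None, :, :] - pool[:, None, :]
    log_kappa = -0.5 * np.sum(diff**2, axis=-1) / step**2

    # Assumed stationary weights of the finite-state chain:
    # log w_i = log π(x_i) + Σ_{j≠i} log κ(x_j | x_i).
    log_pi = np.array([log_target(x) for x in pool])
    log_w = log_pi + log_kappa.sum(axis=1) - np.diag(log_kappa)

    # Sample indices from the normalized weights (a Gibbs step on the index).
    w = np.exp(log_w - log_w.max())
    w /= w.sum()
    idx = rng.choice(len(pool), size=n_draws, p=w)
    return pool[idx], pool[idx[-1]]  # batch of samples, next "current" state


# Usage: sample a standard normal target, keeping 4 draws per iteration.
log_target = lambda x: -0.5 * float(x @ x)
state, samples = np.zeros(1), []
rng = np.random.default_rng(0)
for _ in range(1000):
    batch, state = parallel_mh_step(state, log_target, rng=rng)
    samples.append(batch)
samples = np.concatenate(samples)
```

Because all candidate evaluations of log π are independent given the current state, they can be distributed across workers, which is the source of the parallel speed-up described in the abstract.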

Cited by 117 publications (139 citation statements) | References: 35 publications
“…, θ^(N) (a similar approach was proposed in [20]). Each resampled candidate is then tested as a possible future state of one chain.…”
Section: Reusing Candidates in Parallel I-MTM Chains
confidence: 99%
“…In this work, we provide an exhaustive review of more sophisticated MCMC methods that, at each iteration, consider different candidates as possible new states of the chain. More specifically, at each iteration different samples are compared by certain weights and then one of them is selected as a possible future state. The main advantage of these algorithms is that they foster the exploration of a larger portion of the sample space, decreasing the correlation among the states of the generated chain.…”
Section: Introduction
confidence: 99%
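
The excerpt above describes the common mechanism of these methods: several candidates are weighted and one of them is selected as the possible next state. A minimal sketch of just that weighted-selection step is shown below, assuming importance weights w_i ∝ π(y_i)/κ(y_i | x); the selected point would still be subject to the algorithm's accept/reject correction (e.g., in multiple-try Metropolis), which is omitted here, and all names are illustrative.

```python
import numpy as np

def select_candidate(x_curr, log_target, propose, log_kappa, n_cand, rng):
    """Weighted selection among multiple candidates (illustrative only).

    Draws n_cand candidates y_i, forms importance weights
    w_i ∝ π(y_i) / κ(y_i | x_curr), and picks one as the *possible*
    next state; the full algorithm's accept/reject step is omitted.
    """
    cands = [propose(x_curr, rng) for _ in range(n_cand)]
    log_w = np.array([log_target(y) - log_kappa(y, x_curr) for y in cands])
    w = np.exp(log_w - log_w.max())
    w /= w.sum()
    pick = rng.choice(n_cand, p=w)
    return cands[pick], w


# Usage with a toy 1-D Gaussian target and random-walk proposal.
rng = np.random.default_rng(0)
log_target = lambda y: -0.5 * y**2
propose = lambda x, rng: x + rng.standard_normal()
log_kappa = lambda y, x: -0.5 * (y - x)**2
y_star, weights = select_candidate(0.0, log_target, propose, log_kappa, 5, rng)
```
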
“…General strategies for parallel MCMC such as the multiple-proposal MH algorithm (Calderhead, 2014) and population MCMC (Song, Wu and Liang, 2014) mostly require full data at each node.…”
Section: Methods
confidence: 99%
“…where α = α/N + b_n and β = β(N−1)/N + 1 − b_n. Once the weights q_n for all biomolecules are updated from the above procedure, we update the loads b_n by sampling from the corresponding conditional distribution P({b_n}_n | D, µ_part, {q_n, x_n, y_n, z_n}_n, ∆t) using a Metropolis-Hastings algorithm [17,21]. For this, we use a proposal distribution of the form b_n^new ∼ Bernoulli(q_n).…”
Section: Sampling Biomolecule Loads
confidence: 99%
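
The last excerpt describes a Metropolis-Hastings update of a binary load b_n with an independent Bernoulli(q_n) proposal. The sketch below shows only the generic accept/reject logic for such a move; `log_cond` stands in for the log conditional posterior of the cited model and is a placeholder, as is the toy usage at the end.

```python
import numpy as np

def update_load(b_curr, q_n, log_cond, rng):
    """MH update of a binary load with an independent Bernoulli(q_n) proposal.

    log_cond(b) is a placeholder for the log conditional posterior of the
    load given the data and all other variables in the cited model.
    """
    b_prop = int(rng.random() < q_n)            # b_new ~ Bernoulli(q_n)
    if b_prop == b_curr:                        # proposal equals current state
        return b_curr

    def log_q(b):                               # log proposal mass of Bernoulli(q_n)
        return np.log(q_n) if b == 1 else np.log1p(-q_n)

    # Independence-proposal MH ratio: target ratio times reverse/forward proposal ratio.
    log_alpha = (log_cond(b_prop) - log_cond(b_curr)) + (log_q(b_curr) - log_q(b_prop))
    return b_prop if np.log(rng.random()) < log_alpha else b_curr


# Usage with a toy log conditional that favors b = 1.
rng = np.random.default_rng(1)
log_cond = lambda b: 0.0 if b == 1 else -2.0
b = 0
for _ in range(100):
    b = update_load(b, q_n=0.3, log_cond=log_cond, rng=rng)
```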