2014
DOI: 10.1080/03610918.2013.777455
Understanding the Hastings Algorithm

Abstract: The Hastings algorithm is a key tool in computational science. While mathematically justified by detailed balance, it can be conceptually difficult to grasp. Here, we present two complementary and intuitive ways to derive and understand the algorithm. In our framework, it is straightforward to see that the celebrated Metropolis-Hastings algorithm has the highest acceptance probability of all Hastings algorithms.
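As a point of reference for the abstract, the Metropolis–Hastings acceptance rule can be sketched as follows. This is a minimal illustrative implementation, not code from the paper; the standard-normal target and random-walk proposal are hypothetical choices, under which the Hastings ratio reduces to the familiar Metropolis ratio:

```python
import math
import random

def metropolis_hastings(log_target, x0, n_steps, step=1.0):
    """Random-walk Metropolis-Hastings sampler.

    With a symmetric proposal, the Hastings acceptance probability
    min(1, pi(x') q(x|x') / (pi(x) q(x'|x))) reduces to
    min(1, pi(x') / pi(x)), evaluated here in log space.
    """
    x = x0
    samples = []
    for _ in range(n_steps):
        x_prop = x + random.gauss(0.0, step)  # symmetric proposal
        log_alpha = log_target(x_prop) - log_target(x)
        if math.log(random.random()) < log_alpha:
            x = x_prop  # accept the trial move
        samples.append(x)  # on rejection, the current value repeats
    return samples

# Hypothetical example: sample a standard normal, pi(x) ∝ exp(-x^2 / 2)
random.seed(0)
draws = metropolis_hastings(lambda x: -0.5 * x * x, x0=0.0, n_steps=5000)
mean = sum(draws) / len(draws)
```

The sample mean of `draws` should be close to the target mean of zero, up to Monte Carlo error.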

Cited by 24 publications (10 citation statements)
References 13 publications
“…Parameters were updated by sequential Gibbs sampling. In sequential Gibbs sampling, one parameter is updated at a time using the Metropolis-Hastings algorithm [40, 41]: For each single parameter, a proposal is drawn from a normal distribution centered at the current value, with a scale of unity for ΔH, ΔG, and ΔH0, or the initial guess value for σ, [L]s, and [R]0; the trial move is accepted or rejected according to the Metropolis criterion. If it is accepted, the next value in the Markov chain is the trial move.…”
Section: Methods
confidence: 99%
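The single-parameter update scheme quoted above (often called Metropolis-within-Gibbs) can be sketched as follows. This is an illustrative sketch only; the two-parameter Gaussian target stands in for the cited paper's thermodynamic model, whose parameters and data are not reproduced here:

```python
import math
import random

def metropolis_within_gibbs(log_target, theta0, n_sweeps, scales):
    """One sweep updates each parameter in turn: propose from a normal
    centered at the current value, accept via the Metropolis criterion."""
    theta = list(theta0)
    chain = []
    for _ in range(n_sweeps):
        for i in range(len(theta)):
            proposal = list(theta)
            proposal[i] = theta[i] + random.gauss(0.0, scales[i])
            log_alpha = log_target(proposal) - log_target(theta)
            if math.log(random.random()) < log_alpha:
                theta = proposal  # accept the single-parameter trial move
        chain.append(list(theta))
    return chain

# Hypothetical target: independent normals with sd 1 and sd 2
def log_target(t):
    return -0.5 * t[0] ** 2 - 0.5 * (t[1] / 2.0) ** 2

random.seed(1)
chain = metropolis_within_gibbs(log_target, [0.0, 0.0], 4000,
                                scales=[1.0, 2.0])
```

Each coordinate's marginal chain should center on zero, matching the hypothetical target.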
“…A well-known class of methods is based on Markov chain Monte Carlo (MCMC), where the aim is to define an iterative process whose stationary distribution coincides with the target distribution, which in Bayesian inversion is the posterior. MCMC techniques come in many variants, and one common variant is MCMC sampling with Metropolis-Hastings dynamics (Minh and Le Minh 2015), which generates a Markov chain with equilibrium distribution that coincides with the posterior in the limit. Other variants use Gibbs sampling, which reduces the autocorrelation between samples.…”
Section: Computational Feasibility
confidence: 99%
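The claim in the statement above, that a Metropolis-Hastings chain's equilibrium distribution coincides with the target, can be checked directly on a small discrete state space, where the full transition matrix is explicit. This sketch is a hypothetical illustration and is not tied to any software cited here:

```python
# Verify that the target pi is stationary for a discrete
# Metropolis-Hastings chain with a uniform proposal over 3 states.
pi = [0.2, 0.3, 0.5]   # hypothetical target distribution
n = len(pi)
q = 1.0 / n            # symmetric uniform proposal q(i -> j)

# Build the MH transition matrix P: off-diagonal moves are proposed
# with probability q and accepted with probability min(1, pi_j / pi_i);
# the rejected mass stays on the diagonal.
P = [[0.0] * n for _ in range(n)]
for i in range(n):
    for j in range(n):
        if i != j:
            P[i][j] = q * min(1.0, pi[j] / pi[i])
    P[i][i] = 1.0 - sum(P[i])

# Detailed balance pi_i P[i][j] = pi_j P[j][i] implies pi P = pi.
pi_next = [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]
```

Because `pi_i * q * min(1, pi_j / pi_i) = q * min(pi_i, pi_j)` is symmetric in `i` and `j`, detailed balance holds term by term, so `pi_next` reproduces `pi`.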
“…We perform the analysis with the software BAT v1.1.0-DEV [21], which internally uses CUBA [31] v4.2 for the integration of multi-dimensional probabilities and the Metropolis-Hastings algorithm [32] for the fit. The computation time depends on the number of samples drawn from the considered probability distribution.…”
Section: Fit Procedures
confidence: 99%