Monte Carlo and Quasi-Monte Carlo Methods 2008 (published 2009)
DOI: 10.1007/978-3-642-04107-5_9

Markov Chain Monte Carlo Algorithms: Theory and Practice

Abstract: We describe the importance and widespread use of Markov chain Monte Carlo (MCMC) algorithms, with an emphasis on the ways in which theoretical analysis can help with their practical implementation. In particular, we discuss how to achieve rigorous quantitative bounds on convergence to stationarity using the coupling method together with drift and minorisation conditions. We also discuss recent advances in the field of adaptive MCMC, where the computer iteratively selects from among many different MCMC…
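As a concrete illustration of the kind of algorithm the chapter analyses, here is a minimal random-walk Metropolis sampler targeting a standard normal density. This is an illustrative sketch, not code from the chapter; the target density, proposal scale, and function names are assumptions:

```python
import math
import random

def rwm(n_iter, step=1.0, x0=0.0, seed=42):
    """Random-walk Metropolis targeting the standard normal density N(0, 1)."""
    rng = random.Random(seed)
    log_pi = lambda x: -0.5 * x * x  # log-density, up to an additive constant
    x, chain = x0, []
    for _ in range(n_iter):
        y = x + rng.gauss(0.0, step)  # symmetric random-walk proposal
        # Accept with probability min(1, pi(y)/pi(x)), done on the log scale
        if math.log(rng.random()) < log_pi(y) - log_pi(x):
            x = y
        chain.append(x)
    return chain

chain = rwm(50_000)
mean = sum(chain) / len(chain)  # should be close to 0 for N(0, 1)
```

Rigorous bounds of the kind discussed in the abstract quantify how many such iterations are needed before the law of the chain is close to the target.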

Cited by 6 publications (5 citation statements); references 35 publications (42 reference statements).
“…It can be easily coded and is statistically efficient without a strenuous increase in computational expense. Roberts and Rosenthal (2007) and Rosenthal (2008) demonstrated that an algorithm can learn and approach an optimal algorithm by automatically tuning its scaling parameters to optimal values. After a sufficiently long adaptation period it will converge much faster than a non-adapted algorithm.…”
Section: Discussion
confidence: 99%
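The automatic tuning described in this excerpt can be sketched as follows: the proposal scale is adapted on the log scale toward a target acceptance rate (here 0.44, a common one-dimensional target), with adaptation amounts that diminish over time. All names and constants below are illustrative assumptions, not from the cited works:

```python
import math
import random

def adaptive_rwm(n_iter, target_accept=0.44, batch=50, seed=1):
    """RWM on N(0, 1) with diminishing adaptation of the proposal scale."""
    rng = random.Random(seed)
    log_pi = lambda x: -0.5 * x * x
    x, log_step, accepts = 0.0, 0.0, 0
    for i in range(1, n_iter + 1):
        y = x + rng.gauss(0.0, math.exp(log_step))
        if math.log(rng.random()) < log_pi(y) - log_pi(x):
            x, accepts = y, accepts + 1
        if i % batch == 0:  # adapt once per batch of iterations
            k = i // batch
            delta = min(0.1, 1.0 / math.sqrt(k))  # diminishing adaptation
            rate = accepts / batch
            log_step += delta if rate > target_accept else -delta
            accepts = 0
    return math.exp(log_step)  # the adapted proposal scale
```

The diminishing adaptation amounts are one standard way to preserve ergodicity despite the non-Markovian adaptation.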
“…After a sufficiently long adaptation period it will converge much faster than a non-adapted algorithm. However, this fast convergence may come with an increased risk that chains converge to the wrong values (Robert and Casella, 2004; Rosenthal, 2008). Moreover, since proposals are accepted using the history of the chain, the sampler is no longer Markovian and standard convergence techniques cannot be used.…”
Section: Discussion
confidence: 99%
“…We used parallel computation and task scheduling. Embarrassingly parallel algorithms, when implemented correctly, can easily improve the execution time of a single task. Monte Carlo simulations [47] and Mandelbrot sets (also known as fractals) [48] are examples of embarrassingly parallel algorithms. The ideal case of embarrassingly parallel algorithms [49] is that all subproblems/tasks are defined before the computations begin.…”
Section: Hyperparameter Optimization Using Parallel Processing With S...
confidence: 99%
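A minimal sketch of an embarrassingly parallel Monte Carlo simulation in the sense of this excerpt: every subtask is defined before computation begins, and the workers need no inter-task communication. The pi-estimation example and all names here are assumptions for illustration, not from the cited work:

```python
import random
from multiprocessing import Pool

def count_hits(args):
    """One independent subtask: count darts landing inside the unit circle."""
    n, seed = args
    rng = random.Random(seed)
    return sum(rng.random() ** 2 + rng.random() ** 2 <= 1.0 for _ in range(n))

if __name__ == "__main__":
    # All subtasks are defined up front, before any computation starts
    tasks = [(100_000, seed) for seed in range(8)]
    with Pool(4) as pool:
        hits = pool.map(count_hits, tasks)  # subtasks run fully independently
    total = sum(n for n, _ in tasks)
    pi_hat = 4.0 * sum(hits) / total  # Monte Carlo estimate of pi
    print(pi_hat)
```

Because the subtasks share no state, the speedup scales almost linearly with the number of worker processes.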
“…Specifically, we use Random Walk Metropolis (RWM) within Gibbs sampling for w (Craiu and Rosenthal, 2014; Rosenthal, 2009; Andrieu et al., 2003), while for f we use elliptical slice sampling (Murray et al., 2010), which was designed specifically for GP-based models and does not require tuning of free parameters.…”
Section: GP-SIM for Conditional Copula
confidence: 99%
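The RWM-within-Gibbs scheme mentioned in this excerpt can be sketched generically: each coordinate is updated in turn with its own random-walk Metropolis move while the other coordinates are held fixed. The toy bivariate normal target and all names below are illustrative assumptions, not the cited model:

```python
import math
import random

def log_pi(x, y, rho=0.8):
    """Log-density (up to a constant) of a bivariate normal with correlation rho."""
    return -(x * x - 2 * rho * x * y + y * y) / (2 * (1 - rho * rho))

def rwm_within_gibbs(n_iter, step=1.0, seed=7):
    rng = random.Random(seed)
    x = y = 0.0
    chain = []
    for _ in range(n_iter):
        # RWM update of x, holding y fixed
        xp = x + rng.gauss(0.0, step)
        if math.log(rng.random()) < log_pi(xp, y) - log_pi(x, y):
            x = xp
        # RWM update of y, holding x fixed
        yp = y + rng.gauss(0.0, step)
        if math.log(rng.random()) < log_pi(x, yp) - log_pi(x, y):
            y = yp
        chain.append((x, y))
    return chain

chain = rwm_within_gibbs(50_000)
corr_proxy = sum(a * b for a, b in chain) / len(chain)  # near rho = 0.8
```

Updating one block at a time only requires evaluating the full conditional (up to a constant), which is why the scheme pairs naturally with samplers like elliptical slice sampling for the remaining blocks.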