2007
DOI: 10.1007/s11222-007-9022-2

Parallelizing MCMC for Bayesian spatiotemporal geostatistical models

Abstract: When MCMC methods for Bayesian spatiotemporal modeling are applied to large geostatistical problems, challenges arise as a consequence of memory requirements, computing costs, and convergence monitoring. This article describes the parallelization of a reparametrized and marginalized posterior sampling (RAMPS) algorithm, which is carefully designed to generate posterior samples efficiently. The algorithm is implemented using the Parallel Linear Algebra Package (PLAPACK). The scalability of the algorithm is inve…

Cited by 44 publications (28 citation statements) · References 25 publications
“…Blackford et al., 1996) using OpenMP or MPI. Yan et al. (2007) demonstrate that PLAPACK-based block matrix algorithms provide similar factors of improvement to the OpenMP results described in this article. As the focus of this paper is primarily on software which runs on a single physical machine (with a few processors/cores), we have not run simulations which use MPI.…”
supporting
confidence: 64%
“…The slice sampler (Damien et al., 1999; Mira and Tierney, 2002; Neal, 1997, 2003a) has been proposed as an easily implemented method for constructing an MCMC algorithm and can, in many circumstances, result in samplers with good mixing properties. Slice sampling has been used in many contexts, for example in spatial models (Agarwal and Gelfand, 2005; Yan et al., 2007), in biological models (Lewis et al., 2005; Shahbaba and Neal, 2006; Sun et al., 2007), variable selection (Kinney and Dunson, 2007; Nott and Leonte, 2004), and machine learning (Andrieu et al., 2003; Kovac, 2005; Mackay, 2002). Slice samplers can adapt to local characteristics of the distribution, which can make them easier to tune than Metropolis-Hastings approaches.…”
Section: Introduction
mentioning
confidence: 99%
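The slice sampler mentioned in the excerpt above can be illustrated with a minimal sketch. This is a generic univariate slice sampler with the stepping-out and shrinkage procedure of Neal (2003), not the RAMPS algorithm of the cited paper; the function names and parameters (`slice_sample`, the step width `w`, `max_steps`) are illustrative choices, not from any of the cited works.

```python
import math
import random

def slice_sample(logpdf, x0, n_samples, w=1.0, max_steps=50):
    """Univariate slice sampler with stepping-out and shrinkage (Neal, 2003).

    logpdf: log density, up to an additive constant.
    w: initial interval width for stepping out.
    """
    samples = []
    x = x0
    for _ in range(n_samples):
        # Draw the auxiliary slice level y ~ Uniform(0, f(x)),
        # working in log space for numerical stability.
        log_y = logpdf(x) + math.log(random.random())
        # Step out: grow [L, R] until both ends fall below the slice.
        L = x - w * random.random()
        R = L + w
        steps = max_steps
        while steps > 0 and logpdf(L) > log_y:
            L -= w
            steps -= 1
        steps = max_steps
        while steps > 0 and logpdf(R) > log_y:
            R += w
            steps -= 1
        # Shrinkage: sample uniformly on [L, R], shrinking on rejection.
        while True:
            x1 = L + (R - L) * random.random()
            if logpdf(x1) > log_y:
                x = x1
                break
            if x1 < x:
                L = x1
            else:
                R = x1
        samples.append(x)
    return samples

# Example: sample from a standard normal (log density up to a constant).
random.seed(0)
draws = slice_sample(lambda x: -0.5 * x * x, 0.0, 5000)
mean = sum(draws) / len(draws)
```

The only tuning parameter is the step width `w`, and a poor choice costs extra stepping-out or shrinkage iterations rather than a degenerate acceptance rate, which is the sense in which slice samplers are "easier to tune" than Metropolis-Hastings.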
See 1 more Smart Citation
“…More recent work combines the use of multiple chains with adaptive MCMC in an attempt to use these multiple sources of information to learn an appropriate proposal distribution (12,13). Sometimes, specific MCMC algorithms are directly amenable to parallelization, such as independent Metropolis-Hastings (14) or slice sampling (15), as indeed are some statistical models via careful reparameterization (16) or implementation on specialist hardware, such as graphics processing units (GPUs) (17,18); however, these approaches are often problem specific and not generally applicable. For problems involving large amounts of data, parallelization may in some cases also be possible by partitioning the data and analyzing each subset using standard MCMC methods simultaneously on multiple machines (19).…”
mentioning
confidence: 99%
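The simplest parallelization mentioned above, running multiple independent chains at once, can be sketched as follows. This is a generic embarrassingly-parallel setup with a toy random-walk Metropolis kernel, not the within-chain linear-algebra parallelism of the RAMPS/PLAPACK approach; all names (`metropolis_chain`, the target, the seeds) are illustrative.

```python
import math
import random
from multiprocessing import Pool

def metropolis_chain(args):
    """One independent random-walk Metropolis chain on a standard normal target."""
    seed, n = args
    rng = random.Random(seed)

    def logpdf(v):
        return -0.5 * v * v  # N(0, 1) up to a constant

    x = 0.0
    out = []
    for _ in range(n):
        prop = x + rng.gauss(0.0, 1.0)
        # Accept with probability min(1, f(prop)/f(x)), in log space.
        if math.log(rng.random()) < logpdf(prop) - logpdf(x):
            x = prop
        out.append(x)
    return out

if __name__ == "__main__":
    # Four chains with distinct seeds, one worker process each.
    with Pool(4) as pool:
        chains = pool.map(metropolis_chain, [(s, 2000) for s in range(4)])
    pooled = [v for c in chains for v in c]
    print(len(pooled))  # 4 chains x 2000 draws = 8000
```

Because the chains never communicate, speedup is near-linear in the number of workers, but each chain still pays its own burn-in, which is one reason the excerpt's adaptive multiple-chain schemes try to share information instead.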
“…They are widely used in statistical applications, ranging from machine learning [1], [2], [3], statistical physics [4], [5] and geostatistics [6] to medical imaging [7], genetics [8], phylogenetics [9], computational biology [10], [11] and stochastic optimization [12], [13]. Sampling from a distribution is fundamental in these applications because we are typically interested in performing the following task:…”
Section: Introduction
mentioning
confidence: 99%