2019
DOI: 10.18637/jss.v088.c02
Getting Started with Particle Metropolis-Hastings for Inference in Nonlinear Dynamical Models

Abstract: This tutorial provides a gentle introduction to the particle Metropolis-Hastings (PMH) algorithm for parameter inference in nonlinear state-space models together with a software implementation in the statistical programming language R. We employ a step-by-step approach to develop an implementation of the PMH algorithm (and the particle filter within) together with the reader. This final implementation is also available as the package pmhtutorial from the Comprehensive R Archive Network (CRAN) repository. Throu…

Cited by 18 publications (24 citation statements)
References 63 publications
“…In this section we will provide some background on particle filters, Markov chain Monte Carlo (MCMC) and related methods. For a more elaborate introduction, please refer to, e.g., Schön et al (2018); Dahlin and Schön (2016); Robert and Casella (2004). We will in particular discuss why models of the form (1) are problematic for most existing methods, and also introduce the notion of tempering.…”
Section: Background On Particle Filtering and Tempering
Mentioning confidence: 99%
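Tempering, mentioned in the excerpt above, replaces the target density with a flattened version π(θ)^β for β ∈ (0, 1], so a sampler can move between well-separated modes more easily. A minimal sketch (the bimodal toy target below is an invented illustration, not taken from the cited works):

```python
import math

def log_tempered(log_target, beta):
    """Tempered log-density: beta * log pi(theta), with beta in (0, 1].

    Small beta flattens the target, shrinking the log-density gap
    between modes and the valleys separating them."""
    return lambda theta: beta * log_target(theta)

# Toy bimodal target with modes at -2 and +2 (illustrative only).
def log_pi(theta):
    return math.log(0.5 * math.exp(-(theta - 2.0) ** 2)
                    + 0.5 * math.exp(-(theta + 2.0) ** 2))

flat = log_tempered(log_pi, 0.1)
# The mode-to-valley log-density gap shrinks by exactly the factor beta.
print(round(log_pi(2.0) - log_pi(0.0), 2), round(flat(2.0) - flat(0.0), 2))  # → 3.31 0.33
```

A tempering scheme would sample from a sequence of such targets, increasing β from near 0 back to 1.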
“…The settings of PMH can indeed be optimized by using more clever proposals than random walks (see, e.g., Dahlin and Schön (2016) for an overview) and methods for reducing the variance of z (such as the adapted or bridging particle filters of Del Moral and Murray (2015); Pitt and Shephard (1999); Del ). Such adaptation would indeed push the performance further.…”
Section: A More Challenging Nonlinear Example
Mentioning confidence: 99%
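As a toy illustration of a proposal that is more clever than a fixed random walk, the step size can be adapted from the chain's own history. This is a generic adaptive-Metropolis-style heuristic for a scalar parameter, not one of the specific methods cited above; the function, `base` default, and fallback rule are assumptions for illustration (the 2.38 factor is the classic optimal-scaling constant):

```python
import random
import statistics

def adapted_step(history, base=0.1, scale=2.38):
    """Random-walk step size adapted to the chain's empirical spread.

    Uses the classic 2.38 * std scaling for a scalar target; falls back
    to a fixed base step while the history is too short to estimate std."""
    if len(history) < 10:
        return base
    return scale * statistics.stdev(history)

random.seed(2)
history = [random.gauss(0.7, 0.2) for _ in range(100)]  # stand-in chain history
step = adapted_step(history)
proposal = history[-1] + random.gauss(0.0, step)  # adapted random-walk proposal
```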
“…First, the chains’ trace plots are stationary around specific values, showing high-quality, stable samples representing the posterior distribution [69] (S1 Fig, S2 Fig, S3 Fig, and S4 Fig). The second check examines the auto-correlation as an essential indicator of convergence [68]. We have minimal correlations between the samples and the previous ones (the correlation vanishes after the fourth lag).…”
Section: Results
Mentioning confidence: 99%
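The autocorrelation diagnostic described above can be computed directly from the chain. The sketch below uses a synthetic AR(1) chain as a stand-in for posterior samples; the model and lag values are illustrative assumptions, not taken from the quoted study:

```python
import random

def autocorrelation(chain, lag):
    """Sample autocorrelation of an MCMC chain at a given lag."""
    n = len(chain)
    mean = sum(chain) / n
    var = sum((x - mean) ** 2 for x in chain)
    cov = sum((chain[t] - mean) * (chain[t + lag] - mean)
              for t in range(n - lag))
    return cov / var

# Synthetic AR(1) chain: autocorrelation should decay roughly as 0.5**lag.
random.seed(1)
chain = [0.0]
for _ in range(5000):
    chain.append(0.5 * chain[-1] + random.gauss(0.0, 1.0))

for lag in (1, 2, 4, 8):
    print(lag, round(autocorrelation(chain, lag), 2))
```

A well-mixing chain shows autocorrelations that drop to near zero within a few lags, as reported in the excerpt.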
“…The parallel estimation of the latent states and the parameters was a challenge. The combined evaluation runs as two loops: an outer loop for parameter estimation, and an inner loop for sequential Monte Carlo estimation of the latent states’ trajectory and the associated likelihood, based on the parameters from the outer loop (see Algorithm 2) [68].…”
Section: Particle Independent Metropolis-Hastings
Mentioning confidence: 99%
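The two-loop structure described above (an outer Metropolis-Hastings loop over the parameter, with an inner particle filter estimating the likelihood) can be sketched as follows. This is a generic PMH illustration on a toy linear-Gaussian model, not the implementation from the cited study; the model, flat prior, step size, and particle count are all assumptions for the example:

```python
import math
import random

random.seed(0)

def log_normal_pdf(x, mu, sigma):
    return (-0.5 * math.log(2.0 * math.pi * sigma ** 2)
            - (x - mu) ** 2 / (2.0 * sigma ** 2))

def particle_filter_loglik(y, phi, n_particles=100):
    """Inner loop: bootstrap particle filter estimating log p(y | phi)
    for the toy model x_t = phi*x_{t-1} + v_t, y_t = x_t + e_t, v,e ~ N(0,1)."""
    particles = [random.gauss(0.0, 1.0) for _ in range(n_particles)]
    loglik = 0.0
    for y_t in y:
        # Propagate particles through the state dynamics.
        particles = [phi * x + random.gauss(0.0, 1.0) for x in particles]
        # Log-weights from the observation density (log-sum-exp for stability).
        logw = [log_normal_pdf(y_t, x, 1.0) for x in particles]
        m = max(logw)
        if m < -700.0:  # all weights underflow: hopeless parameter value
            return float("-inf")
        weights = [math.exp(lw - m) for lw in logw]
        loglik += m + math.log(sum(weights) / n_particles)
        # Multinomial resampling.
        particles = random.choices(particles, weights=weights, k=n_particles)
    return loglik

def pmh(y, n_iters=200, step=0.1):
    """Outer loop: random-walk Metropolis-Hastings over phi (flat prior),
    plugging the particle filter's estimate into the acceptance ratio."""
    phi = 0.0
    ll = particle_filter_loglik(y, phi)
    chain = [phi]
    for _ in range(n_iters):
        phi_prop = phi + random.gauss(0.0, step)
        ll_prop = particle_filter_loglik(y, phi_prop)
        if math.log(random.random()) < ll_prop - ll:
            phi, ll = phi_prop, ll_prop  # accept
        chain.append(phi)
    return chain

# Simulate observations from the model with true phi = 0.7.
true_phi, x, y = 0.7, 0.0, []
for _ in range(50):
    x = true_phi * x + random.gauss(0.0, 1.0)
    y.append(x + random.gauss(0.0, 1.0))

chain = pmh(y)
print(round(sum(chain[100:]) / len(chain[100:]), 2))  # posterior mean, typically near true phi
```

The key point is that the inner loop returns an unbiased likelihood estimate, so using it in the standard MH acceptance ratio still targets the exact posterior (the pseudo-marginal argument).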
“…Joint static and dynamic parameter learning is achieved by embedding the Metropolis-Hastings algorithm within an Unscented Kalman Filtering framework. A general description of the framework is presented in 7 . The framework is fully Bayesian, as it yields the posterior probability of the static and dynamic parameters, which is unique to the method and allows for measurement uncertainty quantification.…”
Section: Theoretical Framework
Mentioning confidence: 99%