2011
DOI: 10.1007/s11222-011-9271-y

An adaptive sequential Monte Carlo method for approximate Bayesian computation

Abstract: Approximate Bayesian computation (ABC) is a popular approach to address inference problems where the likelihood function is intractable, or expensive to calculate. To improve over Markov chain Monte Carlo (MCMC) implementations of ABC, the use of sequential Monte Carlo (SMC) methods has recently been suggested. Effective SMC algorithms that are currently available for ABC have a computational complexity that is quadratic in the number of Monte Carlo samples [4,17,19,21] and require the careful choice of simula…
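
For orientation, here is a minimal sketch of plain rejection ABC, the baseline that the MCMC and SMC variants mentioned in the abstract aim to improve on. It is not the adaptive SMC algorithm of the paper; the toy Gaussian model, the uniform prior, the sample-mean summary statistic, and the tolerance eps are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup (illustrative assumptions): observed data from a Gaussian with
# unknown mean theta, a uniform prior on theta, sample mean as summary statistic.
y_obs = rng.normal(loc=1.5, scale=1.0, size=100)
s_obs = y_obs.mean()

def simulate(theta, size=100):
    """Forward model: generate a synthetic dataset for a given parameter theta."""
    return rng.normal(loc=theta, scale=1.0, size=size)

def rejection_abc(n_samples, eps):
    """Keep prior draws whose simulated summary lies within eps of the observed one."""
    accepted = []
    while len(accepted) < n_samples:
        theta = rng.uniform(-5.0, 5.0)      # draw from the prior
        s_sim = simulate(theta).mean()      # summary of the simulated data
        if abs(s_sim - s_obs) <= eps:       # ABC accept/reject step
            accepted.append(theta)
    return np.array(accepted)

posterior_draws = rejection_abc(n_samples=500, eps=0.1)
print(posterior_draws.mean(), posterior_draws.std())
```

The quadratic-cost and tuning issues the abstract refers to arise when such samplers are chained into sequential schemes, which is what the paper's adaptive SMC method addresses.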

Cited by 432 publications (523 citation statements). References 25 publications.

“…We also filter the observable variable log RV5, using our procedure and the HAR-RV-J method, using as conditioning variables those of equation 25. This serves as a check on the reliability of the parametric model that is used to generate the long simulation which underlies the filtering and smoothing results, as the HAR-RV-J method does not rely on the parametric model.…”
Section: Monte Carlo Results For Filtering and Smoothing (mentioning)
confidence: 99%
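
The snippet above uses the HAR-RV-J method as a benchmark precisely because it does not rely on the parametric model. For context, a textbook HAR-RV-J regression explains next-period realized variance through its daily, weekly (5-day), and monthly (22-day) averages plus a jump component; the sketch below fits that generic form by OLS. The synthetic series and regressor choices are illustrative assumptions, not the citing paper's equation 25, whose conditioning variables are not shown here.

```python
import numpy as np

def har_rv_j_design(rv, jumps):
    """Build textbook HAR-RV-J regressors: daily, weekly (5-day), and monthly
    (22-day) averages of realized variance plus the jump component."""
    T = len(rv)
    rows, targets = [], []
    for t in range(22, T - 1):
        daily = rv[t]
        weekly = rv[t - 4:t + 1].mean()
        monthly = rv[t - 21:t + 1].mean()
        rows.append([1.0, daily, weekly, monthly, jumps[t]])
        targets.append(rv[t + 1])            # one-step-ahead target
    return np.array(rows), np.array(targets)

# Illustrative synthetic series standing in for log RV5 and a jump measure.
rng = np.random.default_rng(1)
log_rv5 = rng.normal(size=1000).cumsum() * 0.01
jump = np.abs(rng.normal(size=1000)) * 0.001

X, y = har_rv_j_design(log_rv5, jump)
beta, *_ = np.linalg.lstsq(X, y, rcond=None)  # OLS fit of the HAR-RV-J regression
print("HAR-RV-J coefficients:", beta)
```
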
“…w_t^(i) represents the importance of the associated sample x_t^(i) [23]. The samples x_t^(i) at time t are generated from the importance distribution η_t, which is constructed by a Markov transition kernel K and the previous distribution π_{t−1} [24]; each sample x_t^(i) evolves from x_{t−1}^(i) according to the kernel K, and the importance distribution η_t is represented by…”
Section: Sequential Monte Carlo For Graph Matching (mentioning)
confidence: 99%
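
The snippet is cut off just before the expression for η_t, but the construction it describes, an importance distribution obtained by moving the previous distribution through the Markov kernel K, is standard. Under the usual importance-sampling conventions, and not necessarily in the citing paper's exact notation, it can be written as:

```latex
\eta_t(x') = \int \pi_{t-1}(x)\, K(x, x')\, \mathrm{d}x,
\qquad
x_t^{(i)} \sim K\big(x_{t-1}^{(i)}, \cdot\,\big),
\qquad
w_t^{(i)} \propto \frac{\pi_t\big(x_t^{(i)}\big)}{\eta_t\big(x_t^{(i)}\big)}.
```
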
“…We therefore refer to this as the ABC-PMC algorithm. There is also a wider family of related ABC-SMC algorithms, including Sisson et al (2007) and Del Moral et al (2012), which update their particles in more complex ways based on MCMC moves, as described in Del Moral et al (2006). (c) Simulate dataset D* from the model using parameters θ*.…”
Section: ABC-PMC (mentioning)
confidence: 99%
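
As a companion to the ABC-PMC description above, here is a minimal sketch of a population Monte Carlo ABC loop of the kind commonly attributed to Beaumont et al. (2009): particles are resampled, perturbed with a Gaussian kernel, and reweighted by the prior over the kernel-mixture proposal, while the tolerance shrinks across generations. The toy model, prior, tolerance schedule, and kernel scale are illustrative assumptions, not the adaptive scheme of Del Moral et al. (2012).

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup (illustrative assumptions): Gaussian data with unknown mean theta,
# a N(0, 5^2) prior, and the sample mean as summary statistic.
y_obs = rng.normal(loc=2.0, scale=1.0, size=50)
s_obs = y_obs.mean()

def prior_pdf(theta):
    return np.exp(-0.5 * (theta / 5.0) ** 2) / (5.0 * np.sqrt(2.0 * np.pi))

def distance(theta):
    """Step (c) of the quoted scheme: simulate a dataset D* from the model using
    parameters theta*, then compare its summary with the observed one."""
    d_star = rng.normal(loc=theta, scale=1.0, size=50)
    return abs(d_star.mean() - s_obs)

def abc_pmc(n_particles=200, eps_schedule=(1.0, 0.5, 0.25, 0.1)):
    # Generation 0: plain rejection ABC from the prior at the loosest tolerance.
    thetas = np.empty(n_particles)
    for i in range(n_particles):
        while True:
            cand = rng.normal(0.0, 5.0)
            if distance(cand) <= eps_schedule[0]:
                thetas[i] = cand
                break
    weights = np.full(n_particles, 1.0 / n_particles)

    for eps in eps_schedule[1:]:
        # Perturbation scale heuristic: twice the weighted particle variance.
        tau = float(np.sqrt(2.0 * np.cov(thetas, aweights=weights)))
        new_thetas = np.empty(n_particles)
        new_weights = np.empty(n_particles)
        for i in range(n_particles):
            while True:
                pick = rng.choice(n_particles, p=weights)   # resample a particle
                cand = rng.normal(thetas[pick], tau)        # perturb with a Gaussian kernel
                if distance(cand) <= eps:
                    break
            new_thetas[i] = cand
            # PMC weight: prior density over the kernel-mixture proposal density.
            kern = np.exp(-0.5 * ((cand - thetas) / tau) ** 2) / (tau * np.sqrt(2.0 * np.pi))
            new_weights[i] = prior_pdf(cand) / np.sum(weights * kern)
        thetas = new_thetas
        weights = new_weights / new_weights.sum()

    return thetas, weights

thetas, weights = abc_pmc()
print("weighted posterior mean:", np.sum(weights * thetas))
```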