Robust Approximate Bayesian Inference with Synthetic Likelihood
Preprint, 2019
DOI: 10.48550/arxiv.1904.04551

Abstract: Bayesian synthetic likelihood (BSL) is now an established method for conducting approximate Bayesian inference in models where, due to the intractability of the likelihood function, exact Bayesian approaches are either infeasible or computationally too demanding. Similar to other approximate Bayesian methods, such as the method of approximate Bayesian computation, implicit in the application of BSL is the assumption that the data generating process (DGP) can produce simulated summary statistics that capture th…
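The BSL idea the abstract summarises can be sketched as follows. This is a generic Gaussian synthetic-likelihood estimator, not the authors' implementation; the function name and the `simulate` interface are our assumptions for illustration:

```python
import numpy as np

def log_synthetic_likelihood(s_obs, simulate, theta, n_sim=200, rng=None):
    """Gaussian synthetic likelihood estimate (a standard BSL sketch,
    not the paper's exact implementation).

    s_obs    : observed summary statistic vector, shape (d,)
    simulate : user-supplied function (theta, rng) -> one simulated
               summary statistic vector, shape (d,)
    """
    rng = np.random.default_rng(rng)
    # Simulate n_sim summary statistics from the model at theta.
    S = np.array([simulate(theta, rng) for _ in range(n_sim)])  # (n_sim, d)
    mu = S.mean(axis=0)                          # sample mean of summaries
    Sigma = np.atleast_2d(np.cov(S, rowvar=False))  # sample covariance
    # Multivariate normal log-density of s_obs under N(mu, Sigma).
    d = len(s_obs)
    diff = s_obs - mu
    _, logdet = np.linalg.slogdet(Sigma)
    quad = diff @ np.linalg.solve(Sigma, diff)
    return -0.5 * (d * np.log(2 * np.pi) + logdet + quad)
```

In a BSL MCMC scheme this estimate replaces the intractable log-likelihood inside a Metropolis-Hastings acceptance ratio; the implicit Gaussian assumption on the summaries is exactly what the robustness extensions discussed below relax.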

Cited by 7 publications (15 citation statements); references 16 publications (37 reference statements).
“…For example, An et al (2019a) develop a semi-parametric synthetic likelihood, which is more robust to departures from normality. Further, Frazier and Drovandi (2019) develop synthetic likelihood methods that are more robust when there is incompatibility between the model and observed summary statistic. An alternative extension could involve a method for automatically finding a whitening transformation that minimises the loss of accuracy compared to standard BSL.…”
Section: Discussion (mentioning)
confidence: 99%
“…Recently, asymptotic properties of BSL have been derived under various assumptions. Frazier and Drovandi (2019) develop a result for posterior concentration and Nott et al (2019) show that the BSL posterior mean is consistent and asymptotically normally distributed.…”
Section: ∝ P(y|θ)p(θ) (mentioning)
confidence: 98%
“…In a future release we plan to incorporate the methods of Frazier and Drovandi (2019), which allow BSL to be much more computationally efficient when the model is misspecified, in particular when the model is unable to recover the observed value of the summary statistic. All of the methods in our BSL package use MCMC to explore the parameter space, and can thus be slow when model simulation is computationally intensive and there is a large number of parameters.…”
Section: Selecting the Penalty Parameter for Shrinkage (mentioning)
confidence: 99%
“…An additional adjustment approach is considered that weights the individual summaries used in the analysis in such a way that if the simulated and observed summaries do not agree, the overall distance can still be made small. This new adjustment approach to ABC is inspired by the adjustment idea in Frazier and Drovandi (2019). In the context of Bayesian synthetic likelihood (BSL; Wood, 2010; Price et al., 2018), Frazier and Drovandi (2019) demonstrate that when the model is misspecified BSL can deliver misleading inference. To circumvent this issue, Frazier and Drovandi (2019) augment the BSL posterior with additional parameters that "soak up" the model misspecification.…”
Section: Introduction (mentioning)
confidence: 99%
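The "soak up" adjustment described in the citation statements above can be illustrated with a minimal sketch. We assume a mean-adjustment variant in which extra parameters shift the synthetic-likelihood mean by a multiple of each summary's standard deviation, with a sparsity-inducing Laplace prior keeping the adjustment at zero unless the data demand otherwise; the function names, the `scale` default, and this exact parameterisation are our assumptions, not the paper's code:

```python
import numpy as np

def log_adjusted_synthetic_likelihood(s_obs, S, gamma):
    """Mean-adjusted Gaussian synthetic likelihood (illustrative sketch).

    S     : (n_sim, d) summaries simulated at the current theta
    gamma : (d,) adjustment parameters; gamma shifts the model mean by
            gamma times each summary's standard deviation, letting it
            absorb incompatibility between model and observed summaries.
    """
    mu = S.mean(axis=0)
    Sigma = np.atleast_2d(np.cov(S, rowvar=False))
    sd = np.sqrt(np.diag(Sigma))
    mu_adj = mu + sd * gamma                    # adjusted mean
    diff = s_obs - mu_adj
    _, logdet = np.linalg.slogdet(Sigma)
    quad = diff @ np.linalg.solve(Sigma, diff)
    d = len(s_obs)
    return -0.5 * (d * np.log(2 * np.pi) + logdet + quad)

def log_laplace_prior(gamma, scale=0.5):
    # Shrinkage prior: keeps gamma near zero so the adjustment only
    # activates when a summary is genuinely incompatible with the model.
    return np.sum(-np.abs(gamma) / scale - np.log(2 * scale))
```

With `gamma = 0` this reduces to the plain synthetic likelihood; when one observed summary cannot be matched by the model, a nonzero component of `gamma` absorbs the discrepancy so the remaining summaries still drive the inference.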