2021
DOI: 10.1214/20-ba1200

Parallel Gaussian Process Surrogate Bayesian Inference with Noisy Likelihood Evaluations

Abstract: We consider Bayesian inference when only a limited number of noisy log-likelihood evaluations can be obtained. This occurs for example when complex simulator-based statistical models are fitted to data, and the synthetic likelihood (SL) method is used to form the noisy log-likelihood estimates using computationally costly forward simulations. We frame the inference task as a sequential Bayesian experimental design problem, where the log-likelihood function is modelled with a hierarchical Gaussian process (GP) surrogate…
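
The abstract describes forming noisy log-likelihood estimates with the synthetic likelihood and modelling them with a GP surrogate. The snippet below is a minimal sketch of that general idea only, not the paper's hierarchical GP or its experimental-design criteria: a hypothetical toy simulator, a synthetic log-likelihood estimated from repeated forward simulations, and a scikit-learn GP fitted to the noisy evaluations. The simulator, summary statistic, and all numerical settings are illustrative assumptions.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

# Hypothetical toy simulator: one scalar summary statistic of data generated at parameter theta.
def simulate_summary(theta, rng, n_obs=50):
    return rng.normal(loc=theta, scale=1.0, size=n_obs).mean()

def synthetic_loglik(theta, observed_summary, rng, n_sim=20):
    """Noisy synthetic log-likelihood: Gaussian log-density of the observed summary
    under the mean and variance of summaries simulated at theta."""
    sims = np.array([simulate_summary(theta, rng) for _ in range(n_sim)])
    mu, var = sims.mean(), sims.var(ddof=1)
    return -0.5 * (np.log(2.0 * np.pi * var) + (observed_summary - mu) ** 2 / var)

rng = np.random.default_rng(0)
observed_summary = 1.3  # assumed "observed" data summary

# Evaluate the noisy log-likelihood on a coarse grid of parameter values.
thetas = np.linspace(-2.0, 4.0, 15).reshape(-1, 1)
loglik = np.array([synthetic_loglik(t.item(), observed_summary, rng) for t in thetas])

# GP surrogate of the log-likelihood; the WhiteKernel term absorbs the evaluation noise.
gp = GaussianProcessRegressor(kernel=RBF(length_scale=1.0) + WhiteKernel(), normalize_y=True)
gp.fit(thetas, loglik)

# Surrogate mean and uncertainty of the log-likelihood on a fine grid; exponentiating the
# mean (together with a prior) would give an approximate unnormalised posterior.
grid = np.linspace(-2.0, 4.0, 200).reshape(-1, 1)
mean, std = gp.predict(grid, return_std=True)
```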

Cited by 26 publications (58 citation statements). References 43 publications.

“…Since standard ABC is notoriously slow and requires tuning of some hyperparameters, there has been considerable research in making likelihood-free inference more efficient, using, for example, ideas from Bayesian optimisation and experimental design (Gutmann and Corander, 2016; Järvenpää et al., 2019, 2020), conditional density estimation (Papamakarios and Murray, 2016; Lueckmann et al., 2017; Greenberg et al., 2019), classification (Gutmann et al., 2018), indirect inference (Drovandi et al., 2015), optimisation (Meeds and Welling, 2015; Ikonomov and Gutmann, 2020), and more broadly surrogate modelling with Gaussian processes (Wilkinson, 2014; Meeds and Welling, 2015) and neural networks (Blum and Francois, 2010; Chen and Gutmann, 2019; Papamakarios et al., 2019).…”
Section: Likelihood-free Inference
confidence: 99%
“…Likelihood-free inference has gained much traction recently, with many methods leveraging advances in machine learning (e.g. Gutmann and Corander, 2016; Lueckmann et al., 2017; Järvenpää et al., 2018; Chen and Gutmann, 2019; Papamakarios et al., 2019; Thomas et al., 2020; Järvenpää et al., 2021). Ultimately, however, the quality of the statistical inference within a scientific downstream task depends on the data that are available in the first place.…”
Section: Introduction
confidence: 99%
“…Besides deep learning based approaches, another major machine learning inspired branch of the ABC literature concerns log-likelihood and posterior approximations via Gaussian Process Surrogates (GPSs) (Meeds and Welling, 2014; Järvenpää et al., 2018, 2021; Acerbi, 2020). A major benefit of GPSs lies in the ability for clever training data selection via active learning, since such GPSs allow uncertainty quantification out of the box, which in turn can be utilized to target high-uncertainty regions in parameter space.…”
confidence: 99%
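
The statement above points to a key advantage of GP surrogates: their built-in uncertainty quantification can drive active learning, steering new (costly) simulations towards high-uncertainty regions of parameter space. The sketch below illustrates that idea with plain uncertainty sampling, i.e. evaluating next wherever the surrogate's predictive standard deviation is largest; this is a simplification of the acquisition rules used in the cited works, and the stand-in noisy log-likelihood and all settings are hypothetical.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

def pick_next_theta(gp, candidates):
    """Uncertainty sampling: return the candidate parameter where the GP surrogate's
    predictive standard deviation is largest."""
    _, std = gp.predict(candidates, return_std=True)
    return candidates[np.argmax(std)]

rng = np.random.default_rng(1)

# Hypothetical stand-in for a costly, noisy log-likelihood evaluation.
def noisy_loglik(theta):
    return -0.5 * (theta - 1.0) ** 2 + 0.1 * rng.standard_normal()

# A few initial evaluations, then repeatedly add the most uncertain candidate point.
X = np.array([[-2.0], [0.0], [3.0]])
y = np.array([noisy_loglik(x.item()) for x in X])
candidates = np.linspace(-3.0, 4.0, 200).reshape(-1, 1)

for _ in range(10):
    gp = GaussianProcessRegressor(kernel=RBF() + WhiteKernel(), normalize_y=True)
    gp.fit(X, y)
    theta_next = pick_next_theta(gp, candidates)
    X = np.vstack([X, theta_next.reshape(1, -1)])
    y = np.append(y, noisy_loglik(theta_next.item()))
```

In the works cited above, the acquisition step typically weighs the surrogate uncertainty by its effect on the resulting posterior approximation rather than picking the raw maximum, but the loop structure is the same: fit the surrogate, choose the next simulation, and update.
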
“…Illustrations of BOLFI are in Sections 6.1 and 6.2. A more recent contribution, exploiting GPs to predict a log-SL, is in Järvenpää et al. (2020).…”
Section: Algorithmic Initialization Using BOLFI and ELFI
confidence: 99%