2021
DOI: 10.48550/arxiv.2102.06521
Preprint

Robust and integrative Bayesian neural networks for likelihood-free parameter inference

Abstract: State-of-the-art neural network-based methods for learning summary statistics have delivered promising results for simulation-based likelihood-free parameter inference. Existing approaches require density estimation as a post-processing step building upon deterministic neural networks, and do not take network prediction uncertainty into account. This work proposes a robust integrated approach that learns summary statistics using Bayesian neural networks, and directly estimates the posterior density using catego…
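
As a rough illustration of the idea sketched in the (truncated) abstract, the minimal PyTorch snippet below maps a raw simulated trajectory to low-dimensional summary statistics and a categorical distribution over discretized parameter values, with Monte Carlo dropout standing in for the Bayesian treatment of network weights. The layer sizes, bin count, dropout rate, and the dropout-based approximation are all assumptions for illustration, not the authors' implementation.

```python
# Illustrative sketch only (assumed architecture, not the paper's code):
# trajectory -> learned summary statistics -> categorical posterior over parameter bins,
# with MC dropout used as a simple stand-in for Bayesian weight uncertainty.
import torch
import torch.nn as nn
import torch.nn.functional as F


class BayesianSummaryPosterior(nn.Module):
    def __init__(self, ts_length: int, n_summaries: int = 10, n_bins: int = 50):
        super().__init__()
        self.summary_net = nn.Sequential(          # raw trajectory -> summaries
            nn.Linear(ts_length, 128), nn.ReLU(), nn.Dropout(0.2),
            nn.Linear(128, n_summaries),
        )
        self.posterior_head = nn.Sequential(       # summaries -> categorical logits
            nn.Linear(n_summaries, 128), nn.ReLU(), nn.Dropout(0.2),
            nn.Linear(128, n_bins),
        )

    def forward(self, x):
        return self.posterior_head(self.summary_net(x))  # unnormalised logits


def mc_posterior(model, x, n_samples: int = 100):
    """Average softmax outputs over stochastic forward passes (MC dropout)."""
    model.train()                                  # keep dropout active at test time
    with torch.no_grad():
        probs = torch.stack([F.softmax(model(x), dim=-1) for _ in range(n_samples)])
    return probs.mean(0), probs.std(0)             # posterior estimate and its spread


if __name__ == "__main__":
    model = BayesianSummaryPosterior(ts_length=200)
    x_obs = torch.randn(1, 200)                    # placeholder "observed" trajectory
    mean_probs, std_probs = mc_posterior(model, x_obs)
    print(mean_probs.shape, std_probs.shape)       # torch.Size([1, 50]) each
```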

Cited by 1 publication (2 citation statements)
References 21 publications (30 reference statements)

“…For many dynamical models, the simulation output is high-dimensional, and the summary statistics are used as a dimension reduction technique for faster training (Sisson et al., 2018; Wood, 2010; Wrede et al., 2021). Importantly, reducing the high-dimensional data to low-dimensional summary statistics makes inference possible when the likelihood is intractable.…”
Section: Methods
confidence: 99%
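
The dimension reduction this quote describes can be pictured with a small, purely illustrative sketch (not taken from the cited papers): a long simulated trajectory is collapsed into a handful of summary statistics before any inference step. The particular statistics chosen here are assumptions.

```python
# Assumed toy example of summary-statistic dimension reduction for a simulator output.
import numpy as np


def summary_statistics(trajectory: np.ndarray) -> np.ndarray:
    """Reduce a 1-D simulator trajectory to a fixed, low-dimensional vector."""
    x = np.asarray(trajectory, dtype=float)
    lag1_autocorr = np.corrcoef(x[:-1], x[1:])[0, 1]
    return np.array([
        x.mean(),                 # average level
        x.std(),                  # variability
        x.min(), x.max(),         # range of the dynamics
        lag1_autocorr,            # short-range temporal dependence
    ])


rng = np.random.default_rng(0)
# Placeholder for a high-dimensional simulator output (e.g. 1000 time points).
sim_output = np.cumsum(rng.normal(size=1000))
s = summary_statistics(sim_output)
print(sim_output.shape, "->", s.shape)   # (1000,) -> (5,)
```
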
“…The core of the methodology only requires forward simulations from a computer program implementing a parametric stochastic simulator (also referred to as a generative model), rather than model-specific analytic calculation or exact evaluation of the likelihood function (Beaumont, 2010; Lueckmann et al., 2021; Papamakarios et al., 2019a). SBI is a method for diverse scientific applications where (i) a forward model (simulator) is available, (ii) the likelihood is intractable, and (iii) it is important to achieve an accurate approximation with the right amount of uncertainty. In practice, the traditional approximate Bayesian computation (ABC) methods (Beaumont et al., 2002; Sunnaker et al., 2013; Sisson et al., 2018) for posterior estimation suffer from the curse of dimensionality, and their performance depends critically on the tolerance level in the accept/reject parameter setting (Cranmer et al., 2020; Wrede et al., 2021). An alternative approach is to utilize ANNs either to estimate the posterior directly, bypassing the need for MCMC (Papamakarios and Murray, 2016; Lueckmann et al., 2017; Greenberg et al., 2019), or to use synthesized likelihoods or density ratios, which require MCMC sampling or training classifiers to extract information from the posterior (Papamakarios et al., 2019b; Hermans et al., 2020; Durkan et al., 2020).…”
Section: Introduction
confidence: 99%
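
The tolerance-based accept/reject step criticized in this quote is the core of rejection ABC. The sketch below uses an assumed toy Gaussian simulator (not from the cited works) to show how only parameters whose simulated summaries fall within a tolerance eps of the observed summaries are retained as posterior samples.

```python
# Minimal rejection-ABC sketch on an assumed toy model.
import numpy as np

rng = np.random.default_rng(1)


def simulator(theta: float, n: int = 200) -> np.ndarray:
    """Toy stochastic simulator: Gaussian noise around an unknown mean."""
    return rng.normal(loc=theta, scale=1.0, size=n)


def summaries(x: np.ndarray) -> np.ndarray:
    return np.array([x.mean(), x.std()])


def rejection_abc(x_obs: np.ndarray, n_draws: int = 50_000, eps: float = 0.1) -> np.ndarray:
    s_obs = summaries(x_obs)
    accepted = []
    for _ in range(n_draws):
        theta = rng.uniform(-5.0, 5.0)            # draw from a uniform prior
        s_sim = summaries(simulator(theta))
        if np.linalg.norm(s_sim - s_obs) < eps:   # tolerance-based acceptance
            accepted.append(theta)
    return np.array(accepted)


x_obs = simulator(theta=1.5)                      # pretend this is the observed data
samples = rejection_abc(x_obs)
if samples.size:
    print(f"accepted {samples.size} draws; posterior mean ~ {samples.mean():.2f}")
```

The choice of eps makes the trade-off explicit: a tighter tolerance gives a more accurate posterior approximation but accepts fewer draws, which is exactly the sensitivity the quoted passage points out.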