2022
DOI: 10.48550/arxiv.2205.15784
Preprint

Likelihood-Free Inference with Generative Neural Networks via Scoring Rule Minimization

Abstract: Bayesian Likelihood-Free Inference methods yield posterior approximations for simulator models with intractable likelihood. Recently, many works trained neural networks to approximate either the intractable likelihood or the posterior directly. Most proposals use normalizing flows, namely neural networks parametrizing invertible maps used to transform samples from an underlying base measure; the probability density of the transformed samples is then accessible, and the normalizing flow can be trained via maximum likelihood…
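
To make the idea concrete, here is a minimal sketch (not the authors' implementation): a conditional generative network is trained to approximate the posterior p(theta | y) by minimizing the energy score, a strictly proper scoring rule, over simulated (theta, y) pairs. The architecture, the toy Gaussian simulator and all hyperparameters below are illustrative assumptions.

# A minimal sketch, not the paper's code: scoring rule minimization for a generative net.
import torch
import torch.nn as nn

class Generator(nn.Module):
    # Maps (noise z, data y) to one approximate posterior draw of theta.
    def __init__(self, y_dim=1, z_dim=4, theta_dim=1, hidden=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(y_dim + z_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, theta_dim),
        )

    def forward(self, z, y):
        return self.net(torch.cat([z, y], dim=-1))

def energy_score(samples, target, beta=1.0):
    # Unbiased Monte Carlo estimate of the energy score of m model draws against target.
    m = samples.shape[0]
    term1 = (samples - target).norm(dim=-1).pow(beta).mean()
    diffs = samples.unsqueeze(0) - samples.unsqueeze(1)           # (m, m, d)
    dists = (diffs.pow(2).sum(-1) + 1e-12).sqrt().pow(beta)       # eps avoids NaN gradients
    off_diag = dists.sum() - dists.diagonal().sum()
    return term1 - off_diag / (2 * m * (m - 1))

def sample_prior(n):                      # theta ~ N(0, 1)
    return torch.randn(n, 1)

def simulate(theta):                      # y | theta ~ N(theta, 0.5^2), a toy simulator
    return theta + 0.5 * torch.randn_like(theta)

gen = Generator()
opt = torch.optim.Adam(gen.parameters(), lr=1e-3)

for step in range(2000):
    theta = sample_prior(16)              # draw parameters from the prior
    y = simulate(theta)                   # simulate one data set per parameter
    loss = torch.zeros(())
    for i in range(theta.shape[0]):       # energy score for each (theta_i, y_i) pair
        z = torch.randn(10, 4)
        draws = gen(z, y[i].expand(10, 1))
        loss = loss + energy_score(draws, theta[i])
    loss = loss / theta.shape[0]
    opt.zero_grad(); loss.backward(); opt.step()

Because the energy score is estimated purely from draws of the network, the generator never needs to expose a tractable density, which is, roughly, what allows this approach to relax the invertibility requirement of normalizing flows.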


Cited by 2 publications (2 citation statements)
References 13 publications
“…It is important to note that the computation of the maximum likelihood estimator of the kernel parameters is infeasible, as we do not have knowledge of f*_{i,s}(y*_{i,s}) and F*_{i,s}(y*_{i,s}) for y*_{i,s} ≤ 0, ∀i ∈ I. As a result, we propose an innovative estimation methodology based on the minimum Scoring Rules estimator [Dawid et al., 2016; Pacchiardi and Dutta, 2022b], which requires only simulations from the censored latent Gaussian copula.…”
Section: Statistical Problem
confidence: 99%
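
As a rough illustration of how a minimum scoring-rule estimator can work with simulator draws alone, the sketch below fits a single parameter of a toy censored Gaussian model (an assumed stand-in, not the censored latent Gaussian copula of the citing paper) by minimizing a Monte Carlo estimate of the energy score over a parameter grid:

# Illustrative sketch only: minimum scoring-rule estimation from simulations.
import numpy as np

rng = np.random.default_rng(0)

def simulate(theta, n):
    # Hypothetical simulator with an awkward likelihood: a Gaussian censored at zero.
    return np.maximum(rng.normal(theta, 1.0, size=n), 0.0)

def energy_score(sims, y_obs, beta=1.0):
    # Energy score of the simulated sample, averaged over the observations.
    m = len(sims)
    t1 = np.mean(np.abs(sims[:, None] - y_obs[None, :]) ** beta)
    t2 = np.abs(sims[:, None] - sims[None, :]) ** beta
    return t1 - t2.sum() / (2 * m * (m - 1))

y_obs = simulate(1.5, n=200)                       # pretend these are the observed data
grid = np.linspace(0.0, 3.0, 61)
scores = [energy_score(simulate(t, 500), y_obs) for t in grid]
theta_hat = grid[int(np.argmin(scores))]           # minimum scoring-rule estimate
print(f"estimated theta: {theta_hat:.2f}")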
“…Osokin et al. [38] used a classifier two-sample test to evaluate their GAN model, which was designed to produce synthetic medical images. Liu et al. [39] and Pacchiardi et al. [40] used C2ST for anomaly detection in videos. Two samples are selected from distributions G and R, which represent the generated and reference distributions, respectively.…”
Section: Classifier Two-Sample Test
confidence: 99%
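
For context on the quoted passage, a classifier two-sample test (C2ST) can be run in a few lines: train a classifier to tell samples from G and R apart, and read held-out accuracy near 0.5 as evidence that the two distributions are indistinguishable. The Gaussian samples standing in for G and R below are illustrative assumptions, not data from any of the cited works.

# Minimal C2ST sketch with a logistic-regression classifier.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
G = rng.normal(0.0, 1.0, size=(1000, 2))   # stand-in for the generated distribution
R = rng.normal(0.2, 1.0, size=(1000, 2))   # stand-in for the reference distribution

X = np.vstack([G, R])
y = np.concatenate([np.zeros(len(G)), np.ones(len(R))])
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.5, random_state=0)

clf = LogisticRegression().fit(X_tr, y_tr)
print(f"C2ST held-out accuracy: {clf.score(X_te, y_te):.3f} (0.5 = indistinguishable)")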