2021
DOI: 10.48550/arxiv.2103.04177
Preprint
Metropolis-Hastings via Classification

Abstract: This paper develops a Bayesian computational platform at the interface between posterior sampling and optimization in models whose marginal likelihoods are difficult to evaluate. Inspired by adversarial optimization, namely Generative Adversarial Networks (GAN) [18], we reframe the likelihood function estimation problem as a classification problem. Pitting a Generator, who simulates fake data, against a Classifier, who tries to distinguish them from the real data, one obtains likelihood (ratio) estimators which…
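The classification trick summarized in the abstract rests on the standard density-ratio identity: if a classifier outputs D(x) ≈ P(real | x) on a balanced mix of real and generated data, then the logit of D(x) estimates log p_real(x) − log p_fake(x), which can then stand in for an intractable likelihood ratio (for example inside a Metropolis-Hastings acceptance probability). Below is a minimal, self-contained sketch of that identity, not the paper's actual algorithm: the two Gaussians, the logistic-regression classifier, and all function names are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-ins: "real" data from N(0, 1), "fake" data from N(1, 1).
real = rng.normal(0.0, 1.0, size=5000)
fake = rng.normal(1.0, 1.0, size=5000)

# Features [1, x, x^2]: for two Gaussians the true log density ratio is a
# polynomial in x, so a logistic model on these features can represent it.
def feats(x):
    x = np.atleast_1d(x)
    return np.stack([np.ones_like(x), x, x**2], axis=1)

X = np.vstack([feats(real), feats(fake)])
y = np.concatenate([np.ones(len(real)), np.zeros(len(fake))])  # 1 = real

# Plain gradient-descent logistic regression (no external dependencies).
w = np.zeros(3)
for _ in range(5000):
    p = 1.0 / (1.0 + np.exp(-X @ w))          # classifier output D(x)
    w -= 0.1 * (X.T @ (p - y)) / len(y)       # average NLL gradient step

# Density-ratio trick: logit(D(x)) estimates log p_real(x) - log p_fake(x).
def est_log_ratio(x):
    return feats(x) @ w

def true_log_ratio(x):
    # log N(x; 0, 1) - log N(x; 1, 1) = 0.5 - x
    return -0.5 * x**2 + 0.5 * (x - 1.0)**2

print(est_log_ratio(0.5), true_log_ratio(0.5))  # both close to 0
```

In a sampler, an estimator like `est_log_ratio` would replace the unavailable log-likelihood ratio in the acceptance step; the paper's contribution concerns when and how such classifier-based estimators are accurate enough for valid posterior sampling.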

Cited by 1 publication (2 citation statements) | References 32 publications
"…From Lemma 4.1, we have θ*_w = θ_0 for any w. Thus, from Assumptions 2 and 3 and Theorem 11.1 in Kaji and Rockova (2021), we have for every sequence of constants M_n → ∞, as n → ∞,…"
Section: A Proofs
Confidence: 86%
"…Following Kaji and Rockova (2021), for any fixed w, π_w(θ | X^(n)) can be viewed as the posterior density under a mis-specified likelihood…"
Section: Theory
Confidence: 99%