2022
DOI: 10.1007/s11222-022-10159-2
Approximate Laplace importance sampling for the estimation of expected Shannon information gain in high-dimensional Bayesian design for nonlinear models

Abstract: One of the major challenges in Bayesian optimal design is to approximate the expected utility function in an accurate and computationally efficient manner. We focus on Shannon information gain, one of the most widely used utilities when the experimental goal is parameter inference. We compare the performance of various methods for approximating expected Shannon information gain in common nonlinear models from the statistics literature, with a particular emphasis on Laplace importance sampling (LIS) and approxi…
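The expected Shannon information gain discussed in the abstract is commonly estimated with a double-loop (nested) Monte Carlo scheme, which the paper compares against Laplace-based alternatives. A minimal sketch of that baseline estimator, using a hypothetical one-parameter nonlinear model (the model, prior, and design value below are illustrative assumptions, not taken from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical nonlinear model: y ~ N(exp(-theta * d), sigma^2),
# with prior theta ~ N(1, 0.5^2) and a scalar design point d.
sigma = 0.1

def likelihood(y, theta, d):
    """Gaussian likelihood p(y | theta, d) for the toy model above."""
    mu = np.exp(-theta * d)
    return np.exp(-0.5 * ((y - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

def nmc_eig(d, n_outer=500, n_inner=500):
    """Nested Monte Carlo estimate of expected Shannon information gain at design d."""
    # Outer loop: draw (theta_n, y_n) pairs from the prior predictive.
    theta_out = rng.normal(1.0, 0.5, size=n_outer)
    y = rng.normal(np.exp(-theta_out * d), sigma)
    # Inner loop: fresh prior draws to estimate the evidence p(y_n | d).
    theta_in = rng.normal(1.0, 0.5, size=n_inner)
    log_lik = np.log(likelihood(y, theta_out, d))
    evidence = likelihood(y[:, None], theta_in[None, :], d).mean(axis=1)
    # EIG ~ (1/N) sum_n [ log p(y_n | theta_n, d) - log p_hat(y_n | d) ]
    return np.mean(log_lik - np.log(evidence))

print(nmc_eig(d=1.0))
```

Laplace importance sampling replaces the inner-loop prior draws with draws from a Gaussian (Laplace) approximation to the posterior, which reduces variance when the prior and posterior overlap poorly.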

Cited by 3 publications (3 citation statements)
References 22 publications (27 reference statements)
“…Therefore they should introduce a similar bias as incurred when using a well-trained MDN predicting one Gaussian with full covariance matrix. The same goes for the consistent extensions of both methods, the VNMC method of Foster et al (2019a) and the Laplace-based importance sampling estimator of Carlon et al (2020); Englezou et al (2022). Unlike the variational methods, the Laplace-based methods are inherently restricted to a Gaussian posterior pdf.…”
Section: Discussion
confidence: 90%
“…While functional approximations introduce additional complexity compared to straightforward double-loop Monte Carlo estimators such as the NMC method, they have significant advantages, especially since the NMC method with reused samples, while working well in the presented examples, can perform suboptimally when compared to methods using functional approximations (Englezou et al, 2022). Most importantly, they allow the design of experiments best suited to answer any scientific or applied question, provided a mapping from model space to the relevant target space can be defined.…”
Section: Discussion
confidence: 99%
“…The first step in building a decision tree is to determine the best attribute to split the data. This is typically done by using a measure of impurity like the Gini index [26,27] or information gain [27,28]. For example, information gain can be calculated as follows:…”
confidence: 99%
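The truncated quotation above refers to the entropy-based information gain used to choose decision-tree splits. A minimal illustrative sketch of that calculation (the function names and example labels are my own, not from the cited work):

```python
import numpy as np

def entropy(labels):
    """Shannon entropy of a label array, in bits."""
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def information_gain(parent, left, right):
    """Entropy reduction from splitting `parent` into `left` and `right`."""
    n = len(parent)
    weighted = (len(left) / n) * entropy(left) + (len(right) / n) * entropy(right)
    return entropy(parent) - weighted

# A split that perfectly separates the two classes gains the full 1 bit.
parent = np.array([0, 0, 1, 1])
left, right = parent[:2], parent[2:]
print(information_gain(parent, left, right))  # → 1.0
```

The Gini index mentioned alongside it is an alternative impurity measure, 1 − Σ p² over class proportions p, used in the same weighted-reduction scheme.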