2019
DOI: 10.3390/e21111081
The Connection between Bayesian Inference and Information Theory for Model Selection, Information Gain and Experimental Design

Abstract: We show a link between Bayesian inference and information theory that is useful for model selection, assessment of information entropy and experimental design. We align Bayesian model evidence (BME) with relative entropy and cross entropy in order to simplify computations using prior-based (Monte Carlo) or posterior-based (Markov chain Monte Carlo) BME estimates. On the one hand, we demonstrate how Bayesian model selection can profit from information theory to estimate BME values via posterior-based techniques…
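As a rough illustration of the prior-based (Monte Carlo) BME estimate mentioned in the abstract, the sketch below uses an assumed conjugate Gaussian toy problem (the datum `d`, noise level `sigma`, and standard-normal prior are placeholders, not taken from the paper), so the Monte Carlo estimate can be checked against the closed-form evidence:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy setup (assumed, not from the paper): scalar parameter
# theta with a standard-normal prior and a Gaussian likelihood for one datum.
d = 1.2       # observed datum (placeholder)
sigma = 0.5   # observation noise standard deviation (placeholder)

def likelihood(theta):
    """Gaussian likelihood p(d | theta) with noise std sigma."""
    return np.exp(-0.5 * ((d - theta) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

# Prior-based Monte Carlo estimate of the Bayesian model evidence:
#   BME = ∫ p(d|θ) p(θ) dθ  ≈  (1/N) Σ_i p(d|θ_i),   θ_i ~ prior
theta_prior = rng.normal(0.0, 1.0, size=100_000)
bme_mc = likelihood(theta_prior).mean()

# Analytic check: with an N(0,1) prior and N(θ, σ²) likelihood,
# the evidence is the Gaussian density N(d; 0, 1 + σ²).
bme_exact = np.exp(-0.5 * d**2 / (1 + sigma**2)) / np.sqrt(2 * np.pi * (1 + sigma**2))
```

In this conjugate setting the two numbers agree to Monte Carlo accuracy; for realistic models the exact check is unavailable, which is what motivates the information-theoretic estimators discussed in the paper.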

Cited by 21 publications (61 citation statements)
References 54 publications (133 reference statements)
“…Estimating the relative entropy in Equation (13) usually requires a multidimensional integration that is often infeasible for most applied problems. However, employing Equation (13) and definition (5) from the recent findings in the paper [40], we avoid this multidimensional integration: …”
Section: Bayesian Inference With Information Theory For a Gaussian
confidence: 99%
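The identity alluded to in this citation statement relates relative entropy to posterior expectations, D_KL(posterior ‖ prior) = E_post[ln p(d|θ)] − ln BME, so the multidimensional integral over the parameter space is replaced by a sample average of log-likelihoods. The following sketch verifies this on an assumed conjugate Gaussian toy problem (all numbers are placeholders) where both sides are available in closed form:

```python
import numpy as np

rng = np.random.default_rng(1)

# Assumed toy problem (not from the paper): N(0,1) prior, Gaussian
# likelihood with noise std sigma for a single datum d. The posterior
# and evidence are then Gaussian and known in closed form.
d, sigma = 1.2, 0.5
post_var = 1.0 / (1.0 + 1.0 / sigma**2)
post_mean = post_var * d / sigma**2
theta_post = rng.normal(post_mean, np.sqrt(post_var), size=200_000)

# Posterior-sample average of the log-likelihood (an MCMC-style estimate).
log_lik = -0.5 * ((d - theta_post) / sigma) ** 2 - np.log(sigma * np.sqrt(2 * np.pi))
log_bme = (-0.5 * d**2 / (1 + sigma**2)
           - 0.5 * np.log(2 * np.pi * (1 + sigma**2)))

# Relative entropy without any explicit multidimensional integration:
#   D_KL(post || prior) = E_post[ln p(d|θ)] − ln BME
dkl = log_lik.mean() - log_bme

# Closed-form KL divergence between N(post_mean, post_var) and N(0, 1).
dkl_exact = 0.5 * (post_var + post_mean**2 - 1.0 - np.log(post_var))
```

The sample-based `dkl` matches `dkl_exact` up to Monte Carlo error, illustrating why posterior-based estimators sidestep the integration that the quoted passage calls infeasible.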
“…Therefore, employing Equation (15), information entropy can be directly estimated according to Equation (A3) in the paper [40] using Monte Carlo sampling techniques on the GPE: …”
Section: Bayesian Inference With Information Theory For a Gaussian
confidence: 99%
“…Hence, analyzing the regional information entropy relationship can provide us with a decision basis for screening information. As a way of measuring the information content, information entropy S [34] is given by S = −∫ p(x) ln p(x) dx, where x is the output of the system and p(x) is the probability distribution function (PDF) of x. There is a positive correlation between S and information content.…”
Section: The Regional Surrogate Model Technique Based On the Regio
confidence: 99%
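The entropy in this last citation statement, S = −∫ p(x) ln p(x) dx, is exactly the quantity that Monte Carlo sampling estimates as S ≈ −(1/N) Σ ln p(x_i) with x_i drawn from p. A minimal sketch, assuming a Gaussian p(x) purely for illustration (its entropy has the known closed form ½ ln(2πe σ²)):

```python
import numpy as np

rng = np.random.default_rng(2)

# Assumed illustrative density: p(x) = N(0, sigma²). The true entropy
# of a Gaussian is 0.5 * ln(2*pi*e*sigma²), which lets us check the
# Monte Carlo estimator S ≈ -(1/N) Σ ln p(x_i), x_i ~ p.
sigma = 2.0
x = rng.normal(0.0, sigma, size=200_000)

log_p = -0.5 * (x / sigma) ** 2 - np.log(sigma * np.sqrt(2 * np.pi))
S_mc = -log_p.mean()                              # Monte Carlo entropy estimate
S_exact = 0.5 * np.log(2 * np.pi * np.e * sigma**2)  # closed-form Gaussian entropy
```

For a surrogate model (such as the GPE mentioned above), one would replace the analytic `log_p` with log-densities evaluated on samples drawn from the surrogate, which is what makes the estimator cheap relative to direct numerical integration.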