2020
DOI: 10.3390/e22010108

Generalizing Information to the Evolution of Rational Belief

Abstract: Information theory provides a mathematical foundation to measure uncertainty in belief. Belief is represented by a probability distribution that captures our understanding of an outcome's plausibility. Information measures based on Shannon's concept of entropy include realization information, Kullback-Leibler divergence, Lindley's information in experiment, cross entropy, and mutual information. We derive a general theory of information from first principles that accounts for evolving belief and recovers all of…
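
The abstract lists several measures built on Shannon entropy. As a point of reference only, the sketch below (not from the paper; the distributions and values are assumed) computes entropy, cross entropy, and Kullback-Leibler divergence for two hypothetical discrete beliefs and checks the identity H(p, q) = H(p) + D(p‖q).

```python
# Minimal sketch (values assumed, not from the paper): Shannon-entropy-based
# measures named in the abstract, for two hypothetical discrete beliefs.
import numpy as np

p = np.array([0.5, 0.3, 0.2])  # belief before updating (assumed)
q = np.array([0.7, 0.2, 0.1])  # a second, updated belief (assumed)

entropy = -np.sum(p * np.log(p))           # Shannon entropy H(p)
cross_entropy = -np.sum(p * np.log(q))     # cross entropy H(p, q)
kl_divergence = np.sum(p * np.log(p / q))  # Kullback-Leibler divergence D(p || q)

# Cross entropy decomposes as H(p, q) = H(p) + D(p || q).
assert np.isclose(cross_entropy, entropy + kl_divergence)
print(entropy, cross_entropy, kl_divergence)
```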

Cited by 11 publications (5 citation statements)
References: 43 publications

Citation statements (ordered by relevance):
“…With the same diagonal structure of P_t, we considered the general family of f-divergences to measure the difference between p and q. In [7], the notions of information … The scaling is with respect to the standard deviation from the covariance matrix. We see some states are not converging because their error relative to the standard deviation is growing and approaches 60σ by the end of the simulation.…”
Section: Discrete-time VI Filter
confidence: 99%
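The statement above refers to the general family of f-divergences for comparing p and q. The following is a minimal sketch, not the cited filter's implementation, of the standard discrete form D_f(p‖q) = Σ_i q_i f(p_i/q_i), instantiated with a few common generator functions; the distributions are assumed for illustration.

```python
# Minimal sketch (assumed distributions, not the cited filter's code): the
# discrete f-divergence D_f(p || q) = sum_i q_i * f(p_i / q_i) for a few
# standard choices of the generator function f.
import numpy as np

def f_divergence(p, q, f):
    """Compute D_f(p || q) for discrete distributions p and q."""
    t = p / q
    return np.sum(q * f(t))

p = np.array([0.5, 0.3, 0.2])  # hypothetical distributions
q = np.array([0.6, 0.3, 0.1])

kl = f_divergence(p, q, lambda t: t * np.log(t))          # f(t) = t log t   -> KL(p || q)
tv = f_divergence(p, q, lambda t: 0.5 * np.abs(t - 1.0))  # f(t) = |t - 1|/2 -> total variation
chi2 = f_divergence(p, q, lambda t: (t - 1.0) ** 2)       # f(t) = (t - 1)^2 -> Pearson chi-squared
print(kl, tv, chi2)
```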
“…We show how our theory of information (Duersch and Catanach, 2020), Theorem 1, allows us to derive a training objective from the information that is created when we select a prior representation, observe the training data, and either infer the posterior distribution or construct a variational approximation of it. Zhang et al (2018) provide a thorough survey of recent work on variational inference.…”
Section: Our Contributions
confidence: 99%
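The statement above describes deriving a training objective from the information created by choosing a prior, observing the training data, and variationally approximating the posterior. As a generic illustration only (assumed model, not the objective derived from Theorem 1), the sketch below evaluates a negative evidence lower bound that pairs an expected negative log-likelihood with a KL term to the prior.

```python
# Minimal sketch (assumed Gaussian model, not the cited paper's objective):
# a Monte Carlo estimate of the negative evidence lower bound (ELBO) for a
# Gaussian variational posterior over the mean of the data, standard normal prior.
import numpy as np

rng = np.random.default_rng(0)
data = rng.normal(loc=1.0, scale=1.0, size=50)  # hypothetical observations

def negative_elbo(mu_q, log_sigma_q, n_samples=256):
    sigma_q = np.exp(log_sigma_q)
    theta = rng.normal(mu_q, sigma_q, size=n_samples)  # samples from q(theta)
    # Expected negative log-likelihood under q (unit observation noise assumed).
    nll = np.mean([0.5 * np.sum((data - t) ** 2) for t in theta])
    # KL(q || prior) in closed form for univariate Gaussians, prior = N(0, 1).
    kl = np.log(1.0 / sigma_q) + 0.5 * (sigma_q ** 2 + mu_q ** 2) - 0.5
    return nll + kl

print(negative_elbo(mu_q=0.9, log_sigma_q=-2.0))
```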
“…Information-theoretic formulations of complexity can be traced back to Shannon (1948) and the concept of entropy as a measure of the uncertainty associated with sequences of discrete symbols that may be transmitted over a communication channel. In order to rigorously understand how information in our datasets relates to Bayesian inference and encoding complexity, we developed a theory of information (Duersch and Catanach, 2020) rooted in understanding information as an expectation over rational belief.…”
Section: Controlling Complexity
confidence: 99%
“…If M_{m_l} no longer provides meaningful information at the next level, we use the next highest fidelity model in the algorithm, M_{m_{l+1}}. This criterion can be formulated using a generalization of information theory [49], where the information gained about the full posterior, p(θ | D, M_K), by moving from level l to l + 1 with model M_{m_l} is:…”
Section: Information-theoretic Criteria For Model Fidelity Adaptation
confidence: 99%
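The quoted criterion's equation is truncated above and is not reproduced here. Purely as an assumed illustration of an information-gain test between fidelity levels (not the expression from [49]), the sketch below measures the change between Gaussian posterior summaries at levels l and l + 1 with a KL divergence and compares it to a tolerance.

```python
# Illustrative sketch only (an assumption, not the criterion in [49]): treat the
# information gained from level l to l+1 as a KL divergence between Gaussian
# posterior approximations, and switch models when the gain becomes negligible.
import numpy as np

def gaussian_kl(mu_a, var_a, mu_b, var_b):
    """KL( N(mu_a, var_a) || N(mu_b, var_b) ) for univariate Gaussians."""
    return 0.5 * (np.log(var_b / var_a) + (var_a + (mu_a - mu_b) ** 2) / var_b - 1.0)

# Hypothetical posterior summaries for a parameter theta at two fidelities.
posterior_level_l = (0.8, 0.30)    # (mean, variance) under model M_{m_l}
posterior_level_l1 = (1.0, 0.10)   # (mean, variance) under model M_{m_{l+1}}

gain = gaussian_kl(*posterior_level_l1, *posterior_level_l)
threshold = 1e-2                    # assumed tolerance
if gain < threshold:
    print("Current model adds little information; move to the next fidelity model.")
else:
    print(f"Information gain {gain:.3f} nats; keep the current fidelity model.")
```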