2017
DOI: 10.3390/e19060273

Multiscale Information Theory and the Marginal Utility of Information

Abstract: Complex systems display behavior at a range of scales. Large-scale behaviors can emerge from the correlated or dependent behavior of individual small-scale components. To capture this observation in a rigorous and general way, we introduce a formalism for multiscale information theory. Dependent behavior among system components results in overlapping or shared information. A system's structure is revealed in the sharing of information across the system's dependencies, each of which has an associated scale. Cou…

Cited by 39 publications (48 citation statements)
References 95 publications
“…Here, since the I-diagrams are identical, so are the complexity profiles. The second profile is the marginal utility of information [56], which is a derivative of a linear programming problem whose constraints are given by the I-diagram, so here, again, they are identical. Finally, we have Schneidman et al.'s connected information [70], which comprises the differences in entropies of the maximum entropy distributions whose k- and (k−1)-way marginals are fixed to match those of the distribution of interest.…”
Section: Methods
confidence: 99%
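The connected information named in the excerpt above admits a compact statement. As a sketch (the notation $\tilde p^{(k)}$ for the maximum-entropy distribution whose $k$-way marginals match those of the distribution of interest $p$ is assumed here, not taken from the excerpt), the $k$-th connected information is the entropy drop between successive maximum-entropy fits, and the terms telescope to the total correlation:

\begin{align}
  I_C^{(k)} &= H\big[\tilde p^{(k-1)}\big] - H\big[\tilde p^{(k)}\big], \qquad k = 2, \dots, N, \\
  \sum_{k=2}^{N} I_C^{(k)} &= H\big[\tilde p^{(1)}\big] - H[p] = \sum_{i=1}^{N} H(X_i) - H(X_1, \dots, X_N),
\end{align}

using $\tilde p^{(N)} = p$ and the fact that the maximum-entropy distribution constrained only by the univariate marginals is the product of those marginals.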
“…There are measures [20,22,44,47–53] and expansions [54–56] purporting to measure or otherwise extract the complexity, magnitude or structure of dependencies within a multivariate distribution. Many of these techniques, including those just cited, are sums and differences of atoms in these information diagrams.…”
Section: Development
confidence: 99%
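To make "sums and differences of atoms" concrete, a standard three-variable example (a worked illustration, not a formula quoted from the cited papers): the co-information is an alternating sum of joint entropies and equals the central I-diagram atom, while the total correlation is a weighted sum over all atoms,

\begin{align}
  I(X;Y;Z) &= H(X) + H(Y) + H(Z) - H(X,Y) - H(X,Z) - H(Y,Z) + H(X,Y,Z) \\
           &= I(X;Y) - I(X;Y \mid Z), \\
  \mathrm{TC}(X,Y,Z) &= H(X) + H(Y) + H(Z) - H(X,Y,Z) = \sum_{S \neq \emptyset} \big(|S| - 1\big)\,\mu(S),
\end{align}

where $\mu(S)$ denotes the I-diagram atom of information shared by exactly the components in $S$.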
“…It includes methods of studying joint distributions, including information diagrams, connected informations (Schneidman et al. 2003; Amari 2001), marginal utility of information (Allen, Stacey, and Bar-Yam 2017), and the complexity profile (Y. Bar-Yam 2004).…”
mentioning
confidence: 99%
“…The issue begins with three papers which deal with the foundational aspects of information processing in complex systems [3–5]. The study of Allen et al. [3] describes two quantitative indices that summarize the structure of a complex system: (i) its complexity profile, based on the multivariate mutual information at a given scale or higher, and (ii) the marginal utility of information, characterizing the extent to which a system can be described using limited amounts of information. Information is understood to have a scale equal to the multiplicity (or redundancy) at which it arises, and so the analysis shows how these indices capture the multi-scale structure of complex systems.…”
mentioning
confidence: 99%
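A small numerical sketch may help make the complexity profile in this excerpt concrete. The code below is an illustrative assumption about the construction, not the authors' reference implementation: it estimates joint entropies from a list of equally likely outcomes, recovers the I-diagram atoms by inclusion-exclusion over subset entropies, treats each atom's scale as the number of components that share it (the "multiplicity" reading above), and reports C(k) as the total information at scale k or higher. The names entropy, i_diagram_atoms, and complexity_profile are made up for this sketch.

# Hedged sketch of a complexity profile from I-diagram atoms (not the paper's code).
from itertools import chain, combinations
from collections import Counter
from math import log2

def entropy(samples, subset):
    """Shannon entropy (bits) of the marginal over the component indices in
    `subset`, estimated from a list of equally likely joint outcomes."""
    counts = Counter(tuple(x[i] for i in subset) for x in samples)
    n = len(samples)
    return -sum(c / n * log2(c / n) for c in counts.values())

def powerset(items):
    s = list(items)
    return chain.from_iterable(combinations(s, r) for r in range(len(s) + 1))

def i_diagram_atoms(samples, n_vars):
    """Signed I-diagram atom for each non-empty subset S: the information shared
    by exactly the components in S, conditioned on the rest, obtained by
    inclusion-exclusion: atom(S) = -sum_{W subset of S} (-1)^{|W|} H(X_W, X_rest)."""
    atoms = {}
    for S in powerset(range(n_vars)):
        if not S:
            continue
        rest = tuple(i for i in range(n_vars) if i not in S)
        atoms[S] = -sum((-1) ** len(W) * entropy(samples, tuple(W) + rest)
                        for W in powerset(S))
    return atoms

def complexity_profile(samples, n_vars):
    """C(k): total information shared by at least k components (scale >= k)."""
    atoms = i_diagram_atoms(samples, n_vars)
    return [sum(v for S, v in atoms.items() if len(S) >= k)
            for k in range(1, n_vars + 1)]

# Two fully redundant bits: one bit of information at scale 2 -> C = [1.0, 1.0].
print(complexity_profile([(0, 0), (1, 1)], 2))
# Two independent bits: two bits at scale 1, nothing shared -> C = [2.0, 0.0].
print(complexity_profile([(0, 0), (0, 1), (1, 0), (1, 1)], 2))

The two toy systems show the separation the excerpt describes: fully redundant bits place all their information at scale 2, while independent bits place all of it at scale 1.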