2012
DOI: 10.1016/j.physleta.2011.10.066
A measure of statistical complexity based on predictive information with application to finite spin systems

Cited by 40 publications (53 citation statements)
References 23 publications
“…A similar duality was noted by [5] in relation to the multi-information and the binding information (the extensive counterpart to the predictive information rate) in finite sets of discrete-valued random variables.…”
Section: Process Information Measures for Gaussian Processes (supporting)
confidence: 64%
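For context, the two dual quantities the excerpt refers to can be stated for a finite set of random variables X_1, …, X_N (definitions as in the cited paper on predictive-information-based complexity; the LaTeX rendering is mine):

```latex
% Multi-information (total correlation) of X = (X_1, ..., X_N):
% how far the joint entropy falls short of the sum of marginal entropies.
I(X) = \sum_{i=1}^{N} H(X_i) \;-\; H(X_1, \dots, X_N)

% Binding information (dual total correlation): the joint entropy minus
% the entropy each variable retains given all the others, X_{\setminus i}.
B(X) = H(X_1, \dots, X_N) \;-\; \sum_{i=1}^{N} H\bigl(X_i \mid X_{\setminus i}\bigr)
```

The duality noted in the excerpt is that these two measures play complementary roles: I(X) counts shared information against the marginals, while B(X) counts it against the residual per-variable uncertainties.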
“…In previous work [5] we examined several process information measures and their interrelationships, as well as generalisations of these for arbitrary countable sets of random variables. Following the conventions established there, we let ←X_t = (…”
Section: Introduction (mentioning)
confidence: 99%
“…dit implements the vast majority of information measures defined in the literature, including entropies (Shannon (Cover and Thomas 2006), Rényi, Tsallis), multivariate mutual informations (co-information (McGill 1954; Bell 2003), total correlation (Watanabe 1960), dual total correlation (Han 1975; Te Sun 1980; Abdallah and Plumbley 2012), CAEKL mutual information (Chan et al. 2015)), common informations (Gács–Körner (Gács and Körner 1973; Tyagi, Narayan, and Gupta 2011), Wyner (Wyner 1975; W. Liu, Xu, and Chen 2010), exact (Kumar, Li, and El Gamal 2014), functional, minimal sufficient statistic), and channel capacity (Cover and Thomas 2006). It includes methods of studying joint distributions including information diagrams, connected informations (Schneidman et al. 2003; Amari 2001), marginal utility of information (Allen, Stacey, and Bar-Yam 2017), and the complexity profile (Y.…”
(mentioning)
confidence: 99%
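The dit excerpt above lists total correlation and dual total correlation among the implemented measures. As a minimal self-contained sketch of what those two quantities compute (plain Python rather than dit itself; the three-variable XOR distribution is my own toy example, not from the paper), both follow directly from entropies of a joint distribution:

```python
from math import log2

# Toy joint distribution over three binary variables (X, Y, Z) with Z = X XOR Y:
# four equally likely outcomes (illustrative example only).
joint = {(0, 0, 0): 0.25, (0, 1, 1): 0.25, (1, 0, 1): 0.25, (1, 1, 0): 0.25}

def entropy(dist):
    """Shannon entropy in bits of a dict {outcome: probability}."""
    return -sum(p * log2(p) for p in dist.values() if p > 0)

def marginal(joint, keep):
    """Marginalise the joint distribution onto the index tuple `keep`."""
    out = {}
    for outcome, p in joint.items():
        key = tuple(outcome[i] for i in keep)
        out[key] = out.get(key, 0.0) + p
    return out

n = 3
H_joint = entropy(joint)
H_singles = [entropy(marginal(joint, (i,))) for i in range(n)]
H_rests = [entropy(marginal(joint, tuple(j for j in range(n) if j != i)))
           for i in range(n)]

# Total correlation (multi-information): sum_i H(X_i) - H(X_1..X_n)
total_correlation = sum(H_singles) - H_joint
# Dual total correlation (binding information):
# H(X_1..X_n) - sum_i H(X_i | rest), with H(X_i | rest) = H(joint) - H(rest)
dual_total_correlation = H_joint - sum(H_joint - h for h in H_rests)
```

For this XOR example the total correlation is 1 bit while the dual total correlation is 2 bits: every variable is fully determined by the other two, so all of the 2-bit joint entropy is "binding".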
“…• Similarly, when marginal independence holds, we see that I_{P_{X|Y}} = 0 from (20). Otherwise stated, H_{P_{X|Y}} = H_{P_X} and H_{P_{Y|X}} = H_{P_Y}.…”
Section: The Aggregate and Split Channel Multivariate Balance Equation (mentioning)
confidence: 86%
“…An important point to realize is that the multivariate transmitted information between two different random vectors, I_{P_{XY}}, is the proper generalization of the usual mutual information MI_{P_{XY}} in the bivariate case, rather than the more complex alternatives used in multivariate sources (see Section 2.2 and [5,14]). Indeed, properties (18) and (20) are crucial in transporting the structure and intuitions built from the bivariate channel entropy triangle to the multivariate one, of which the former is a proper instance. This was not the case with balance equations and entropy triangles for stochastic sources of information [5].…”
Section: Discussion (mentioning)
confidence: 99%