2012
DOI: 10.1103/PhysRevE.86.066211

Expanding the transfer entropy to identify information circuits in complex systems

Abstract: We propose a formal expansion of the transfer entropy that highlights irreducible sets of variables providing information about the future state of each assigned target. Multiplets characterized by a large contribution to the expansion are associated with the informational circuits present in the system, and the sign of the contribution indicates their informational character. To limit the computational complexity, we adopt the assumption of Gaussianity and use the corresponding exact form…
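Under the Gaussian assumption adopted in the paper, the transfer entropy reduces to half the log-ratio of residual variances from two linear regressions (i.e., half the linear Granger causality). The following is a minimal sketch of that estimator, not the authors' implementation; the function names and the order-1 embedding are illustrative:

```python
import numpy as np

def resid_var(target, design):
    """Residual variance of an OLS regression of target on design."""
    X = np.column_stack([np.ones(len(target)), design])
    beta, *_ = np.linalg.lstsq(X, target, rcond=None)
    return np.var(target - X @ beta)

def gaussian_te(x, y):
    """Transfer entropy x -> y (nats) for jointly Gaussian series,
    using an order-1 embedding of the past."""
    yt, yp, xp = y[1:], y[:-1], x[:-1]
    return 0.5 * np.log(resid_var(yt, yp[:, None])
                        / resid_var(yt, np.column_stack([yp, xp])))

# Toy check: x drives y, so TE(x -> y) should clearly exceed TE(y -> x).
rng = np.random.default_rng(0)
x = rng.standard_normal(5000)
y = np.zeros_like(x)
for t in range(1, len(x)):
    y[t] = 0.5 * y[t - 1] + 0.4 * x[t - 1] + 0.1 * rng.standard_normal()
print(gaussian_te(x, y), gaussian_te(y, x))
```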

Cited by 74 publications (98 citation statements). References 24 publications.
“…Given their high specificity, their efficient implementation via traditional multivariate regression analysis, and their demonstrated link with neural autonomic regulation, the proposed quantities are suitable candidates for large-scale applications to clinical databases recorded under uncontrolled conditions. Future studies should be directed to extending the decompositions to model-free frameworks that assess the role of nonlinear physiological dynamics in information storage, transfer and modification [5,31], to exploring novel partial decomposition approaches that separate synergistic and redundant information rather than providing their net balance [3,16,17], and to exploring scenarios with more than two source processes [15]. Practical extensions should be devoted to evaluating the importance of these measures for the assessment of cardiovascular and cardiorespiratory interactions in diseased conditions.…”
Section: Discussion
confidence: 99%
“…These tools formalize different information-theoretic concepts applied to a "target" system in the observed dynamical network: the predictive information about the system describes the amount of information shared between its present state and the past history of the whole observed network [4,5]; the information storage indicates the information shared between the present and past states of the target [6,7]; the information transfer defines the information that a group of systems designated as "sources" provides about the present state of the target [8,9]; and the information modification reflects the redundant or synergistic interaction between multiple sources sending information to the target [3,10]. Operational definitions of these concepts have been proposed in recent years, which allow one to quantify predictive information through measures of prediction entropy or full predictability [11,12], information storage through the self-entropy or self-predictability [11,13], information transfer through transfer entropy or Granger causality [14], and information modification through entropy and prediction measures of net redundancy/synergy [11,15] or separate measures derived from partial information decomposition [16,17]. All these measures have been successfully applied in diverse fields of science ranging from cybernetics to econometrics, climatology, neuroscience and others [6,7,18–28].…”
Section: Introduction
confidence: 99%
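For jointly Gaussian processes, the storage and transfer measures listed in the excerpt above reduce to log-ratios of variances from linear regressions, and the predictive information splits exactly into storage plus transfer. A hedged sketch under those assumptions (the order-1 embedding and the helper names are illustrative):

```python
import numpy as np

def lin_resid_var(target, *regressors):
    """Residual variance of target after OLS on the given regressors."""
    X = np.column_stack([np.ones(len(target))] + list(regressors))
    beta, *_ = np.linalg.lstsq(X, target, rcond=None)
    return np.var(target - X @ beta)

def storage_and_transfer(x, y):
    """Information storage of y and transfer x -> y (nats), Gaussian case."""
    yt, yp, xp = y[1:], y[:-1], x[:-1]          # order-1 embedding
    storage = 0.5 * np.log(np.var(yt) / lin_resid_var(yt, yp))
    predictive = 0.5 * np.log(np.var(yt) / lin_resid_var(yt, yp, xp))
    return storage, predictive - storage        # predictive = storage + transfer
```

The identity predictive = storage + transfer in the last line mirrors the decomposition of predictive information described in the quoted passage.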
“…We also note that there are earlier applications of the concept of synergy (meant as synergistic mutual information) to neural data (e.g., [46–49]) that relied on the computation of interaction information. However, when interpreting these studies, it should be kept in mind that they report the difference between shared information and synergistic information, as detailed by Williams and Beer [10].…”
Section: Previous Studies Of Information Modification In Neural Data
confidence: 99%
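The shared-versus-synergistic caveat above can be made concrete with two toy systems: a XOR target of two independent bits (purely synergistic) and a target copied from two fully redundant sources. A small sketch under one common sign convention (positive for net redundancy, negative for net synergy); the helper names are ours:

```python
import numpy as np

def mi(p):
    """Mutual information (bits) of a 2-D joint probability table."""
    px, py = p.sum(1, keepdims=True), p.sum(0, keepdims=True)
    nz = p > 0
    return float((p[nz] * np.log2(p[nz] / (px * py)[nz])).sum())

def net_balance(triples):
    """triples: (prob, s1, s2, t) with binary s1, s2, t.
    Returns I(S1;T) + I(S2;T) - I(S1,S2;T):
    > 0 means net redundancy, < 0 means net synergy."""
    p1, p2, p12 = np.zeros((2, 2)), np.zeros((2, 2)), np.zeros((4, 2))
    for pr, s1, s2, t in triples:
        p1[s1, t] += pr
        p2[s2, t] += pr
        p12[2 * s1 + s2, t] += pr
    return mi(p1) + mi(p2) - mi(p12)

xor = [(0.25, a, b, a ^ b) for a in (0, 1) for b in (0, 1)]
copy = [(0.5, a, a, a) for a in (0, 1)]      # fully redundant sources
print(net_balance(xor))    # -1.0 -> purely synergistic
print(net_balance(copy))   # +1.0 -> purely redundant
```

As the excerpt notes, such a net balance conflates shared and synergistic contributions; partial information decomposition separates them instead.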
“…Finally, we have introduced a pairwise index of synergy, which for each pair of variables measures how much they interact to provide better predictions of the target. Such an index can be seen as the second cumulant in the expansion of the prediction error of the target variable, to be compared with the expansion of the transfer entropy in [21], which provides the interaction information as the second cumulant. The advantages of the present cumulant expansion are that (i) the conceptual problems found in the Gaussian case [24] are avoided, and (ii) the nonlinearity of the PSI can be easily controlled by varying the kernel in the regression model.…”
Section: Discussion
confidence: 99%
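To illustrate the idea of a kernel-controlled pairwise synergy index, one could compare the cross-validated prediction error of the target from each variable alone against the error from the pair. This is only a sketch of an index of that flavor, not the exact PSI estimator of the cited work; the scikit-learn-based helper and its parameters are illustrative:

```python
import numpy as np
from sklearn.kernel_ridge import KernelRidge
from sklearn.model_selection import cross_val_score

def pred_error(X, y, kernel="rbf"):
    """Cross-validated mean squared prediction error of y from X."""
    model = KernelRidge(alpha=1e-2, kernel=kernel)
    return -cross_val_score(model, X, y, cv=5,
                            scoring="neg_mean_squared_error").mean()

def pairwise_synergy(x1, x2, y, kernel="rbf"):
    """Positive when the pair predicts y better than the best single variable.
    The kernel choice controls the degree of nonlinearity admitted."""
    e1 = pred_error(x1[:, None], y, kernel)
    e2 = pred_error(x2[:, None], y, kernel)
    e12 = pred_error(np.column_stack([x1, x2]), y, kernel)
    return min(e1, e2) - e12

# Toy check: a multiplicative target needs both variables jointly.
rng = np.random.default_rng(0)
x1, x2 = rng.standard_normal((2, 400))
y = x1 * x2 + 0.1 * rng.standard_normal(400)
print(pairwise_synergy(x1, x2, y))   # expected to be clearly positive
```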
“…UNNORMALIZED GRANGER CAUSALITY. Interaction information is a classical measure of the amount of information (redundancy or synergy) bound up in a set of three variables [19], [20]. A generalization of the interaction information to the case of lagged interactions was addressed in [21]. It is important to emphasize that the sign of the interaction information corresponds to synergy or redundancy, and that this interpretation implies that synergy and redundancy are taken to be mutually exclusive qualities of the interactions between variables [22].…”
Section: Granger Causality
confidence: 99%
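For jointly Gaussian variables, a lagged interaction information of the kind discussed in the excerpt above can be estimated directly from covariance determinants. A minimal sketch, assuming order-1 lags and scalar series; the sign convention follows the quoted passage (positive for redundancy, negative for synergy), and the names are illustrative:

```python
import numpy as np

def gauss_mi(a, b):
    """I(A;B) in nats for jointly Gaussian vectors, via covariance dets."""
    ab = np.column_stack([a, b])
    ca, cb, cab = np.cov(a.T), np.cov(b.T), np.cov(ab.T)
    det = lambda c: float(c) if np.ndim(c) == 0 else np.linalg.det(c)
    return 0.5 * np.log(det(ca) * det(cb) / det(cab))

def lagged_interaction_info(x, y, z):
    """I(x_past; z_t) + I(y_past; z_t) - I(x_past, y_past; z_t):
    > 0 suggests the lagged sources are redundant about the target,
    < 0 suggests they are synergistic."""
    zt, xp, yp = z[1:], x[:-1], y[:-1]           # order-1 lags
    return (gauss_mi(xp, zt) + gauss_mi(yp, zt)
            - gauss_mi(np.column_stack([xp, yp]), zt))
```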