2020
DOI: 10.3390/e22020216

Generalised Measures of Multivariate Information Content

Abstract: The entropy of a pair of random variables is commonly depicted using a Venn diagram. This representation is potentially misleading, however, since the multivariate mutual information can be negative. This paper presents new measures of multivariate information content that can be accurately depicted using Venn diagrams for any number of random variables. These measures complement the existing measures of multivariate mutual information and are constructed by considering the algebraic structure of information s…
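The abstract's claim that multivariate mutual information can be negative has a standard worked illustration: the XOR distribution, where two independent fair bits X and Y determine Z = X XOR Y. A minimal sketch (the variable names and helper function are illustrative, not from the paper) computes the three-way mutual information via the inclusion-exclusion formula I(X;Y;Z) = H(X)+H(Y)+H(Z) − H(X,Y) − H(X,Z) − H(Y,Z) + H(X,Y,Z):

```python
from itertools import product
from math import log2

# Joint distribution of (X, Y, Z) where X, Y are independent fair
# bits and Z = X XOR Y; each of the four outcomes has probability 1/4.
p = {(x, y, x ^ y): 0.25 for x, y in product((0, 1), repeat=2)}

def H(*axes):
    """Entropy in bits of the marginal distribution over the given axes."""
    marg = {}
    for outcome, pr in p.items():
        key = tuple(outcome[i] for i in axes)
        marg[key] = marg.get(key, 0.0) + pr
    return -sum(pr * log2(pr) for pr in marg.values() if pr > 0)

# Inclusion-exclusion form of the three-way mutual information:
# I(X;Y;Z) = H(X)+H(Y)+H(Z) - H(X,Y) - H(X,Z) - H(Y,Z) + H(X,Y,Z)
i_xyz = (H(0) + H(1) + H(2)
         - H(0, 1) - H(0, 2) - H(1, 2)
         + H(0, 1, 2))
print(i_xyz)  # -1.0: the central Venn-diagram region is negative
```

Here each pairwise marginal is independent (I(X;Y) = 0) yet any two variables jointly determine the third, so the "centre" region that a naive Venn diagram would shade comes out as −1 bit; this is exactly the pathology the paper's measures are designed to avoid.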


Cited by 21 publications (29 citation statements)
References 77 publications
“…Given the tight relation between information dynamics and the VAR representation of Gaussian stochastic processes, future works can be envisaged to introduce ANNs for the estimation of measures of information dynamics different than the GC ( Faes et al, 2017b ; Finn & Lizier, 2020 ), computed even across multiple time scales ( Faes, Marinazzo & Stramaglia, 2017 ; Martins et al, 2020 ). Moreover, this new method will easily find application even in different contexts, such as the study of dynamic information flow between stock market indices ( Scagliarini et al, 2020 ), between different brain regions with Granger-based estimators ( Astolfi et al, 2007 ), for time series analysis in climatology ( Faes et al, 2017a ), or for the study of gene regulatory networks ( Davidson & Levin, 2005 ).…”
Section: Conclusion and Limitations (mentioning)
confidence: 99%
“…net increases in redundancy from decreases in synergy. Future studies could perform more detailed analyses by employing partial information decomposition (PID) measures [74][75][76][77][78][79][80][81].…”
Section: Limitations and Future Work (mentioning)
confidence: 99%
“…Note that the former measures do not set aside a special variable Y , meaning the information they quantify is conceptually different from the PID. This dichotomy is ever-present in the information decomposition literature, as illustrated by work on entropy decompositions [ 19 , 20 , 21 ] that do not specify Y . For this reason, we refer to the PID as a directed measure of MMI (as it is only defined with respect to a target variable), and the former approaches as undirected measures.…”
Section: MMI and Information Decomposition (mentioning)
confidence: 99%