2021
DOI: 10.3390/e23010079

Discovering Higher-Order Interactions Through Neural Information Decomposition

Abstract: If regularity in data takes the form of higher-order functions among groups of variables, models which are biased towards lower-order functions may easily mistake the data for noise. To distinguish whether this is the case, one must be able to quantify the contribution of different orders of dependence to the total information. Recent work in information theory attempts to do this through measures of multivariate mutual information (MMI) and information decomposition (ID). Despite substantial theoretical progr…
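The failure mode the abstract describes can be made concrete with a toy case. The sketch below is our illustration, not the paper's neural estimator: for the XOR triple, with X and Y independent fair bits and Z = XOR(X, Y), every pairwise mutual information is zero, so a model restricted to pairwise (lower-order) statistics reports no structure at all, while the third-order dependence carries a full bit.

# Minimal illustration (not the paper's method): a purely third-order
# dependency that pairwise statistics cannot detect.
import itertools
import numpy as np

def entropy(p):
    """Shannon entropy in bits of a probability array (zero cells ignored)."""
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

def mutual_information(pxy):
    """I(X;Y) in bits from a joint distribution given as a 2-D array."""
    return entropy(pxy.sum(axis=1)) + entropy(pxy.sum(axis=0)) - entropy(pxy)

# Joint distribution p(x, y, z) with X, Y independent fair bits and Z = X XOR Y.
p = np.zeros((2, 2, 2))
for x, y in itertools.product(range(2), repeat=2):
    p[x, y, x ^ y] = 0.25

print(mutual_information(p.sum(axis=2)))    # I(X;Y)     = 0.0
print(mutual_information(p.sum(axis=1)))    # I(X;Z)     = 0.0
print(mutual_information(p.sum(axis=0)))    # I(Y;Z)     = 0.0
print(mutual_information(p.reshape(4, 2)))  # I((X,Y);Z) = 1.0 bit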

Cited by 2 publications (1 citation statement)
References 29 publications
“…Recently, using information-theoretic quantities such as mutual information and transfer entropy has become more attractive for analyzing neuroimaging data [25]. One application of these quantities is to find interactions between neurons [26,27] or brain regions [28]. For instance, a recent paper has proposed using a new measure, interaction information, to estimate functional connectivity (FC) from mutual information.…”
Section: Introduction (citation type: mentioning; confidence: 99%)
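For context, the "interaction information" mentioned in the statement above has a standard entropy-based definition; under the synergy-positive sign convention it is II(X;Y;Z) = I(X;Y|Z) - I(X;Y). A minimal sketch, reusing the entropy helper and the XOR distribution p from the example above (again an illustration, not the cited paper's implementation):

def interaction_information(pxyz):
    """II(X;Y;Z) = I(X;Y|Z) - I(X;Y), in bits, from a 3-D joint array p(x, y, z)."""
    hx, hy, hz = (entropy(pxyz.sum(axis=ax)) for ax in [(1, 2), (0, 2), (0, 1)])
    hxy, hxz, hyz = (entropy(pxyz.sum(axis=ax)) for ax in [2, 1, 0])
    # Entropy form: H(XY) + H(XZ) + H(YZ) - H(X) - H(Y) - H(Z) - H(XYZ)
    return hxy + hxz + hyz - hx - hy - hz - entropy(pxyz)

print(interaction_information(p))  # 1.0 bit for the XOR triple: pure synergy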