2016
DOI: 10.1073/pnas.1603583113
Unified framework for information integration based on information geometry

Abstract: Assessment of causal influences is a ubiquitous and important subject across diverse research fields. Drawn from consciousness studies, integrated information is a measure that defines integration as the degree of causal influences among elements. Whereas pairwise causal influences between elements can be quantified with existing methods, quantifying multiple influences among many elements poses two major mathematical difficulties. First, overestimation occurs due to interdependence among influences if each in…
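To make the first difficulty concrete, here is a minimal numerical sketch, not the paper's estimator: the coupled AR(1) system, its coupling matrix, and the Gaussian mutual-information helper are all illustrative assumptions. In a coupled Gaussian system, separately quantified pairwise influences count shared information repeatedly, so their sum exceeds the joint past-to-present mutual information.

```python
# Illustrative sketch (assumed toy model, not the paper's measure):
# in a coupled 2-node Gaussian AR(1) system, summing the four pairwise
# past-to-present mutual informations overestimates the joint influence.
import numpy as np

rng = np.random.default_rng(0)
A = np.array([[0.6, 0.3],
              [0.3, 0.6]])          # coupled two-element dynamics
T = 50_000
x = np.zeros((T, 2))
for t in range(T - 1):
    x[t + 1] = A @ x[t] + rng.standard_normal(2)

past, present = x[:-1], x[1:]

def gauss_mi(a, b):
    # Gaussian mutual information I(a; b) from sample covariances (nats).
    sa = np.linalg.slogdet(np.atleast_2d(np.cov(a, rowvar=False)))[1]
    sb = np.linalg.slogdet(np.atleast_2d(np.cov(b, rowvar=False)))[1]
    sj = np.linalg.slogdet(np.cov(np.hstack([a, b]), rowvar=False))[1]
    return 0.5 * (sa + sb - sj)

joint = gauss_mi(past, present)
pairwise = sum(gauss_mi(past[:, [i]], present[:, [j]])
               for i in range(2) for j in range(2))
print(f"joint I(X_t; X_t+1)        = {joint:.3f} nats")
print(f"sum of pairwise influences = {pairwise:.3f} nats (overestimates)")
```

For this coupling matrix the pairwise sum comes out roughly twice the joint value, because both elements share a strongly autocorrelated common mode whose information is counted in every pair.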

Cited by 123 publications (235 citation statements) · References 46 publications
“…Entropy quantifies only the instantaneous uncertainty of the neural states, understandable as equal-time interactions (Oizumi et al., 2016b; Fig. 2A).…”
Section: Discussion (mentioning)
confidence: 99%
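A small sketch of the point made in this excerpt, under assumed toy settings (the AR(1) coefficient and the Gaussian estimators are illustrative): differential entropy depends only on the equal-time distribution, so shuffling the time axis leaves it unchanged, while a time-lagged mutual information collapses.

```python
# Assumed illustration: entropy is blind to across-time structure,
# lagged mutual information is not. Shuffling time preserves the former
# and destroys the latter.
import numpy as np

rng = np.random.default_rng(1)
T, a = 50_000, 0.9
x = np.zeros(T)
for t in range(T - 1):
    x[t + 1] = a * x[t] + rng.standard_normal()

def gauss_entropy(v):
    # Differential entropy of a Gaussian fit to v (nats).
    return 0.5 * np.log(2 * np.pi * np.e * np.var(v))

def lagged_mi(v):
    # Gaussian mutual information between consecutive samples (nats).
    r = np.corrcoef(v[:-1], v[1:])[0, 1]
    return -0.5 * np.log(1 - r**2)

shuffled = rng.permutation(x)
print(f"entropy   original {gauss_entropy(x):.3f}  shuffled {gauss_entropy(shuffled):.3f}")
print(f"lagged MI original {lagged_mi(x):.3f}  shuffled {lagged_mi(shuffled):.3f}")
```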
“…Applying these IIT concepts as they are to a real human brain, which is composed of 10^11 neurons and 10^14 synaptic connections, is currently impossible for practical purposes. Thus, we need some gross approximations for these concepts when we empirically test explanations and predictions from IIT (Barrett & Seth, 2011; Chang et al., 2012; Lee, Mashour, Kim, Noh, & Choi, 2009; Oizumi, Amari, et al., 2016a; Oizumi, Tsuchiya, & Amari, 2016b; Tegmark, 2016). With approximations, our research group has computed patterns of integrated information from real neural activities recorded in awake human patients while they reported what they saw in each trial in several tasks (Haun et al., ; Figure ).…”
Section: A Framework for Empirical Testing of IIT Towards Understanding… (mentioning)
confidence: 99%
“…Applying these IIT concepts as they are to a real human brain, which is composed of 10^11 neurons and 10^14 synaptic connections, is currently impossible for practical purposes. Thus, we need some gross approximations for these concepts when we empirically test explanations and predictions from IIT (Barrett & Seth, 2011; Chang et al., 2012; Lee, Mashour, Kim, Noh, & Choi, 2009; Oizumi, Amari, et al., 2016a; Oizumi, Tsuchiya, & Amari, 2016b; Tegmark, 2016).…”
Section: Computing Integrated Information Patterns From Neural Activity (mentioning)
confidence: 99%
“…The KL divergence is also called relative entropy in information theory. It is a good measure of difference, with the desired mathematical properties [31]. Let q approximate p in its neighborhood as q(y, θ) = p(y, θ + ∆θ); the Taylor expansion then gives an approximation of the KL divergence by…”
Section: Divergence and Distance (mentioning)
confidence: 99%
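The expansion truncated in this excerpt is the standard second-order result KL(p_θ ‖ p_{θ+∆θ}) ≈ ½ ∆θᵀ G(θ) ∆θ, where G(θ) is the Fisher information metric. A quick numerical check, using an assumed one-parameter Gaussian family N(0, θ) (variance parameter) for which both the exact KL and G(θ) = 1/(2θ²) are known in closed form:

```python
# Numerical check of KL(p_θ || p_{θ+Δθ}) ≈ (1/2) Δθᵀ G(θ) Δθ
# for the assumed family N(0, θ), where G(θ) = 1 / (2 θ²).
import numpy as np

def kl_gauss_var(t1, t2):
    # Exact KL( N(0, t1) || N(0, t2) ) for variances t1, t2 (nats).
    return 0.5 * (t1 / t2 - 1.0 - np.log(t1 / t2))

theta = 2.0
fisher = 1.0 / (2.0 * theta**2)

for d in (0.5, 0.1, 0.01):
    exact = kl_gauss_var(theta, theta + d)
    quad = 0.5 * fisher * d**2
    print(f"Δθ={d:5.2f}  exact KL={exact:.6f}  ½Δθ²G={quad:.6f}")
```

The quadratic form matches the exact divergence ever more closely as ∆θ shrinks, which is exactly why the Fisher metric serves as the local distance measure in information geometry.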
“…It is evident that the effect of nonlinear measurement is reflected by the first terms of the Fisher metric tensor. Equation (31) establishes the relationship between the nonlinear measurement and the metric tensor.…”
Section: Natural Gradient Descent Filtering (mentioning)
confidence: 99%
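For context on the quoted passage, here is a minimal natural-gradient sketch on an assumed toy problem, not the cited paper's filter: the ordinary gradient is preconditioned by the inverse Fisher metric, θ ← θ − η G(θ)⁻¹ ∇L(θ), using the same one-parameter Gaussian family N(0, θ) as above.

```python
# Assumed toy example of natural gradient descent: recover a target
# variance by descending KL( N(0, target) || N(0, θ) ), preconditioning
# the gradient with the inverse Fisher metric G(θ) = 1 / (2 θ²).
import numpy as np

target = 4.0   # variance to recover

def loss_grad(theta):
    # d/dθ KL( N(0, target) || N(0, θ) ) = (θ - target) / (2 θ²)
    return (theta - target) / (2.0 * theta**2)

theta, eta = 0.5, 0.5
for _ in range(20):
    fisher = 1.0 / (2.0 * theta**2)
    theta -= eta * loss_grad(theta) / fisher   # natural-gradient step

print(f"recovered variance ≈ {theta:.4f} (target {target})")
```

Note that the Fisher preconditioning cancels the θ-dependent curvature of the loss, so the update reduces to θ ← θ − η(θ − target) and converges geometrically regardless of how badly scaled the raw gradient is; this scale invariance is the usual motivation for natural-gradient methods.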