2016
DOI: 10.1002/hbm.23471

A statistical framework for neuroimaging data analysis based on mutual information estimated via a Gaussian copula

Abstract: We begin by reviewing the statistical framework of information theory as applicable to neuroimaging data analysis. A major factor hindering wider adoption of this framework in neuroimaging is the difficulty of estimating information theoretic quantities in practice. We present a novel estimation technique that combines the statistical theory of copulas with the closed form solution for the entropy of Gaussian variables. This results in a general, computationally efficient, flexible, and robust multivariate sta…
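The copula estimator sketched in the abstract can be made concrete: each variable is rank-transformed to uniform margins (its empirical copula), mapped through the inverse standard-normal CDF, and the mutual information is then obtained from closed-form Gaussian entropies of the resulting covariances. Below is a minimal sketch; the helper names `copnorm` and `gcmi` mirror the paper's terminology, but this omits the analytic bias correction used in the published toolbox and is an illustration under those assumptions, not the authors' implementation.

```python
import numpy as np
from scipy.stats import norm, rankdata

def copnorm(x):
    """Gaussian-copula normalisation: map each row (variable) of x to
    standard-normal marginals via its empirical CDF (ranks)."""
    x = np.atleast_2d(x)
    n = x.shape[1]
    return norm.ppf(rankdata(x, axis=1) / (n + 1.0))

def gcmi(x, y):
    """Lower-bound estimate of I(X;Y) in bits via the Gaussian copula:
    MI is computed in closed form from the sample covariance of the
    copula-normalised data."""
    cx, cy = copnorm(x), copnorm(y)
    cov = np.cov(np.vstack((cx, cy)))
    dx = cx.shape[0]
    # I(X;Y) = 0.5 * ln( det(Cxx) * det(Cyy) / det(Cxy) ), converted to bits
    det_x = np.linalg.det(cov[:dx, :dx])
    det_y = np.linalg.det(cov[dx:, dx:])
    return 0.5 * np.log(det_x * det_y / np.linalg.det(cov)) / np.log(2)

# Usage sketch: for a bivariate Gaussian with correlation rho = 0.5 the
# true MI is -0.5 * log2(1 - rho**2), roughly 0.21 bits.
rng = np.random.default_rng(0)
x = rng.standard_normal(5000)
y = 0.5 * x + np.sqrt(0.75) * rng.standard_normal(5000)
print(gcmi(x, y))
```

Because the estimate depends on the data only through their ranks, it is invariant to any monotonic transformation applied to each variable separately.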

Cited by 269 publications (328 citation statements)
References 164 publications
“…This subtraction removes terms – including the statistical biases described above – that cannot possibly carry speech information (because they are computed at fixed speech envelope). This results in an estimate that is more robust and more directly related to changes in the sensory input than classical transfer entropy (the same measure was termed directed feature information in [Ince et al, 2017, Ince et al, 2015]). DI was defined here as $\Delta\mathrm{DI}(\tau_{\mathrm{Brain}}, \tau_{\mathrm{Speech}}) = \mathrm{DI}(\tau_{\mathrm{Brain}}) - \mathrm{DI}(\tau_{\mathrm{Brain}})\,|\,\mathrm{Speech}(\tau_{\mathrm{Speech}})$…”
Section: Methods
confidence: 99%
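To make the subtraction concrete, the sketch below assembles a transfer-entropy-style conditional-MI term and removes the part that survives additional conditioning on the speech envelope, reusing the `gcmi` helper from the earlier sketch. The names `brain_a`, `brain_b`, `speech` and the lag handling are hypothetical; this is a sketch of the idea, not the cited paper's exact pipeline.

```python
import numpy as np

def gccmi(x, y, z):
    """Conditional MI I(X;Y|Z) in bits via the chain rule
    I(X;Y|Z) = I(X;(Y,Z)) - I(X;Z), reusing gcmi() from above."""
    yz = np.vstack((np.atleast_2d(y), np.atleast_2d(z)))
    return gcmi(x, yz) - gcmi(x, z)

def delta_di(brain_a, brain_b, speech, tau_brain, tau_speech):
    """Directed-information-style estimate from region A to region B, minus
    the same quantity conditioned on the lagged speech envelope; terms that
    are constant at fixed speech envelope cancel in the subtraction."""
    valid = slice(max(tau_brain, tau_speech), None)  # drop wrapped samples
    b_now  = brain_b[valid]
    b_past = np.roll(brain_b, tau_brain)[valid]
    a_past = np.roll(brain_a, tau_brain)[valid]
    s_past = np.roll(speech,  tau_speech)[valid]
    di         = gccmi(b_now, a_past, b_past)                        # DI(tau_Brain)
    di_given_s = gccmi(b_now, a_past, np.vstack((b_past, s_past)))  # DI | Speech
    return di - di_given_s
```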
“…We computed the single-channel, joint, and conditional entropies of the EEG signals via explicit analytic expressions for the entropies based on the assumption that the EEG amplitudes realize continuous univariate and multivariate Gaussian processes with variances $\sigma_{ii}^2$ and covariance matrix $K$ (Norwich, 1993; Tononi et al, 1994; van Putten and Stam, 2001; Ince et al, 2017):…”
Section: Methods
confidence: 99%
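The closed-form expressions referred to here are the standard differential entropies of Gaussian variables. In the quote's notation, for a $d$-dimensional Gaussian with covariance matrix $K$ and marginal variances $\sigma_{ii}^2$ (entropies in nats):

$$h(X) = \tfrac{1}{2}\ln\!\left[(2\pi e)^{d}\det K\right], \qquad h(X_i) = \tfrac{1}{2}\ln\!\left(2\pi e\,\sigma_{ii}^{2}\right).$$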
“…We recently developed a lower-bound approximate estimator of mutual information for continuous signals based on a Gaussian copula [3]. The Gaussian $I_{\mathrm{ccs}}$ measure therefore allows this approach to be used to obtain PIDs from experimental data.…”
Section: Continuous Gaussian Variables
confidence: 99%
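For context, the partial information decomposition (PID) referred to here splits the joint mutual information that two sources carry about a target into redundant, unique, and synergistic terms (Williams and Beer, 2010), with $I_{\mathrm{ccs}}$ serving as one candidate measure of the redundancy term:

$$I(S; X_1, X_2) = \mathrm{Red}(S; X_1, X_2) + \mathrm{Unq}(S; X_1\!\setminus\! X_2) + \mathrm{Unq}(S; X_2\!\setminus\! X_1) + \mathrm{Syn}(S; X_1, X_2).$$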
“…However, it also provides a comprehensive statistical framework for practical data analysis [3]. For example, mutual information is closely related to the log-likelihood ratio test of independence [4].…”
Section: Introduction
confidence: 99%
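The connection mentioned is the classical identity between the likelihood-ratio ($G$) test of independence and the plug-in mutual information estimate. For discrete variables with observed cell counts $O_{xy}$, independence-expected counts $E_{xy}$, and $N$ samples,

$$G = 2\sum_{x,y} O_{xy}\,\ln\frac{O_{xy}}{E_{xy}} = 2N\,\hat I(X;Y)\ \text{(in nats)},$$

so an estimate of $\hat I$ bits corresponds to $G = (2\ln 2)\,N\,\hat I$, which is asymptotically $\chi^{2}$-distributed under the null hypothesis of independence.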