2014
DOI: 10.1371/journal.pone.0102833

Efficient Transfer Entropy Analysis of Non-Stationary Neural Time Series

Abstract: Information theory allows us to investigate information processing in neural systems in terms of information transfer, storage and modification. Especially the measure of information transfer, transfer entropy, has seen a dramatic surge of interest in neuroscience. Estimating transfer entropy from two processes requires the observation of multiple realizations of these processes to estimate associated probability density functions. To obtain these necessary observations, available estimators typically assume s…
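As a rough illustration of the estimation problem described in the abstract (transfer entropy requires probability densities estimated from many realizations of the two processes), the sketch below pools observations across trials at a single time point instead of over time. Everything about it is an illustrative assumption: the function name ensemble_te, a source and target history of one sample each, and a coarse histogram (plug-in) estimator; the paper itself relies on more data-efficient nearest-neighbour estimators, so this is a sketch of the pooling idea only, not of the published method.

import numpy as np

def ensemble_te(x, y, t, n_bins=4):
    """Plug-in transfer entropy estimate from X to Y at time index t.

    x, y : arrays of shape (n_trials, n_samples), one trial per row.
    t    : time index at which the transfer is evaluated (t >= 1).

    Observations are pooled over trials at the single time point t, so the
    processes need to be comparable across trials, not stationary over time.
    """
    def discretise(v):
        # Coarse, equally spaced binning of one variable (illustrative choice).
        edges = np.linspace(v.min(), v.max(), n_bins + 1)
        return np.digitize(v, edges[1:-1])

    y_next = discretise(y[:, t])       # target at time t
    y_past = discretise(y[:, t - 1])   # target history (one sample, assumed)
    x_past = discretise(x[:, t - 1])   # source history (one sample, assumed)

    # Joint distribution over the trial ensemble and its marginals.
    joint, _ = np.histogramdd(np.column_stack([y_next, y_past, x_past]),
                              bins=(n_bins,) * 3)
    p_xyz = joint / joint.sum()
    p_yz = p_xyz.sum(axis=2)            # p(y_next, y_past)
    p_zx = p_xyz.sum(axis=0)            # p(y_past, x_past)
    p_z = p_xyz.sum(axis=(0, 2))        # p(y_past)

    # TE = sum p(y_next, y_past, x_past) * log[p(y_next|y_past,x_past) / p(y_next|y_past)]
    te = 0.0
    for i in range(n_bins):
        for j in range(n_bins):
            for k in range(n_bins):
                if p_xyz[i, j, k] > 0:
                    te += p_xyz[i, j, k] * np.log2(
                        p_xyz[i, j, k] * p_z[j] / (p_yz[i, j] * p_zx[j, k]))
    return te

Called as ensemble_te(x, y, t) for a time index of interest, this returns an estimate in bits; bias correction and surrogate-based significance testing, which any serious application needs, are omitted here.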

Cited by 114 publications (129 citation statements)
References 106 publications (178 reference statements)
“…While this eases considerably the estimation task, when the data distribution departs from Gaussianity the formulations in Equation (11) become approximate expressions of information dynamics, and the adopted estimator may miss dependence structures that originate from nonlinear dynamics. In such a case it is appropriate to resort to model-free computation methods, preferably those recently devised to tackle the difficult task of non-parametric entropy estimation in high dimensions [16,55,56] or for non-stationary data [57].…”
Section: Discussion (mentioning)
confidence: 99%
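For readers comparing the two routes mentioned in this excerpt, the quantities involved can be summarised as follows; the notation (history vectors X_{t-1}, Y_{t-1}) is generic and not tied to the citing paper's Equation (11). Transfer entropy is a conditional mutual information, and for jointly Gaussian variables it reduces to half the log-ratio of the residual variances of the restricted and full linear regressions (the result of Barnett, Barrett and Seth, 2009), which is why linear, Granger-type formulations are exact only in the Gaussian case:

TE_{X \to Y} = I\bigl(Y_t ;\, \mathbf{X}_{t-1} \mid \mathbf{Y}_{t-1}\bigr),
\qquad
TE^{\mathrm{Gauss}}_{X \to Y} = \tfrac{1}{2}\,
  \ln \frac{\operatorname{Var}\bigl(Y_t \mid \mathbf{Y}_{t-1}\bigr)}
           {\operatorname{Var}\bigl(Y_t \mid \mathbf{Y}_{t-1}, \mathbf{X}_{t-1}\bigr)} .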
“…Attempts at formally defining information modification have presented a considerable challenge, however, in contrast to the well established measures of information transfer [2][3][4][5][6] and active information storage [7][8][9]. This is because identifying the "modified" information in the output of a processing element amounts to distinguishing it from the information from any input that survives the passage through the processor in unmodified form.…”
Section: Introduction (mentioning)
confidence: 99%
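The "active information storage" referred to in this excerpt is, in its usual form, the mutual information between a process's next state and its own past state vector; the embedding length k below is a generic parameter, not a value taken from the cited works:

A_X = I\bigl(X_t ;\, \mathbf{X}^{(k)}_{t-1}\bigr)
    = \left\langle \log \frac{p\bigl(x_t \mid \mathbf{x}^{(k)}_{t-1}\bigr)}{p(x_t)} \right\rangle .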
“…Linear GC-based measures such as AR-based gPDC are time-variant and multivariate, and often also work in the presence of nonlinearity (for example see [35], comprehensive comparison of CCM and GC in the supplement of [13]). Because a time-varying investigation of TE is not trivial [12], TE was not adapted in our current study but should be investigated for these data in the future.…”
Section: Results Of Investigations Of Bivariate CCM Are Given In Figu… (mentioning)
confidence: 99%
“…The transfer entropy (TE) [10,11] is a model-free measure of information transfer and is able to generalize GC to the detection of many different types of linear and nonlinear interactions in multivariate time series [9]. Time-variant investigations by means of TE are afflicted with the problem of data length needed for estimation; for the efficiency of TE in analyzing nonstationary data see [12].…”
Section: Introduction (mentioning)
confidence: 99%
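The time-variant use of TE mentioned here, following the ensemble idea in [12], amounts to evaluating the same conditional mutual information at a fixed time point, with the average taken over repeated trials rather than over time; this schematic form (notation assumed, with r indexing the R trials) makes explicit why long recordings are traded for many repetitions:

TE_{X \to Y}(t) = I\bigl(Y_{t+1} ;\, \mathbf{X}^{(k)}_{t} \mid \mathbf{Y}^{(l)}_{t}\bigr)
\approx \frac{1}{R} \sum_{r=1}^{R}
  \log \frac{\hat p\bigl(y^{(r)}_{t+1} \mid \mathbf{y}^{(r)}_{t}, \mathbf{x}^{(r)}_{t}\bigr)}
            {\hat p\bigl(y^{(r)}_{t+1} \mid \mathbf{y}^{(r)}_{t}\bigr)} .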