2003
DOI: 10.1016/s1053-8119(03)00306-9

Shannon entropy applied to the analysis of event-related fMRI time series

Cited by 75 publications (55 citation statements) · References 22 publications
“…Therefore, fMRI has motivated a number of researchers to perform studies on brain recovery after stroke, most of which are based on the blood oxygenation level dependent (BOLD) mechanism, which relies on changes in the oxidative state of hemoglobin. To perform BOLD-fMRI one needs to assume that the signal is stable throughout data acquisition 1 . However, many confounders may disturb such an assumption, such as disrupted neurovascular coupling in patients with cerebrovascular diseases 2 .…”
mentioning
confidence: 99%
“…During that epoch the time course s(t) of the (simulated) BOLD signal of an active voxel exhibits a peak and then returns to a baseline level (Fig. 1). As in the calculation of Shannon entropy in a previous work [7], an entire epoch (window W) of the experiment is divided into two time intervals: half windows W1 (related to the signal increase) and W2 (corresponding mainly to baseline values). The probability distribution of signal levels (p1) corresponding to the BOLD response in the first time period is clearly expected to differ from the one (p2) corresponding to the baseline signal values, so the relative entropy within this epoch is calculated as a measure of the "distance" between p1 and p2.…”
Section: Relative Entropy
mentioning
confidence: 99%
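The half-window scheme described in this excerpt can be sketched in Python. This is a minimal illustration, not the paper's implementation: the simulated epoch, epoch length, histogram bin count, and add-one smoothing are all assumptions made here for demonstration.

```python
import numpy as np

def relative_entropy(counts_p, counts_q):
    """KL divergence D(p || q) in bits, with add-one (Laplace) smoothing
    so that empty histogram bins do not produce log(0)."""
    p = np.asarray(counts_p, float) + 1.0
    q = np.asarray(counts_q, float) + 1.0
    p /= p.sum()
    q /= q.sum()
    return float(np.sum(p * np.log2(p / q)))

def half_window_divergence(epoch, n_bins=8):
    """Split an epoch into half windows W1 (response) and W2 (baseline),
    histogram the signal levels of each on common bin edges, and return
    the relative entropy between the two estimated distributions."""
    half = len(epoch) // 2
    edges = np.histogram_bin_edges(epoch, bins=n_bins)  # shared support
    c1, _ = np.histogram(epoch[:half], bins=edges)
    c2, _ = np.histogram(epoch[half:], bins=edges)
    return relative_entropy(c1, c2)

# Toy epoch (assumed shape): a BOLD-like peak in W1, baseline noise in W2
rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 200)
active = half_window_divergence(np.exp(-((t - 0.2) ** 2) / 0.01)
                                + 0.05 * rng.standard_normal(t.size))
baseline_only = half_window_divergence(0.05 * rng.standard_normal(t.size))
```

An epoch containing a response in W1 is expected to yield a larger divergence than one containing only baseline noise, which is the contrast the measure exploits.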
“…On the other hand, statistical methods can infer how significant the difference is between the signals corresponding to periods of stimulation and non-stimulation, but do not need a priori knowledge of the form of the HRF. In recent years, methods based on information measures, such as the Shannon and Tsallis entropies and the generalized mutual information, have been employed as alternatives to the conventional analysis of fMRI data [5][6][7][8][9].…”
Section: Introduction
mentioning
confidence: 99%
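For reference, the two entropies named here have simple closed forms; a minimal sketch (the entropic index q = 1.5 and the example distributions are arbitrary illustrative choices):

```python
import numpy as np

def shannon_entropy(p):
    """H = -sum p * log2(p), with 0 * log 0 taken as 0."""
    p = np.asarray(p, float)
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

def tsallis_entropy(p, q):
    """S_q = (1 - sum p**q) / (q - 1); recovers the Shannon form as q -> 1."""
    p = np.asarray(p, float)
    p = p[p > 0]
    return float((1.0 - np.sum(p ** q)) / (q - 1.0))

uniform = np.ones(4) / 4                      # maximally uncertain over 4 levels
peaked = np.array([0.97, 0.01, 0.01, 0.01])   # nearly deterministic

shannon_entropy(uniform)        # → 2.0 bits
tsallis_entropy(uniform, 1.5)   # → 1.0
```

Both measures are maximal for the uniform distribution and shrink as the distribution concentrates, which is what makes them usable as activation statistics.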
“…Some examples include the study of single cell information coding in the fly [de Ruyter van Steveninck et al, 1997] and primate visual systems [Victor, 2000;Reich et al, 2001;Simoncelli and Olshausen, 2001;Kang et al, 2004], in the primary motor cortex [Paz et al, 2003], neural and population coding in general [Borst and Theunissen, 1999;Panzeri et al, 2003;Schneidman et al, 2003], as well as the study of cortical synaptic communication [Fuhrmann et al, 2002;Goldman et al, 2002]. However, to our knowledge, except for a few applications using ICA [McKeown et al, 1998;Arfanakis et al, 2000;Calhoun et al, 2000;Moritz et al, 2000] for cluster analysis, only one study has attempted to apply information theory to the analysis of fMRI time series, by breaking the event-related responses into two epochs and computing the entropy of each half [de Araujo et al, 2003]. This analysis does not assume a constant shape of an HRF, but nevertheless assumes a general structure of the signal, with a maximal change in the first half of the event-related response.…”
Section: Introduction
mentioning
confidence: 99%
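The two-epoch idea attributed here to [de Araujo et al, 2003] — compute the entropy of each half of the event-related response separately — can be illustrated roughly as follows; the toy epoch, its length, and the bin count are assumptions for demonstration, not the study's parameters:

```python
import numpy as np

def shannon_entropy_bits(samples, n_bins=8):
    """Shannon entropy (bits) of a histogram-estimated signal-level distribution."""
    counts, _ = np.histogram(samples, bins=n_bins)
    p = counts / counts.sum()
    p = p[p > 0]                      # 0 * log 0 taken as 0
    return float(-np.sum(p * np.log2(p)))

def two_epoch_entropies(epoch):
    """Split an event-related response into two halves; return each half's entropy."""
    half = len(epoch) // 2
    return shannon_entropy_bits(epoch[:half]), shannon_entropy_bits(epoch[half:])

# Toy noiseless epoch (assumed shape): response peak in the first half,
# near-constant baseline in the second half
t = np.linspace(0.0, 1.0, 40)
epoch = np.exp(-((t - 0.2) ** 2) / 0.01)
h1, h2 = two_epoch_entropies(epoch)
```

The peak spreads the first half's signal levels across many histogram bins, so h1 exceeds h2; this asymmetry is the "general structure of the signal" assumption the excerpt points out.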