2008
DOI: 10.1590/s0103-97332008000100005
A numerical study of the Kullback-Leibler distance in functional magnetic resonance imaging

Abstract: The Kullback-Leibler distance (or relative entropy) is applied in the analysis of functional magnetic resonance imaging (fMRI) data series. Our study is designed for event-related (ER) experiments, in which a brief stimulus is presented, followed by a long period of rest. In particular, this relative entropy is used as a measure of the "distance" between the probability distributions p1 and p2 of the signal levels related to stimulus and non-stimulus. In order to avoid undesirable divergences of the Kullback-Leibler…
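As a rough illustration of the quantity described in the abstract, the sketch below (not the authors' implementation; the toy data, the number of signal levels L, and the binning are assumptions) computes the Kullback-Leibler distance between the signal-level distributions p1 (stimulus) and p2 (non-stimulus), and shows where the divergence the paper addresses comes from.

```python
import numpy as np

def kullback_leibler(p1, p2):
    """D(p1||p2) = sum_j p1_j * ln(p1_j / p2_j)."""
    p1 = np.asarray(p1, dtype=float)
    p2 = np.asarray(p2, dtype=float)
    if np.any((p2 == 0) & (p1 > 0)):
        return np.inf                      # the divergence the paper seeks to avoid
    mask = p1 > 0                          # terms with p1_j = 0 contribute zero
    return float(np.sum(p1[mask] * np.log(p1[mask] / p2[mask])))

# Toy "stimulus" and "rest" signal values, binned into L levels (illustrative only).
rng = np.random.default_rng(0)
signal_stim = rng.normal(1.0, 0.5, 200)
signal_rest = rng.normal(0.0, 0.5, 200)
L = 8
edges = np.histogram_bin_edges(np.concatenate([signal_stim, signal_rest]), bins=L)
p1 = np.histogram(signal_stim, bins=edges)[0] / signal_stim.size
p2 = np.histogram(signal_rest, bins=edges)[0] / signal_rest.size
print(kullback_leibler(p1, p2))            # inf whenever a p2 bin is empty where p1 is not
```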

Cited by 16 publications (14 citation statements) | References 15 publications
“…However, we should also mention that the same kind of analysis has already been performed with the usual Kullback-Leibler entropy D (which corresponds to the case q = 1 and is therefore subject to divergence) [12]. In that study, however, the probabilities of the signal levels were subtly redefined as p_ij = (n_ij + δ)/(n_i + Lδ), with 0 < δ ≤ 1, in order to prevent the occurrence of divergent values of D. In fact, this latter definition ensures that p_ij is non-null and tends to recover the usual meaning of p_ij as the relative frequency of points within window W_i and level I_j for small values of δ (that is, p_ij → n_ij/n_i as δ → 0+).…”
Section: Generalized Relative Entropy
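For concreteness, here is a minimal sketch of the regularized probabilities quoted in this excerpt, p_ij = (n_ij + δ)/(n_i + Lδ) with 0 < δ ≤ 1; the window counts and δ values below are illustrative, not taken from the paper.

```python
import numpy as np

def regularized_probs(counts, delta=0.05):
    """p_ij = (n_ij + delta) / (n_i + L*delta), with n_i = sum_j n_ij and L the number of levels."""
    counts = np.asarray(counts, dtype=float)
    L = counts.size
    return (counts + delta) / (counts.sum() + L * delta)

n_ij = np.array([12, 0, 5, 3])             # one empty signal level: n_ij/n_i alone would give p = 0
for delta in (1.0, 0.1, 0.01, 1e-4):
    print(delta, regularized_probs(n_ij, delta))
# As delta -> 0+, the values approach the relative frequencies [0.60, 0.00, 0.25, 0.15]
# while staying strictly positive, so the Kullback-Leibler distance D remains finite.
```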
“…(1) might tend to zero if p_1j → 0 (p_2j ≠ 0) and might diverge if p_2j → 0 (p_1j ≠ 0). In a previous study [12], the undesirable divergence of the Kullback-Leibler distance (D) was prevented by an appropriate definition of the set of probabilities p_ij. Within a nonextensive thermostatistical formalism, the generalized Kullback-Leibler distance D_q has been derived as the q-average of the generalized change of information ∆σ_q,j [14], that is,…”
Section: Generalized Relative Entropy
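The defining equation of D_q is truncated in the excerpt above, so the sketch below assumes the standard Tsallis (nonextensive) relative entropy, D_q(p1‖p2) = (1/(q−1)) Σ_j p1_j [(p1_j/p2_j)^(q−1) − 1], which recovers the Kullback-Leibler distance D in the limit q → 1; the probability vectors are illustrative.

```python
import numpy as np

def tsallis_relative_entropy(p1, p2, q):
    """Assumed form: D_q = sum_j p1_j * ((p1_j/p2_j)**(q-1) - 1) / (q - 1)."""
    p1 = np.asarray(p1, dtype=float)
    p2 = np.asarray(p2, dtype=float)
    mask = p1 > 0
    if abs(q - 1.0) < 1e-12:               # q = 1 limit: ordinary Kullback-Leibler distance D
        return float(np.sum(p1[mask] * np.log(p1[mask] / p2[mask])))
    ratio = p1[mask] / p2[mask]
    return float(np.sum(p1[mask] * (ratio ** (q - 1.0) - 1.0)) / (q - 1.0))

p1 = np.array([0.6, 0.25, 0.15])
p2 = np.array([0.3, 0.40, 0.30])
for q in (0.5, 0.8, 1.0, 1.5):
    print(q, tsallis_relative_entropy(p1, p2, q))
```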
“…The KL, as defined in equation (1), diverges when one of the two distributions vanishes at a point where the other does not. In order to avoid the division by zero, a small positive constant δ = 0.001 was added to each bin, and the histogram was then normalized to sum to 1 [7,24]. The original histogram is used to compute the Hellinger distance, since it does not impose the positivity restriction on the probabilities.…”
Section: Perturbations
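The two workarounds mentioned in this excerpt can be sketched as follows (the histograms below are illustrative): adding δ = 0.001 to every bin and renormalizing before computing the KL, versus computing the Hellinger distance directly on the original histograms, which remains finite even when some bins are empty.

```python
import numpy as np

def smooth(hist, delta=0.001):
    """Add a small constant to every bin, then renormalize so the entries sum to 1."""
    hist = np.asarray(hist, dtype=float) + delta
    return hist / hist.sum()

def kl(p, q):
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

def hellinger(p, q):
    """H(p, q) = sqrt(0.5 * sum_j (sqrt(p_j) - sqrt(q_j))**2); finite even with empty bins."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    return float(np.sqrt(0.5 * np.sum((np.sqrt(p) - np.sqrt(q)) ** 2)))

p = np.array([0.2, 0.3, 0.5])
q = np.array([0.5, 0.5, 0.0])              # empty third bin: plain KL(p||q) would diverge
print(kl(smooth(p), smooth(q)))            # finite thanks to the delta = 0.001 smoothing
print(hellinger(p, q))                     # computed on the original, unsmoothed histograms
```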