1995
DOI: 10.1103/physreve.52.2318

Estimation of mutual information using kernel density estimators


Cited by 399 publications (294 citation statements)
References 16 publications
“…However, where stimuli and responses are both continuous (an example might be local field potential responses to a white noise sensory stimulus), it may be advantageous to take advantage of techniques better suited to continuous signals, such as kernel density estimators (Moon et al., 1995), nearest neighbor estimators (Kraskov et al., 2004) or binless metric space methods (Victor, 2002). We refer to the entry "Bin-Less Estimators for Information Quantities" for an in-depth discussion of these techniques.…”
Section: Binless Methods For Estimating Information (mentioning)
confidence: 99%
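The kernel-density route mentioned in this excerpt can be illustrated with a short plug-in estimator. This is a minimal sketch, not code from Moon et al. (1995); it assumes scipy's gaussian_kde for the joint and marginal densities and averages the log density ratio over the observed sample pairs:

```python
# Minimal illustrative sketch (not the cited paper's implementation):
# plug-in estimate of mutual information I(X; Y) for paired continuous
# samples, using Gaussian kernel density estimators.
import numpy as np
from scipy.stats import gaussian_kde

def mi_kde(x, y):
    """Estimate I(X; Y) in nats as the sample mean of
    log[ p_xy(x_i, y_i) / (p_x(x_i) * p_y(y_i)) ]."""
    xy = np.vstack([x, y])      # paired samples, shape (2, N)
    p_xy = gaussian_kde(xy)     # KDE of the joint density
    p_x = gaussian_kde(x)       # KDEs of the marginal densities
    p_y = gaussian_kde(y)
    return np.mean(np.log(p_xy(xy) / (p_x(x) * p_y(y))))

# Hypothetical usage: a correlated Gaussian pair whose true MI is
# -0.5 * ln(1 - rho**2), roughly 0.51 nats for rho = 0.8.
rng = np.random.default_rng(0)
rho = 0.8
x = rng.standard_normal(2000)
y = rho * x + np.sqrt(1 - rho**2) * rng.standard_normal(2000)
print(mi_kde(x, y))
```

The result is a plug-in estimate whose bias depends on the kernel bandwidth (here scipy's default rule of thumb); the nearest-neighbor and binless estimators cited alongside it avoid that explicit bandwidth choice.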
“…An MI value equal to 0 indicates complete independence between x and y, while a higher MI value indicates stronger dependence between x and y (Fraser and Swinney, 1986; Moon et al., 1995).…”
Section: Determination Of Input Variables (mentioning)
confidence: 99%
“…If we have two kinds of time series data sets such as $(s_1, s_2, s_3, \cdots, s_n,\ q_1, q_2, q_3, \cdots, q_n)$, where $n$ is the number of observed periods, then the MI value between observations $s_i$ and $q_j$ is defined by Moon et al. (1995) as follows: $MI_{s,q}(s_i, q_j) = \log_2 \dfrac{P_{s,q}(s_i, q_j)}{P_s(s_i)\, P_q(q_j)}$.…”
Section: Mutual Information Techniques (mentioning)
confidence: 99%
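For context, and not part of the quoted excerpt: the quantity above is the pointwise contribution in bits. Under the standard definition, the total mutual information averages this log-ratio over the joint distribution,

$$ MI(s, q) \;=\; \sum_{i}\sum_{j} P_{s,q}(s_i, q_j)\, \log_2 \frac{P_{s,q}(s_i, q_j)}{P_s(s_i)\, P_q(q_j)}, $$

which vanishes exactly when $P_{s,q}(s_i, q_j) = P_s(s_i)\,P_q(q_j)$ for all pairs, i.e. when the two series are independent, consistent with the interpretation quoted in the previous excerpt.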