The 2006 IEEE International Joint Conference on Neural Network Proceedings
DOI: 10.1109/ijcnn.2006.247170

Critical Values of a Kernel Density-based Mutual Information Estimator

Abstract: Recently, mutual information (MI) has become widely recognized as a statistical measure of dependence that is suitable for applications where data are non-Gaussian, or where the dependency between variables is non-linear. However, a significant disadvantage of this measure is the inability to define an analytical expression for the distribution of MI estimators, which are based upon a finite dataset. This paper deals specifically with a popular kernel density-based estimator, for which the distributio…
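The kernel density-based MI estimator discussed in the abstract is not spelled out here, but the general idea can be sketched: estimate the joint and marginal densities with Gaussian kernels and average the log density ratio over the sample. A minimal illustration (the function name `kde_mutual_information` and all parameter choices are this sketch's assumptions, not the paper's):

```python
import numpy as np
from scipy.stats import gaussian_kde

def kde_mutual_information(x, y):
    """Resubstitution MI estimate: average log p(x,y)/(p(x)p(y)),
    with densities replaced by Gaussian kernel density estimates."""
    xy = np.vstack([x, y])
    p_xy = gaussian_kde(xy)(xy)   # joint density evaluated at the samples
    p_x = gaussian_kde(x)(x)      # marginal density of x
    p_y = gaussian_kde(y)(y)      # marginal density of y
    return np.mean(np.log(p_xy / (p_x * p_y)))

rng = np.random.default_rng(0)
n = 500
x = rng.normal(size=n)
y_dep = x + 0.5 * rng.normal(size=n)  # strongly dependent on x
y_ind = rng.normal(size=n)            # independent of x

mi_dep = kde_mutual_information(x, y_dep)
mi_ind = kde_mutual_information(x, y_ind)
```

As the abstract notes, this finite-sample estimate has no known analytical distribution: even for independent data `mi_ind` is not exactly zero, which is why critical values for the estimator are of interest.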

Cited by 7 publications (2 citation statements) · References 16 publications (32 reference statements)
“…A higher mutual information value is a result of strong correlation, whereas a mutual information value of zero indicates uncorrelated variables (May et al 2006). Specifically, the mutual information MI(X, Y) between random variables X and Y measures the amount of information in X that can be predicted when Y is known.…”
Section: Reviews of Entropy and Mutual Information
confidence: 99%
“…In addition, the mutual information method can be applied to evaluate the dependence between two random variables; it measures how much the entropy of one random variable is reduced by knowledge of another. Therefore, the mutual information method is an ideal measure of stochastic dependence between two random variables (May et al 2006).…”
Section: Introduction
confidence: 99%