1987
DOI: 10.1109/tsmc.1987.4309069
Entropy and Correlation: Some Comments


Cited by 197 publications (94 citation statements)
References 4 publications
“…where one way relationship exists. Kvålseth (1987) has briefly defined an information-theoretic measure of dependence between two variables in his paper. This measure has several desirable properties.…”
Section: Some Methods Commonly Used To Measure Dependence Between Variables (mentioning)
confidence: 99%
“…KM 2 has been used by Kraskov and Grassberger (2009) for clustering. Kvålseth (1987) mentions that the value of all three measures is 1 in case of strict one to one association between the variables. However, for KM 1 this is a sufficient condition but not a necessary condition.…”
Section: Existing Mutual Information Based Measures (mentioning)
confidence: 99%
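The excerpt above notes that these mutual-information-based measures all equal 1 under a strict one-to-one association between the variables. The excerpt does not reproduce the formulas, so the following is an illustrative sketch using one common normalization of mutual information (the geometric-mean normalization; Kvålseth's KM measures may be defined differently):

```python
from collections import Counter
from math import log2, sqrt

def entropy(labels):
    """Shannon entropy (bits) of the empirical distribution of labels."""
    n = len(labels)
    return -sum((c / n) * log2(c / n) for c in Counter(labels).values())

def mutual_information(x, y):
    # I(X;Y) = H(X) + H(Y) - H(X,Y)
    return entropy(x) + entropy(y) - entropy(list(zip(x, y)))

def normalized_mi(x, y):
    # Geometric-mean normalization -- one of several possible choices,
    # used here only for illustration.
    return mutual_information(x, y) / sqrt(entropy(x) * entropy(y))

# A strict one-to-one association between x and y drives the
# normalized measure to its maximum of 1.
x = [0, 1, 2, 0, 1, 2]
y = ['a', 'b', 'c', 'a', 'b', 'c']
print(normalized_mi(x, y))  # 1.0 (up to floating-point rounding)
```

If the mapping is merely many-to-one rather than one-to-one, the value drops below 1, which is the distinction the excerpt draws for KM 1.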
“…Performance measures: Four standard measurements are used to evaluate the quality of the clusters: purity [14] measures the accuracy of the dominating class in each cluster, normalized mutual information (Normalized MI) [23] is from the information-theoretic perspective and calculates the mutual dependence of the predicted clustering and the ground-truth partitions, Rand index [24] evaluates true positives within clusters and true negatives between clusters and balanced F-measure considers both precision and recall.…”
Section: Binary Tags and Visual Features (mentioning)
confidence: 99%
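The cluster-quality measures named in this excerpt are standard. As an illustrative sketch (not code from any of the cited papers), purity and the Rand index can be computed directly from predicted and ground-truth label lists:

```python
from collections import Counter
from itertools import combinations

def purity(pred, truth):
    # For each predicted cluster, count its dominant ground-truth class,
    # then divide the total dominant count by the number of points.
    clusters = {}
    for p, t in zip(pred, truth):
        clusters.setdefault(p, []).append(t)
    dominant = sum(Counter(ts).most_common(1)[0][1] for ts in clusters.values())
    return dominant / len(pred)

def rand_index(pred, truth):
    # Fraction of point pairs on which the two partitions agree:
    # together in both ("true positives") or apart in both ("true negatives").
    pairs = list(combinations(range(len(pred)), 2))
    agree = sum((pred[i] == pred[j]) == (truth[i] == truth[j]) for i, j in pairs)
    return agree / len(pairs)

pred  = [0, 0, 1, 1]
truth = [0, 0, 0, 1]
print(purity(pred, truth))      # 0.75
print(rand_index(pred, truth))  # 0.5
```

Normalized mutual information, the third measure mentioned, is the information-theoretic counterpart (a normalization of I(pred; truth)) and is illustrated in the sketch after the earlier excerpt.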
“…Furthermore, measures of one-way association can be expressed in a general form as different normalizations of conditional entropy, while measures of two-way association as different normalizations of mutual information [9].…”
Section: Measuring Attribute Importance (mentioning)
confidence: 99%
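The general form described in this excerpt (one-way association as a normalization of conditional entropy, two-way association as a normalization of mutual information) can be sketched as follows. The specific normalizations chosen here are illustrative, not the definitions from reference [9]:

```python
from collections import Counter
from math import log2

def entropy(labels):
    n = len(labels)
    return -sum((c / n) * log2(c / n) for c in Counter(labels).values())

def conditional_entropy(x, y):
    # H(X|Y) = H(X,Y) - H(Y)
    return entropy(list(zip(x, y))) - entropy(y)

def one_way_association(x, y):
    # Asymmetric: the fraction of H(X) removed by knowing Y
    # (an uncertainty-coefficient-style normalization).
    return 1 - conditional_entropy(x, y) / entropy(x)

def two_way_association(x, y):
    # Symmetric: mutual information normalized by the average entropy.
    mi = entropy(x) + entropy(y) - entropy(list(zip(x, y)))
    return 2 * mi / (entropy(x) + entropy(y))

# Here y determines x completely, but not vice versa, so the one-way
# measure reaches its maximum of 1 while the two-way measure does not.
x = [0, 0, 1, 1]
y = ['a', 'b', 'c', 'd']
print(one_way_association(x, y))  # 1.0
print(two_way_association(x, y))  # ~0.667
```

The asymmetry of the one-way measure is the point: it answers "how much of X is explained by Y?", whereas the two-way measure penalizes information in either variable that the other does not share.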