2014
DOI: 10.1504/ijmissp.2014.066430

Confidence intervals for the mutual information

Abstract: [This paper is eligible for the student paper award.] By combining a bound on the absolute value of the difference of mutual information between two joint probability distributions with a fixed variational distance, and a bound on the probability of a maximal deviation in variational distance between a true joint probability distribution and an empirical joint probability distribution, confidence intervals for the mutual information of two random variables with finite alphabets are established. Different from pre…
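The abstract concerns confidence intervals around an empirical estimate of mutual information. As a point of reference, the underlying plug-in estimator can be sketched as below; this is an illustrative sketch only (the function name and count-table interface are assumptions), and it does not reproduce the paper's bounds or intervals.

```python
import numpy as np

def empirical_mutual_information(counts):
    """Plug-in estimate of I(X;Y) in nats from a joint count table.

    counts: 2-D array where counts[i, j] is the number of samples
    with X = i and Y = j (finite alphabets, as in the paper).
    """
    counts = np.asarray(counts, dtype=float)
    n = counts.sum()
    p_xy = counts / n                       # empirical joint distribution
    p_x = p_xy.sum(axis=1, keepdims=True)   # empirical marginal of X
    p_y = p_xy.sum(axis=0, keepdims=True)   # empirical marginal of Y
    mask = p_xy > 0                         # 0 * log 0 = 0 convention
    return float(np.sum(p_xy[mask] * np.log(p_xy[mask] / (p_x @ p_y)[mask])))
```

For independent uniform counts such as `[[25, 25], [25, 25]]` the estimate is 0; for perfectly correlated counts such as `[[50, 0], [0, 50]]` it is log 2. The paper's contribution is an interval around such an estimate, not the estimate itself.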

Cited by 3 publications
(1 citation statement)
References 6 publications
“…Together with the bound on the probability of a maximal variational distance between the true joint distribution and an empirical joint distribution (see [6], and especially a refinement of it which drops the dependence on the true distribution [4, Lemma 3]), the given bound can be used to construct a reasonably tight lower bound of the confidence interval for mutual information. Such an application can be found in [8]. In mutual information estimation with confidence intervals, the given bound is especially useful when the marginal probability distribution is far from uniform.…”
Section: Discussion
confidence: 99%
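The combination described in the citation statement (a continuity bound for mutual information in variational distance, plus a deviation bound for the empirical distribution) rests on mutual information changing little under small variational perturbations. A minimal numerical check of that continuity, not the paper's actual bound, might look like:

```python
import numpy as np

def mutual_information(p_xy):
    """I(X;Y) in nats for a joint distribution given as a 2-D array."""
    p_xy = np.asarray(p_xy, dtype=float)
    p_x = p_xy.sum(axis=1, keepdims=True)   # marginal of X
    p_y = p_xy.sum(axis=0, keepdims=True)   # marginal of Y
    mask = p_xy > 0                         # 0 * log 0 = 0 convention
    return float(np.sum(p_xy[mask] * np.log(p_xy[mask] / (p_x @ p_y)[mask])))

def variational_distance(p, q):
    """Total variation distance, i.e. half the L1 distance."""
    return 0.5 * float(np.abs(np.asarray(p) - np.asarray(q)).sum())

# Perturb a joint distribution slightly in variational distance and
# observe that the mutual information also changes only slightly.
p = np.array([[0.40, 0.10], [0.10, 0.40]])
q = np.array([[0.39, 0.11], [0.10, 0.40]])   # still sums to 1
print(variational_distance(p, q))             # 0.01
print(abs(mutual_information(p) - mutual_information(q)))
```

The distributions `p` and `q` here are arbitrary examples chosen for illustration; the paper quantifies exactly how large the mutual-information gap can be at a given variational distance.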