2014 IEEE International Symposium on Information Theory
DOI: 10.1109/isit.2014.6874854

Ensemble estimation of multivariate f-divergence

Abstract: f-divergence estimation is an important problem in the fields of information theory, machine learning, and statistics. While several divergence estimators exist, relatively few of their convergence rates are known. We derive the MSE convergence rate for a density plug-in estimator of f-divergence. Then, by applying the theory of optimally weighted ensemble estimation, we derive a divergence estimator with a convergence rate of O(1/T) that is simple to implement and performs well in high dimensions. We validate…
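The weighted-ensemble step described in the abstract can be pictured with a short sketch. The following is a minimal illustration, not the authors' implementation: it assumes a set of base plug-in estimates indexed by a parameter l, that their bias expands over known basis functions psi_i(l), and that the weights are chosen to sum to one, cancel those bias terms, and have small norm. The parameter grid, basis functions, and dimension below are purely illustrative.

```python
# Illustrative sketch of optimally weighted ensemble estimation
# (hypothetical names and bias basis; not the paper's exact formulation).
import numpy as np
from scipy.optimize import minimize

def ensemble_weights(ls, psis):
    """Weights over the parameter grid `ls` that sum to one, zero out each
    assumed bias basis function in `psis`, and have minimal L2 norm (the
    norm controls the variance of the combined estimator)."""
    L = len(ls)
    cons = [{"type": "eq", "fun": lambda w: np.sum(w) - 1.0}]
    for psi in psis:
        vals = np.array([psi(l) for l in ls])
        cons.append({"type": "eq", "fun": lambda w, v=vals: w @ v})
    res = minimize(lambda w: w @ w, x0=np.full(L, 1.0 / L), constraints=cons)
    return res.x

# Example with an assumed bias expansion in powers l**(i/d) for d = 3;
# base_estimates[m] would be the plug-in divergence estimate computed
# with the m-th parameter value.
d = 3
ls = np.linspace(1.0, 3.0, 10)
psis = [lambda l, i=i: l ** (i / d) for i in range(1, d)]
w = ensemble_weights(ls, psis)
base_estimates = np.zeros(len(ls))      # placeholder for real plug-in estimates
ensemble_estimate = float(w @ base_estimates)
```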


Cited by 43 publications (71 citation statements). References 17 publications (34 reference statements).
“…There has been recent interest in deriving convergence rates for divergence estimators, Moon and Hero (2014)-Krishnamurthy et al. (2014). The rates are typically derived in terms of the smoothness s of the densities:…”
Section: Previous Work (mentioning)
confidence: 99%
“…we use the nonparametric estimator derived in Moon & Hero III (2014a, 2014b) that is based on the k-nearest neighbor density estimators for the densities f_i and f_j. This estimator is simple to implement and achieves the parametric convergence rate when the densities are sufficiently smooth.…”
Section: SVD Advantages, NMF Advantages, Optimal Rank r Approximation (mentioning)
confidence: 99%
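The excerpt above refers to a k-nearest-neighbor plug-in construction. Below is a minimal, self-contained sketch of that general idea, assuming the divergence functional is written as E_{f_j}[g(f_i(X)/f_j(X))]; the function names, the choice of k, and the leave-one-out handling are assumptions for illustration, not the authors' code.

```python
# Sketch of a k-NN plug-in divergence estimate (illustrative, hypothetical API).
import numpy as np
from scipy.spatial import cKDTree
from scipy.special import gamma

def knn_density(queries, sample, k, exclude_self=False):
    """k-nearest-neighbour density estimate at each query point."""
    n, d = sample.shape
    kk = k + 1 if exclude_self else k            # skip the zero self-distance
    r_k = cKDTree(sample).query(queries, k=[kk])[0][:, 0]
    unit_ball = np.pi ** (d / 2) / gamma(d / 2 + 1)
    return k / ((n - 1 if exclude_self else n) * unit_ball * r_k ** d)

def plugin_divergence(g, x_i, x_j, k=10):
    """Plug-in estimate of E_{f_j}[ g(f_i / f_j) ] from samples
    x_i ~ f_i and x_j ~ f_j."""
    fi = knn_density(x_j, x_i, k)                      # f_i at f_j's samples
    fj = knn_density(x_j, x_j, k, exclude_self=True)   # leave-one-out f_j
    return float(np.mean(g(fi / fj)))

# KL(f_i || f_j) corresponds to g(t) = t * log(t) in this convention.
```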
“…We then estimate the Hellinger distance H(f_a, f_b) using the divergence estimator in Moon & Hero III (2014a).…”
Section: Hellinger Distance Comparisons (mentioning)
confidence: 99%
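As a usage illustration, a Hellinger-distance estimate can be built from the plug-in sketch above by taking g(t) = sqrt(t), which estimates the Bhattacharyya coefficient, under the convention H^2(f_a, f_b) = 1 - E_{f_b}[sqrt(f_a/f_b)]. The data below are synthetic and the call reuses the hypothetical plugin_divergence function defined in the previous sketch.

```python
# Synthetic example reusing the hypothetical plugin_divergence sketch above;
# convention: H^2(f_a, f_b) = 1 - E_{f_b}[ sqrt(f_a / f_b) ].
import numpy as np

x_a = np.random.randn(2000, 3)            # sample from f_a
x_b = np.random.randn(2000, 3) + 0.5      # sample from f_b
bhattacharyya = plugin_divergence(np.sqrt, x_a, x_b, k=10)
hellinger = float(np.sqrt(max(0.0, 1.0 - bhattacharyya)))
```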
“…Information theoretic measures such as Shannon entropy, mutual information, and the Kullback-Leibler (KL) divergence have a broad range of applications in information theory, statistics and machine learning [1][2][3]. When we have two or more data sets and we are interested in finding the correlation or dissimilarity between them, Shannon mutual information or KL-divergence is often used.…”
Section: Introduction (mentioning)
confidence: 99%