2018
DOI: 10.3390/e20080560

Ensemble Estimation of Information Divergence †

Abstract: Recent work has focused on the problem of nonparametric estimation of information divergence functionals between two continuous random variables. Many existing approaches require either restrictive assumptions about the density support set or difficult calculations at the support set boundary which must be known a priori. The mean squared error (MSE) convergence rate of a leave-one-out kernel density plug-in divergence functional estimator for general bounded density support sets is derived where knowledge of …
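To make the setup concrete, the following is a minimal sketch of a leave-one-out kernel density plug-in divergence estimator of the kind the abstract describes, applied here to the KL divergence. The Gaussian kernel, the fixed bandwidth, and the helper names `gaussian_kde` and `kl_divergence_plugin` are illustrative assumptions, not the paper's exact construction (which additionally handles general bounded support sets and the boundary bias they induce).

```python
# Sketch of a leave-one-out kernel density plug-in estimator for a
# divergence functional, here D(f || g) = E_f[log(f(X)/g(X))].
# Illustrative only: Gaussian kernel and fixed bandwidth are assumptions.
import numpy as np

def gaussian_kde(points, queries, bandwidth, leave_one_out=False):
    """Evaluate a Gaussian KDE built on `points` at each row of `queries`.

    If leave_one_out is True, `queries` is assumed to equal `points` and each
    query point's own contribution to its density estimate is removed.
    """
    n, d = points.shape
    diff = queries[:, None, :] - points[None, :, :]     # pairwise differences
    sq_dist = np.sum(diff ** 2, axis=-1)                # squared distances
    kernel = np.exp(-sq_dist / (2.0 * bandwidth ** 2))
    norm = (2.0 * np.pi * bandwidth ** 2) ** (d / 2.0)  # (2*pi)^{d/2} h^d
    if leave_one_out:
        np.fill_diagonal(kernel, 0.0)
        return kernel.sum(axis=1) / ((n - 1) * norm)
    return kernel.sum(axis=1) / (n * norm)

def kl_divergence_plugin(x, y, bandwidth):
    """Plug-in estimate of D(f || g) from samples x ~ f and y ~ g."""
    f_hat = gaussian_kde(x, x, bandwidth, leave_one_out=True)  # f-hat at X_i
    g_hat = gaussian_kde(y, x, bandwidth)                      # g-hat at X_i
    eps = 1e-12  # guard against log of zero near the support boundary
    return np.mean(np.log((f_hat + eps) / (g_hat + eps)))

# Example: two 2-D Gaussians with shifted means.
rng = np.random.default_rng(0)
x = rng.normal(loc=0.0, scale=1.0, size=(1000, 2))
y = rng.normal(loc=0.5, scale=1.0, size=(1000, 2))
print(kl_divergence_plugin(x, y, bandwidth=0.3))
```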

Cited by 19 publications (27 citation statements)
References: 71 publications
“…Thus, with these parameters, a negligible bias requires n to be at least 2^{0.99d}. The Q-function is defined as Q(x)…”
Section: Bias Lower Bound (mentioning, confidence 99%)
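As a quick arithmetic illustration of how fast this lower bound grows with dimension (plain evaluation of the quoted expression, not a result from the cited work):

```latex
% Sample sizes implied by n \ge 2^{0.99 d} for a few dimensions d.
\[
  d = 10:\ n \ge 2^{9.9} \approx 9.6 \times 10^{2}, \qquad
  d = 20:\ n \ge 2^{19.8} \approx 9.1 \times 10^{5}, \qquad
  d = 40:\ n \ge 2^{39.6} \approx 8.3 \times 10^{11}.
\]
```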
“…Thus, we quantify dependence by D_α(g, g̃), the Rényi divergence of g(y, z) with respect to g̃(y, z). This can also be used to define the Rényi mutual information [16,17] as:…”
Section: Entropy As Measure Of Dependence (mentioning, confidence 99%)
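For reference, a standard form of the Rényi divergence and the induced Rényi mutual information mentioned in this excerpt, written under the common assumption (not stated in the excerpt itself) that g̃ is the product of the marginal densities of g:

```latex
% Renyi divergence of order \alpha between densities g and \tilde g,
% and the Renyi mutual information obtained by taking \tilde g to be
% the product of marginals (assumed setup, following common usage).
\[
  D_\alpha(g \,\|\, \tilde g)
    = \frac{1}{\alpha - 1}
      \log \int g(y,z)^{\alpha}\, \tilde g(y,z)^{1-\alpha}\, dy\, dz,
  \qquad \alpha > 0,\ \alpha \neq 1,
\]
\[
  I_\alpha(Y;Z) = D_\alpha\!\bigl(g(y,z) \,\|\, g_Y(y)\, g_Z(z)\bigr).
\]
```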
“…Noshad et al. [26] circumvented the plug-in density and proposed a direct estimation method based on a graph-theoretical interpretation. Very recently, Moon et al. [17] derived mean squared error convergence rates of kernel density-based plug-in estimators of mutual information and proposed ensemble estimators that achieve the parametric rate, although there are several restrictions on the densities and the kernel used. …”
Section: Estimator Of The Rényi Entropy (mentioning, confidence 99%)
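A minimal sketch of the ensemble idea this excerpt refers to: run the plug-in estimator at several bandwidths and combine the results with weights that sum to one and cancel modeled low-order bias terms, which is what pushes the MSE toward the parametric rate. The constraint set below (cancelling terms proportional to h and h²) and the function names are illustrative assumptions, not the exact optimization used in the paper.

```python
# Sketch of an ensemble estimator: a weighted combination of base
# estimates computed at several bandwidths, with weights chosen to
# cancel assumed low-order bias terms. Illustrative constraints only.
import numpy as np

def ensemble_weights(bandwidths, powers=(1.0, 2.0)):
    """Minimum-norm weights w with sum(w) = 1 and sum_l w_l * h_l^p = 0
    for each modeled bias power p, via a pseudoinverse solve."""
    h = np.asarray(bandwidths, dtype=float)
    # First row enforces sum(w) = 1; remaining rows cancel each bias term.
    A = np.vstack([np.ones_like(h)] + [h ** p for p in powers])
    b = np.zeros(A.shape[0])
    b[0] = 1.0
    return np.linalg.pinv(A) @ b  # minimum-norm solution of A w = b

def ensemble_estimate(base_estimator, bandwidths, *args):
    """Weighted combination of base estimates across bandwidths."""
    w = ensemble_weights(bandwidths)
    estimates = np.array([base_estimator(*args, bandwidth=h) for h in bandwidths])
    return float(w @ estimates)

# Usage with the plug-in KL estimator sketched earlier (assumed available):
# bandwidths = [0.2, 0.3, 0.4, 0.5]
# d_hat = ensemble_estimate(kl_divergence_plugin, bandwidths, x, y)
```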