2018
DOI: 10.1162/neco_a_01092

Bias Reduction and Metric Learning for Nearest-Neighbor Estimation of Kullback-Leibler Divergence

Abstract: Nearest-neighbor estimators for the Kullback-Leibler (KL) divergence that are asymptotically unbiased have recently been proposed and demonstrated in a number of applications. However, with a small number of samples, nonparametric methods typically suffer from large estimation bias due to the nonlocality of information derived from nearest-neighbor statistics. In this letter, we show that this estimation bias can be mitigated by modifying the metric function, and we propose a novel method for learning a locally…
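For context, the sketch below shows the standard k-nearest-neighbor KL-divergence estimator that the abstract refers to, without the paper's bias-reduction or metric-learning steps. The function name knn_kl_divergence and the use of SciPy's cKDTree are illustrative assumptions, not taken from the paper.

```python
import numpy as np
from scipy.spatial import cKDTree

def knn_kl_divergence(x, y, k=1):
    """Standard k-NN estimate of KL(P || Q) from samples x ~ P and y ~ Q.

    This is the plain nearest-neighbor estimator, not the bias-reduced,
    metric-learned variant proposed in the paper above.
    """
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    n, d = x.shape
    m = y.shape[0]

    # rho_i: distance from x_i to its k-th nearest neighbor among the other
    # x's (query k+1 neighbors because the closest point to x_i is itself).
    rho = cKDTree(x).query(x, k=k + 1)[0][:, -1]

    # nu_i: distance from x_i to its k-th nearest neighbor among the y's.
    nu = cKDTree(y).query(x, k=k)[0]
    if k > 1:
        nu = nu[:, -1]

    # D_hat = (d / n) * sum_i log(nu_i / rho_i) + log(m / (n - 1))
    return d * np.mean(np.log(nu / rho)) + np.log(m / (n - 1))


# Quick check on two 2-D Gaussians: true KL(N(0, I) || N(0.5, I)) = 0.25.
rng = np.random.default_rng(0)
p = rng.normal(0.0, 1.0, size=(2000, 2))
q = rng.normal(0.5, 1.0, size=(2000, 2))
print(knn_kl_divergence(p, q, k=5))
```

With few samples, especially in higher dimensions, the bias of this plain estimate can be substantial, which is the regime the paper's metric-learning approach targets.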

Cited by 20 publications (23 citation statements)
References 23 publications

“…Indeed, Noh et al (2014) showed that the bias of the NNDE-based KL-divergence approximator at x is approximately proportional to…”
Section: Metric Learning with Parametric Bias Estimation for NNDE-Based… (mentioning)
confidence: 99%
“…One way to reduce the bias is to learn an appropriate distance metric in the KL-divergence approximator, equation 4.1, while the KL-divergence itself is metric invariant. In Noh et al (2014), the best local Mahalanobis metric $(x - x')^\top A \, (x - x')$ at x that minimizes the estimated bias was given as the solution of the following optimization problem:…”
Section: Metric Learning with Parametric Bias Estimation for NNDE-Based… (mentioning)
confidence: 99%
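To make the role of the learned metric concrete, here is a minimal sketch, not taken from the paper, of how a given positive-definite matrix A would enter the nearest-neighbor distance computation; the objective that actually selects A in Noh et al. (2014) is truncated in the quote above and is not reproduced here. The function name and the use of SciPy's cKDTree are assumptions for illustration.

```python
import numpy as np
from scipy.spatial import cKDTree

def mahalanobis_knn_distances(queries, data, A, k=1):
    """k-NN distances under the Mahalanobis metric (x - x')^T A (x - x').

    A is assumed symmetric positive definite and is treated as given; the
    bias-minimizing choice of A is what Noh et al. (2014) optimize.
    Since A = L L^T (Cholesky), the Mahalanobis distance equals the
    Euclidean distance between the transformed points z = L^T x.
    """
    L = np.linalg.cholesky(A)              # lower triangular, A = L @ L.T
    data_t = np.asarray(data) @ L          # rows are samples: z = L^T x
    queries_t = np.atleast_2d(queries) @ L
    dist, _ = cKDTree(data_t).query(queries_t, k=k)
    return dist
```

In the k-NN KL estimator sketched after the abstract, the Euclidean neighbor distances rho and nu would be replaced by such metric-adjusted distances; as the quote notes, the KL divergence itself is metric invariant, so only the finite-sample bias of the estimate changes.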