2005
DOI: 10.1080/104852504200026815

A new class of random vector entropy estimators and its applications in testing statistical hypotheses

Cited by 130 publications (144 citation statements)
References 17 publications
“…A more general class of entropy estimators including the one above was shown to be asymptotically unbiased and consistent as N → ∞ in [8]. Using the entropy estimate we…” [footnote in the citing paper: “We omit arrows from our network visualizations and implicitly assume the orientations to be directed away from the hip node.”]
Section: Learning the Graph Structure (mentioning)
confidence: 99%
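For context, the estimator at issue is of the k-nearest-neighbor (Kozachenko-Leonenko) type. A commonly cited form for an i.i.d. sample x_1, …, x_N in R^d is sketched below; the exact constants used in [8] and in the paper under review may differ, so this is an illustration rather than the authors' exact definition.

\[
\hat H_{N,k} \;=\; \frac{d}{N}\sum_{i=1}^{N}\log\rho_{k,i} \;+\; \log V_d \;+\; \log(N-1) \;-\; \psi(k),
\qquad
V_d=\frac{\pi^{d/2}}{\Gamma\!\left(\tfrac{d}{2}+1\right)},
\]

where ρ_{k,i} is the distance from x_i to its k-th nearest neighbor, V_d is the volume of the unit ball in R^d, and ψ is the digamma function (for k = 1, the term −ψ(1) equals the Euler constant γ).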
“…A correction of the bias has been derived in [31] in a different context. In the non-biased estimators of the (cross)-entropy the digamma function ψ(k) replaces the log(k) term:…”
Section: Non-parametric Estimation of the Similarity Measure (mentioning)
confidence: 99%
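A minimal numerical sketch of such a digamma-corrected k-NN entropy estimate, assuming NumPy and SciPy are available; the function name knn_entropy and the default k=3 are illustrative choices, not taken from the cited papers.

```python
import numpy as np
from scipy.spatial import cKDTree
from scipy.special import digamma, gammaln

def knn_entropy(x, k=3):
    """Sketch of a k-NN (Kozachenko-Leonenko type) estimate of differential
    Shannon entropy in nats for an (N, d) sample x. Illustrative only."""
    x = np.asarray(x, dtype=float)
    n, d = x.shape
    # Distance from each point to its k-th nearest neighbor; k+1 neighbors are
    # queried because the closest "neighbor" of each point is the point itself.
    dist, _ = cKDTree(x).query(x, k=k + 1)
    rho = dist[:, -1]
    # log-volume of the unit ball in R^d: V_d = pi^(d/2) / Gamma(d/2 + 1)
    log_vd = 0.5 * d * np.log(np.pi) - gammaln(0.5 * d + 1.0)
    # digamma(k) in place of log(k) is the bias correction mentioned above
    return d * np.mean(np.log(rho)) + log_vd + np.log(n - 1) - digamma(k)

# Rough check against the closed-form entropy of a standard normal in 2D,
# H = (d/2) * log(2 * pi * e):
rng = np.random.default_rng(0)
sample = rng.standard_normal((5000, 2))
print(knn_entropy(sample), 0.5 * 2 * np.log(2 * np.pi * np.e))
```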
“…In this paper we derive convergence rates for data-split versions of k-nearest neighbor (k-NN) estimators of Shannon and Rényi entropies proposed by Goria et al. [12] and Leonenko et al. [13] respectively.…”
Section: Introduction (mentioning)
confidence: 99%
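For the Rényi case mentioned here, the k-NN construction typically estimates the integral I_q = ∫ f^q first. One commonly quoted form, written as a sketch rather than the exact estimator of [13] (whose normalization may differ), is

\[
\hat H_{N,k,q} \;=\; \frac{1}{1-q}\,\log\!\left(\frac{1}{N}\sum_{i=1}^{N}\bigl[(N-1)\,C_k\,V_d\,\rho_{k,i}^{\,d}\bigr]^{1-q}\right),
\qquad
C_k=\left[\frac{\Gamma(k)}{\Gamma(k+1-q)}\right]^{\frac{1}{1-q}},
\]

for q ≠ 1 (and q < k + 1 so that C_k is defined), with ρ_{k,i} and V_d as in the Shannon case above; letting q → 1 recovers a digamma-corrected Shannon estimator.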
“…Goria et al. [12] and Leonenko et al. [13] show that the estimators they propose are asymptotically unbiased and consistent.…”
Section: Introduction (mentioning)
confidence: 99%