2023
DOI: 10.48550/arxiv.2301.13647
Preprint

Bayesian estimation of information-theoretic metrics for sparsely sampled distributions

Abstract: Estimating the Shannon entropy of a discrete distribution from which we have only observed a small sample is challenging. Estimating other information-theoretic metrics, such as the Kullback-Leibler divergence between two sparsely sampled discrete distributions, is even harder. Existing approaches to address these problems have shortcomings: they are biased, heuristic, work only for some distributions, and/or cannot be applied to all information-theoretic metrics. Here, we propose a fast, semi-analytical estim…
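To illustrate the problem the abstract describes (this is not the paper's semi-analytical estimator, just a minimal sketch of why small-sample entropy estimation is hard): the naive maximum-likelihood ("plug-in") estimator is systematically biased downward when the sample is small relative to the alphabet, and simple corrections such as Miller-Madow only partially fix this.

```python
import math
from collections import Counter

def plugin_entropy(sample):
    """Maximum-likelihood ('plug-in') entropy estimate in nats.
    Known to be negatively biased for sparsely sampled distributions."""
    n = len(sample)
    counts = Counter(sample)
    return -sum((c / n) * math.log(c / n) for c in counts.values())

def miller_madow_entropy(sample):
    """Plug-in estimate plus the Miller-Madow bias correction
    (K_obs - 1) / (2n), where K_obs is the number of distinct
    symbols actually observed in the sample."""
    n = len(sample)
    k_obs = len(set(sample))
    return plugin_entropy(sample) + (k_obs - 1) / (2 * n)

# A small sample from a 4-symbol source: the plug-in estimate
# underestimates the true entropy; Miller-Madow shifts it upward.
sample = ["a", "a", "b", "a", "c", "b", "a", "d"]
h_ml = plugin_entropy(sample)
h_mm = miller_madow_entropy(sample)
```

Both estimators here are classical baselines; the preprint's contribution is a Bayesian alternative that also extends to metrics such as the KL divergence, which these plug-in approaches handle poorly.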

Cited by 1 publication (2 citation statements); references 8 publications.
“…Finally, we stress that there are alternative entropies not considered here [67], for which the existence of accurate estimators is still an open question. Finally, an exciting possibility would be a comparative study of estimators valid for more than one random variable or probability distributions, leading, respectively, to mutual information [68, 69] and relative entropy [47, 70, 71].…”
Section: Discussion (mentioning; confidence: 99%)
“…Even though there exists a plethora of entropy estimators in the literature [15, 39–47], we here focus on nine of the most commonly employed estimators, and we also propose a new estimator, constructed from known results [34].…”
Section: Methods (mentioning; confidence: 99%)