2021
DOI: 10.3390/e23060740
Computing Accurate Probabilistic Estimates of One-D Entropy from Equiprobable Random Samples

Abstract: We develop a simple Quantile Spacing (QS) method for accurate probabilistic estimation of one-dimensional entropy from equiprobable random samples, and compare it with the popular Bin-Counting (BC) and Kernel Density (KD) methods. In contrast to BC, which uses equal-width bins with varying probability mass, the QS method uses estimates of the quantiles that divide the support of the data generating probability density function (pdf) into equal-probability-mass intervals. And, whereas BC and KD each require opt…
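The core idea described in the abstract — approximating the pdf over equal-probability-mass intervals defined by empirical quantiles — can be sketched in a few lines. This is a minimal illustration of the quantile-spacing principle, not the paper's exact estimator (which adds probabilistic subsampling); the function name and the choice of `n_quantiles` are assumptions for the example. If `m` quantiles split the support into intervals of mass `1/(m+1)`, the local density in an interval of width `Δ` is roughly `1/((m+1)·Δ)`, so the entropy is approximated by the average of `log((m+1)·Δ)` over the interior spacings:

```python
import numpy as np

def qs_entropy(samples, n_quantiles=1000):
    """Sketch of a Quantile Spacing entropy estimate (illustrative only).

    Each interior interval between adjacent empirical quantiles carries
    probability mass 1/(m+1), so the local density is ~ 1/((m+1)*width)
    and H = -E[log p] is approximated by the mean of log((m+1)*width).
    """
    m = n_quantiles
    probs = np.arange(1, m + 1) / (m + 1)   # interior quantile levels
    q = np.quantile(samples, probs)         # empirical quantiles
    spacings = np.diff(q)                   # widths of equal-mass intervals
    return np.mean(np.log((m + 1) * spacings))

rng = np.random.default_rng(0)
x = rng.standard_normal(100_000)
est = qs_entropy(x)   # true N(0,1) entropy is 0.5*ln(2*pi*e) ≈ 1.4189
```

Unlike equal-width bin counting, the interval widths here adapt to the data: narrow where the density is high, wide in the tails, which is what drives the accuracy advantage the abstract claims.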

Cited by 5 publications (1 citation statement)
References 35 publications
“…For reference, we compare these entropies at each quantization level to the maximum possible entropy ( , black lines in Figure 5). For forcing variables with a relatively normal distribution, such as Ta, we find that the entropy using the Lloyd algorithm is close to , indicating that the threshold points are similar to quantiles of the data [54] in that there is a similar number of data points within each pair of threshold values. Meanwhile, the fixed binning entropy for Ta is also relatively high.…”
Section: Results
confidence: 94%