The 26th Annual International Conference of the IEEE Engineering in Medicine and Biology Society
DOI: 10.1109/iembs.2004.1403100
An approximate method for Bayesian entropy estimation for a discrete random variable

Abstract: This article proposes an approximate Bayesian entropy estimator for a discrete random variable. An entropy estimator that achieves least square error is obtained through Bayesian estimation of the occurrence probabilities of each value taken by the discrete random variable. This Bayesian entropy estimator incurs a large computational cost when the random variable takes many sorts of values. Therefore, the present article proposes a practical method for calculating a Bayesian entropy estimate; the p…

Cited by 7 publications (13 citation statements)
References 6 publications
“…It can be seen that both of the curves, estimated from the time series of the instantaneous phase at narrowband, contain a small linear segment in which the slope converges to constant values as the number of data points increases (from 8,000 to 10,000 in this plot). Such a result is consistent with a paper published in 2005 by Yokota [32], who reported that the entropy value estimated by the conventional Shannon method does not approach the true value until the sample size reaches 5,000.…”
Section: Convergence Of Directional Index Calculation Over 8000 Datasupporting
confidence: 82%
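The slow convergence noted in the citation above is easy to reproduce: the conventional plug-in (maximum-likelihood) Shannon estimator is biased downward and only approaches the true entropy as the sample grows. The sketch below is illustrative only; the function name `plugin_entropy` and the choice of a uniform 8-symbol source are assumptions, not details from the cited work.

```python
import math
import random

def plugin_entropy(samples):
    """Plug-in (maximum-likelihood) Shannon entropy estimate in nats."""
    counts = {}
    for s in samples:
        counts[s] = counts.get(s, 0) + 1
    n = len(samples)
    return -sum((c / n) * math.log(c / n) for c in counts.values())

random.seed(0)
true_h = math.log(8)  # true entropy of a uniform source over 8 symbols
for n in (100, 1000, 10000):
    est = plugin_entropy([random.randrange(8) for _ in range(n)])
    print(n, round(true_h - est, 4))  # downward bias shrinks as n grows
```

For a uniform source the plug-in estimate can never exceed log M, so the printed gap is always non-negative and shrinks with the sample size, mirroring the convergence behavior the citation describes.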
“…If our key mechanism were not used, this approximate entropy estimator could be combined with other message information, such as source identification, location information, or the sensory data themselves, to distinguish unusual events. We sought to simplify the entropy estimation in view of the limited capacity of the nodes while still obtaining an accurate estimate, so we used an approximate method for Bayesian entropy estimation [9]. Entropy estimation is a statistical approach used for distribution comparison where the measurements involved are discrete values.…”
Section: Approximate Entropy Estimator For a Discrete Random Variablementioning
confidence: 99%
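The appeal of a Bayesian treatment on resource-limited nodes is that it can be reduced to a cheap smoothing of the observed counts. The sketch below shows one common simplification, the entropy of the Dirichlet posterior-mean probabilities (add-α smoothing); it is NOT the approximation proposed in [9], and the function name and default α = 1 are assumptions made here for illustration.

```python
import math

def bayes_smoothed_entropy(counts, alpha=1.0):
    """Entropy of the Dirichlet posterior-mean probabilities.

    A simple Bayesian-flavored approximation (an illustrative stand-in,
    not the estimator of [9]): each count n_i receives a pseudo-count
    alpha, so rare or unseen symbols are never assigned zero probability.
    """
    total = sum(counts) + alpha * len(counts)
    probs = [(c + alpha) / total for c in counts]
    return -sum(p * math.log(p) for p in probs)

# Skewed counts: smoothing keeps the estimate strictly positive even
# when one symbol was never observed.
print(bayes_smoothed_entropy([100, 0]))
```

The cost is one pass over the count vector, which matches the motivation in the quoted passage: a node only needs to maintain per-symbol counts, not the full posterior.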
“…The M-ary entropy function H for a discrete random variable X which takes M sorts of values {a_1, …, a_M} with occurrence probabilities r ≡ (r_1, …, r_M) is H(r) = −∑_{i=1}^{M} r_i log r_i. Thus, in conventional entropy estimation, the occurrence probabilities r of the values a are estimated by the maximal likelihood estimation method, i.e., ř = n/N, in which the counts n ≡ (n_1, …, n_M) of the respective values are taken over the observed sample set of size N. However, unfortunately, the conventional entropy estimator is not optimum in the meaning of least square error [9].…”
Section: Approximate Entropy Estimator For a Discrete Random Variablementioning
confidence: 99%
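To make the contrast with ř = n/N concrete, one fully Bayesian alternative is the posterior-mean entropy under a uniform Dirichlet(1, …, 1) prior, which has a closed form in harmonic numbers (the Wolpert–Wolf result). This is offered as a hedged illustration of what a Bayesian entropy estimate looks like, not necessarily the approximation of [9]; the function names are assumptions.

```python
def harmonic(m):
    """H_m = 1 + 1/2 + ... + 1/m (H_0 = 0)."""
    return sum(1.0 / k for k in range(1, m + 1))

def bayes_entropy_uniform_prior(counts):
    """Posterior-mean entropy (nats) under a Dirichlet(1,...,1) prior.

    Closed form for integer counts n_i with N = sum(n_i), M = len(counts):
        E[H | n] = H_{N+M} - sum_i ((n_i + 1)/(N + M)) * H_{n_i + 1}
    (Wolpert-Wolf; an illustrative Bayesian estimator, not that of [9]).
    """
    total = sum(counts) + len(counts)
    return harmonic(total) - sum(
        (c + 1) / total * harmonic(c + 1) for c in counts
    )

# With no observations at all, the estimate is the prior-mean entropy.
print(bayes_entropy_uniform_prior([0, 0]))  # 0.5 nats for M = 2
```

Unlike the plug-in estimator, this estimate is well defined even for empty or sparse count vectors, which is precisely the regime where ř = n/N performs worst in the least-square-error sense discussed above.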