2004
DOI: 10.1002/ecjc.10163
An entropy estimator improving mean squared error

Abstract: Entropy estimation for a memoryless information source, when the occurrence probability of each source symbol is unknown, has mainly been performed as follows: first, the occurrence probabilities of the source symbols are estimated; then the entropy is estimated by substituting these estimated probabilities into the entropy function. Such an entropy estimate is not optimal in the least-squares sense; it incurs a large error, particularly for small sample …
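The plug-in procedure the abstract describes, and the small-sample error it mentions, can be sketched as follows (a minimal illustration, not the paper's proposed estimator; the symbol alphabet and sample sizes here are invented for demonstration):

```python
import math
from itertools import product
from collections import Counter

def plugin_entropy(samples):
    """Plug-in entropy estimate in bits: estimate each symbol's
    probability by its relative frequency in the sample, then
    substitute into the entropy function H = -sum p * log2(p)."""
    n = len(samples)
    counts = Counter(samples)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# The plug-in estimate is biased low for small samples: for a fair
# coin the true entropy is 1 bit, but averaging the estimate over
# all 2**4 equally likely length-4 outcome sequences gives less.
n = 4
avg = sum(plugin_entropy(seq) for seq in product("HT", repeat=n)) / 2**n
print(avg)  # about 0.78 bits, strictly below the true value 1.0
```

The gap between the averaged estimate and the true entropy is the kind of small-sample error the abstract refers to; the paper's contribution is an estimator that reduces this mean squared error.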

Cited by 1 publication (1 citation statement)
References 8 publications
“…Such a Bayesian entropy estimator has already been proposed for a random variable that takes only two sorts of values [6], [7]. However, a Bayesian entropy estimator for a random variable that takes many sorts of values generally requires a large calculation cost.…”
Section: Introduction
Confidence: 98%