2001
DOI: 10.1109/18.930936

Binomial and Poisson distributions as maximum entropy distributions

Abstract: The binomial and the Poisson distributions are shown to be maximum entropy distributions of suitably defined sets. Poisson's law is considered as a case of entropy maximization, and convergence in information divergence is also established.
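Both claims of the abstract lend themselves to a quick numerical check. The sketch below is illustrative Python of my own, not code from the paper, and the helper names are hypothetical: it builds the Bin(n, λ/n) pmf by convolution and confirms that its information divergence from Po(λ) shrinks as n grows with λ fixed.

```python
import numpy as np

def binomial_pmf(n, p):
    """Pmf of Bin(n, p) on {0, ..., n}, built by convolving n Bernoulli pmfs."""
    pmf = np.array([1.0])
    for _ in range(n):
        pmf = np.convolve(pmf, [1.0 - p, p])
    return pmf

def poisson_pmf(lam, kmax):
    """Pmf of Po(lam) on {0, ..., kmax} via the stable recursion q_k = q_{k-1} * lam / k."""
    q = np.empty(kmax + 1)
    q[0] = np.exp(-lam)
    for k in range(1, kmax + 1):
        q[k] = q[k - 1] * lam / k
    return q

def information_divergence(p, q):
    """D(p || q) = sum_k p_k log(p_k / q_k), skipping terms of negligible mass."""
    mask = (p > 0) & (q > 0)  # guards against floating-point underflow in the tails
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

lam = 2.0
for n in (10, 100, 1000):
    p = binomial_pmf(n, lam / n)
    q = poisson_pmf(lam, n)  # Poisson pmf restricted to the support of Bin(n, p)
    print(n, information_divergence(p, q))  # shrinks roughly like lam**2 / (4 * n**2)
```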

Cited by 115 publications (92 citation statements: 3 supporting, 89 mentioning, 0 contrasting; published 2003–2022). References 13 publications.
“…Yasaee for their helpful comments. We are specifically indebted to the two anonymous reviewers and the associate editor for their helpful comments; in fact, Dr. Gerhard Kramer brought to our attention the similarity of our upper bound for the noiseless case to that of the T-codes developed by Chang and Weldon [24], as well as the proof, based on the work of Shepp and Olkin [27] and the two corresponding papers [28]–[29], that our conjectured upper bound is a true upper bound for the noiseless case (Section 4).…”
Section: Acknowledgement (mentioning)
confidence: 70%
“…Put f(k) = log Pr(Po(λ_in) + Y = k). All Bernoulli sums are log-concave [12], and therefore the same property holds for generalized Bernoulli sums. According to [14, Theorem 6.1], it is sufficient to prove that E[f(X)] is minimal for the Poisson distribution, but this follows from Lemma 3.…”
Section: Proof (Rewrite the Mutual Information As) (mentioning)
confidence: 82%
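The log-concavity step quoted above can be spot-checked numerically. This is a hedged sketch of mine, not code from the cited works: a pmf (p_k) is log-concave iff p_k² ≥ p_{k−1} p_{k+1} for all interior k, and the test below applies that criterion to sums of independent, non-identical Bernoulli variables.

```python
import numpy as np

rng = np.random.default_rng(0)

def bernoulli_sum_pmf(ps):
    """Pmf of X_1 + ... + X_n for independent X_i ~ Bernoulli(ps[i])."""
    pmf = np.array([1.0])
    for p in ps:
        pmf = np.convolve(pmf, [1.0 - p, p])
    return pmf

def is_log_concave(pmf, tol=1e-12):
    """True iff p_k^2 >= p_{k-1} * p_{k+1} for all interior k (up to rounding)."""
    p = np.asarray(pmf)
    return bool(np.all(p[1:-1] ** 2 >= p[:-2] * p[2:] - tol))

# Every Bernoulli sum should pass, matching the log-concavity claim above.
for _ in range(1000):
    ps = rng.uniform(0.0, 1.0, size=rng.integers(2, 12))
    assert is_log_concave(bernoulli_sum_pmf(ps))
print("log-concavity held for 1000 random Bernoulli sums")
```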
“…, n}. Under the constraint that ∑_{i=1}^{n} E[X_i] ≤ λ, it follows from the maximal entropy result in [11], [14] and [25] that the entropy of Y is maximized when the n independent inputs are i.i.d. with mean p = λ/n, and consequently the channel output Y is binomially distributed with Y ∼ Binom(n, λ/n).…”
Section: IEEE International Symposium on Information Theory (mentioning)
confidence: 99%
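As a rough check of this maximal-entropy statement, the following sketch (my own assumptions; not code from the cited papers) fixes n and the total mean λ, draws random mean vectors summing to λ, and verifies that none of the resulting Bernoulli sums beats the entropy of the i.i.d. case Y ∼ Binom(n, λ/n).

```python
import numpy as np

rng = np.random.default_rng(1)

def bernoulli_sum_pmf(ps):
    """Pmf of a sum of independent Bernoulli(ps[i]) variables, via convolution."""
    pmf = np.array([1.0])
    for p in ps:
        pmf = np.convolve(pmf, [1.0 - p, p])
    return pmf

def entropy(pmf):
    """Shannon entropy in nats, ignoring zero-probability outcomes."""
    p = pmf[pmf > 0]
    return float(-np.sum(p * np.log(p)))

n, lam = 10, 1.0
h_iid = entropy(bernoulli_sum_pmf([lam / n] * n))  # Y ~ Binom(n, lam/n)

# Random mean vectors ps with ps_i in [0, 1] and sum(ps) == lam:
# by the maximal entropy result, none should exceed the i.i.d. entropy.
for _ in range(1000):
    ps = rng.dirichlet(np.ones(n)) * lam
    assert entropy(bernoulli_sum_pmf(ps)) <= h_iid + 1e-9
print(f"Binom({n}, {lam / n}) attained the maximal entropy: {h_iid:.4f} nats")
```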