DOI: 10.1109/itw.2004.1405277
Entropy, compound Poisson approximation, log-Sobolev inequalities and measure concentration

Abstract: The problem of approximating the distribution of a sum S_n = ∑_{i=1}^{n} Y_i of n discrete random variables Y_i by a Poisson or a compound Poisson distribution arises naturally in many classical and current applications, such as statistical genetics, dynamical systems, the recurrence properties of Markov processes and reliability theory. Using information-theoretic ideas and techniques, we derive a family of new bounds for compound Poisson approximation…
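For orientation, the approximating distribution referred to in the abstract, the compound Poisson law, can be recalled as follows; the notation CP(λ, Q) and the construction below are standard but supplied by us, not quoted from the truncated abstract.

    % Compound Poisson law CP(lambda, Q): draw a Poisson number of summands,
    % then add i.i.d. jumps distributed according to Q.
    %   M ~ Poisson(lambda),   U_1, U_2, ... i.i.d. ~ Q, independent of M,
    %   Z = U_1 + ... + U_M    (with Z = 0 on the event {M = 0}).
    \[
      \Pr[Z = k] \;=\; \sum_{m \ge 0} e^{-\lambda}\,\frac{\lambda^{m}}{m!}\, Q^{*m}(k),
      \qquad k = 0, 1, 2, \ldots,
    \]
    % where Q^{*m} is the m-fold convolution of Q and Q^{*0} = \delta_0.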

Help me understand this report

Search citation statements

Order By: Relevance

Paper Sections

Select...
4
1

Citation Types

0
4
0

Publication Types

Select...
4
2

Relationship

0
6

Authors

Journals

Cited by 6 publications (5 citation statements) · References 14 publications
“…Letting M have a Poisson distribution in (1) yields the special case of the compound Poisson, which plays an important role in limit theorems and approximation bounds for discrete random variables; see, for example, [2], [3]. Recently, Kontoyiannis and Madiman [18], Madiman et al [20], and Johnson et al [13] have explored compound Poisson approximation and limit theorems using information theoretic ideas, extending the results of [17] and [12] for the Poisson (see also [8], [9], [32]). As a first step toward a compound Poisson limit theorem with the same appealing "entropy increasing to the maximum" interpretation as the central limit theorem ([4], [1], [19], [27]), we need to identify a suitable class of distributions among which the compound Poisson has maximum entropy ( [13]).…”
Section: Introduction and Main Results
confidence: 99%
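The excerpt above treats the compound Poisson both as a limit and as an approximation target for sums of discrete random variables. As a rough numerical illustration (not taken from the cited works; the Bernoulli-based summands, the jump law Q on {1, 2} and the parameter values are our own choices), the following Python sketch compares a sum of independent compound Bernoulli terms with the matching compound Poisson law via an empirical total variation distance.

    # Sketch: compound Poisson approximation of a sum of compound Bernoulli terms.
    # Assumed setup (ours, for illustration): Y_i = B_i * U_i with B_i ~ Bernoulli(p_i)
    # and U_i i.i.d. with law Q on {1, 2}; S_n = Y_1 + ... + Y_n is compared with
    # CP(lambda, Q), lambda = sum_i p_i, using Monte Carlo estimates of both pmfs.
    import numpy as np

    rng = np.random.default_rng(0)
    n, trials = 50, 100_000
    p = np.full(n, 0.04)                      # small success probabilities, lambda = 2.0
    q_support = np.array([1, 2])              # jump sizes
    q_probs = np.array([0.7, 0.3])            # jump distribution Q
    lam = p.sum()

    # Sum of compound Bernoulli terms: S = sum_i B_i * U_i
    B = rng.random((trials, n)) < p           # B_i ~ Bernoulli(p_i)
    U = rng.choice(q_support, size=(trials, n), p=q_probs)
    S = (B * U).sum(axis=1)

    # Matching compound Poisson: Z = U_1 + ... + U_M with M ~ Poisson(lam)
    M = rng.poisson(lam, size=trials)
    Z = np.array([rng.choice(q_support, size=m, p=q_probs).sum() for m in M], dtype=int)

    # Empirical total variation distance between the two histograms
    kmax = int(max(S.max(), Z.max()))
    pS = np.bincount(S, minlength=kmax + 1) / trials
    pZ = np.bincount(Z, minlength=kmax + 1) / trials
    print("estimated total variation distance:", 0.5 * np.abs(pS - pZ).sum())

With the small success probabilities chosen here, the two empirical distributions are close; increasing the p_i moves the sum away from the compound Poisson regime.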
“…Regarding Poincaré and logarithmic Sobolev inequalities, it is clear that a finite mixture of Gaussians will satisfy such inequalities since its log-density is a bounded perturbation of a uniformly concave function. Some aspects of Poisson mixtures are considered by Kontoyiannis and Madiman [30,31] in connection with compound Poisson processes and discrete modified logarithmic Sobolev inequalities.…”
Section: Introduction
confidence: 99%
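The "bounded perturbation" reasoning in the excerpt above is an instance of the standard Holley-Stroock perturbation principle; the statement below is our recollection of that general principle, not a formula taken from the citing paper.

    % Holley-Stroock perturbation of a logarithmic Sobolev inequality.
    % Suppose \mu satisfies  Ent_\mu(f^2) <= 2C \int |\nabla f|^2 \, d\mu  and
    % d\nu \propto e^{\psi}\, d\mu with \psi bounded. Then
    \[
      \operatorname{Ent}_{\nu}(f^{2}) \;\le\; 2C\, e^{\operatorname{osc}(\psi)} \int |\nabla f|^{2}\, d\nu,
      \qquad \operatorname{osc}(\psi) = \sup\psi - \inf\psi .
    \]
    % A density whose logarithm is a bounded perturbation of a uniformly concave
    % function fits this scheme, with \mu the unperturbed log-concave measure.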
“…Here again, the bound blows up when the minimum weight of the mixing law goes to 0. Some aspects of Poisson mixtures are considered by Kontoyiannis and Madiman [30,31] in connection with compound Poisson processes and discrete modified logarithmic Sobolev inequalities.…”
Section: Introduction
confidence: 99%
“…The so-called entropy method, introduced by Herbst [31] and developed by Marton [82] and Ledoux [74,75,76], has been one of the key approaches to proving concentration of measure inequalities [18,85], sometimes also in connection with ideas from optimal transport [16,98]. Poisson approximation [59,70,60,61] and compound Poisson approximation [72,7,67] have been extensively studied via an information-theoretic lens. The profound relationship between information theory and functional-analytic inequalities has a long history, dating back to the work of Shannon [90], Stam [92] and Blachman [15] on the entropy power inequality [36].…”
Section: Information In Probability
confidence: 99%
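To indicate how the entropy method mentioned in the excerpt produces concentration of measure, the display below is the usual endpoint of the Herbst argument, stated in our notation as a general recollection rather than a claim quoted from the surveyed references.

    % Herbst argument: a log-Sobolev inequality implies sub-Gaussian concentration.
    % If \mu satisfies  Ent_\mu(f^2) <= 2C \int |\nabla f|^2 \, d\mu,  then for every
    % 1-Lipschitz function F and every r > 0,
    \[
      \mu\bigl( F \ge \mathbb{E}_{\mu} F + r \bigr) \;\le\; \exp\!\left( -\frac{r^{2}}{2C} \right).
    \]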