2019
DOI: 10.1214/18-aop1261
Rényi divergence and the central limit theorem

Abstract: We explore properties of the χ² and, more generally, Rényi (Tsallis) distances to the normal law. In particular, we provide necessary and sufficient conditions for convergence to the normal law in the central limit theorem with respect to these distances. Moreover, we derive exact rates of convergence in these distances as the number of summands increases.

1991 Mathematics Subject Classification: Primary 60E.
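To make the quantities in the abstract concrete: the Rényi divergence of order α between two Gaussians has a simple closed form, and the order α = 2 recovers the χ² distance via χ² = e^{D₂} − 1. The sketch below is our own illustration, not code from the paper; the function name and the explicit KL limit at α = 1 are choices made here.

```python
import math

def renyi_gauss(alpha, m1, s1, m2, s2):
    """Renyi divergence D_alpha(N(m1, s1^2) || N(m2, s2^2)).

    Closed form; finite only when the mixed variance
    v = (1 - alpha) * s1^2 + alpha * s2^2 is positive.
    """
    if alpha == 1.0:
        # KL limit: log(s2/s1) + (s1^2 + (m1-m2)^2) / (2 s2^2) - 1/2
        return math.log(s2 / s1) + (s1**2 + (m1 - m2)**2) / (2 * s2**2) - 0.5
    v = (1 - alpha) * s1**2 + alpha * s2**2
    if v <= 0:
        return math.inf  # divergence of order alpha is infinite here
    return (alpha * (m1 - m2)**2 / (2 * v)
            + math.log(math.sqrt(v) / (s1**(1 - alpha) * s2**alpha)) / (1 - alpha))

# Order 2 recovers the chi-squared distance: chi^2 = exp(D_2) - 1.
d2 = renyi_gauss(2.0, 0.0, 1.2, 0.0, 1.0)
chi2 = math.exp(d2) - 1
```

A useful sanity check is the known monotonicity of D_α in the order α, so for instance `renyi_gauss(0.5, ...)` never exceeds `renyi_gauss(2.0, ...)` for the same pair of Gaussians.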

Cited by 29 publications (32 citation statements)
References 54 publications (66 reference statements)
“…This "triangular inequality"-like result exploits the nonnegativity of X, Y and captures the intrinsic cancellations of the 2^j terms of a binomial expansion. If we do not have nonnegativity, the standard expansion yields a factor 2^j rather than 2 in the min{2, (e^{ε(∞)} − 1)^j} term (see, e.g., Proposition 3.2 of Bobkov et al (2019)).…”
Section: Privacy Amplification for RDP
confidence: 99%
“…Wyner [8] and Yu and Tan [16], [20], [21] respectively used the KL divergence and the Rényi divergence to measure the level of approximation in the distributed source synthesis problem; Hayashi [3], [4] used the KL divergence to study the channel resolvability problem and showed that the optimal decay exponents of the KL divergence and the total variation are upper-bounded by an expression involving the Rényi divergence. In probability theory, Barron [22] and Bobkov, Chistyakov and Götze [23] respectively used the KL divergence and the Rényi divergence to study the central limit theorem, i.e., to measure the discrepancy between the distribution of a sum of i.i.d. random variables and the normal distribution with the same mean and variance.…”
Section: Introduction
confidence: 99%
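The kind of CLT discrepancy described in the excerpt above can be checked numerically in a toy case: the sum of n Uniform(0,1) variables has the exact Irwin–Hall density, so the χ² distance between its standardized version and the standard normal can be evaluated by quadrature and watched shrink as n grows. This is our own illustration, not code from any of the cited works, and the function names are hypothetical.

```python
import math

def irwin_hall_pdf(x, n):
    """Exact density of the sum of n independent Uniform(0,1) variables."""
    if x < 0 or x > n:
        return 0.0
    s = 0.0
    for k in range(int(math.floor(x)) + 1):
        s += (-1)**k * math.comb(n, k) * (x - k)**(n - 1)
    return s / math.factorial(n - 1)

def chi2_to_normal(n, grid=40000):
    """chi^2 distance between the standardized sum of n Uniform(0,1)
    variables and N(0,1), via a midpoint Riemann sum."""
    mean, sd = n / 2.0, math.sqrt(n / 12.0)
    # Integrate over the support of the standardized sum, widened to
    # cover the Gaussian tails (the sum's density is 0 outside [-sqrt(3n), sqrt(3n)]).
    a = min(-10.0, -mean / sd)
    b = max(10.0, (n - mean) / sd)
    h = (b - a) / grid
    total = 0.0
    for i in range(grid):
        z = a + (i + 0.5) * h
        f = irwin_hall_pdf(mean + sd * z, n) * sd   # density of the standardized sum
        phi = math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)
        total += (f - phi)**2 / phi * h
    return total

for n in (2, 4, 8):
    print(n, chi2_to_normal(n))
```

Because the uniform distribution is symmetric (zero third cumulant), the decay here is driven by the fourth cumulant and is noticeably faster than the generic first-order rate; the printed values shrink quickly with n.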
“…Additional assumptions are thus necessary. Nevertheless, for all 0 < p ≤ q, D_p(μ‖ν) ≤ D_q(μ‖ν), and similarly for T_p (see [10]). Moreover, one clearly has…”
Section: Introduction
confidence: 88%