2013
DOI: 10.1214/12-aop780
Rate of convergence and Edgeworth-type expansion in the entropic central limit theorem

Abstract: An Edgeworth-type expansion is established for the entropy distance to the class of normal distributions of sums of i.i.d. random variables or vectors, satisfying minimal moment conditions.
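To make the abstract concrete, the following is a schematic rendering of the quantities involved. The notation is assumed here rather than quoted from the paper, and the exact moment assumptions, coefficients and remainder terms (including logarithmic factors) are as stated there; $D$ denotes relative entropy with respect to the standard normal law.

% Schematic form of the entropic Edgeworth-type expansion (notation assumed).
% X_1, X_2, ... are i.i.d. with E X_1 = 0, Var X_1 = 1 and E|X_1|^s < \infty.
\[
  Z_n = \frac{X_1 + \cdots + X_n}{\sqrt{n}}, \qquad
  D(Z_n) = \int p_{Z_n}(x)\,\log\frac{p_{Z_n}(x)}{\varphi(x)}\,dx,
\]
\[
  D(Z_n) \;=\; \frac{c_1}{n} + \frac{c_2}{n^2} + \cdots
  + \frac{c_{[(s-2)/2]}}{n^{[(s-2)/2]}}
  + o\!\bigl(n^{-(s-2)/2}\bigr) \quad \text{(up to logarithmic factors)},
\]
where $\varphi$ is the standard normal density and the coefficients $c_j$ are polynomials in the cumulants of $X_1$.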

Cited by 43 publications (43 citation statements)
References 23 publications
“…These assumptions can be found in Bobkov et al. (). We remark that we are focusing on the fixed dimension setting.…”
Section: Statistical Inference Via Langevin Diffusion (mentioning)
confidence: 97%
“…The proof is based on the entropic CLT (Barron, 1986; Bobkov et al., 2013, 2014). The classic CLT based on convergence in distribution is too weak for our purpose: we need to translate the non-asymptotic bounds at each step to the whole stochastic process.…”
Section: Statistical Inference Via Langevin Diffusion (mentioning)
confidence: 99%
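For context, the entropic CLT referred to in this excerpt (Barron, 1986) can be stated schematically as follows, in the same assumed notation as above; it is strictly stronger than the classical CLT, which is the point made by the citing authors.

% Entropic CLT (Barron, 1986), schematic statement in assumed notation:
% if E X_1 = 0, Var X_1 = 1 and D(Z_{n_0}) < \infty for some n_0, then
\[
  D(Z_n) \;\longrightarrow\; 0 \qquad (n \to \infty).
\]
% By Pinsker's inequality, 2 \sup_A |P(Z_n \in A) - \Phi(A)|^2 \le D(Z_n),
% so convergence in relative entropy implies convergence in total variation,
% hence convergence in distribution.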
“…This representation should be compared to the one obtained in Proposition 2.4 (iii) of [25]. At the core of its proof stands the Bismut-type representation of $\partial^{\sigma}_{x}\bigl(P^{(\alpha,\lambda)}_{\tau}(f)\bigr)$ as well as the intertwining relation (9). We assume that the random variable X has a density $f_X$ such that $f = f_X/\gamma_{\alpha,\lambda}$ is smooth enough for the different analytical arguments to hold.…”
Section: A New HSI-type Inequality (mentioning)
confidence: 98%
“…More recent work of Bobkov, Chistyakov and Götze [19,20] used properties of characteristic functions to remove the assumption of finite Poincaré constant, and even extended this theory to include convergence to other stable laws [18,21]. This is a very useful result in many cases, and gives us well-known facts such as that entropy is maximised under a variance constraint by the Gaussian density.…”
Section: [Normal Approximation] (mentioning)
confidence: 99%
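The maximum-entropy fact mentioned in this excerpt is the standard one below, stated here for a real random variable with a density; the notation is assumed, not quoted from the citing paper.

% Gaussian maximizes differential entropy under a variance constraint:
% for X with density p and Var X = \sigma^2,
\[
  h(X) \;=\; -\int p(x)\,\log p(x)\,dx \;\le\; \tfrac12 \log\bigl(2\pi e\,\sigma^2\bigr),
\]
% with equality if and only if X is Gaussian; equivalently, matching mean and variance,
% D\bigl(X \,\|\, N(\mathbb{E}X,\sigma^2)\bigr) = h\bigl(N(\mathbb{E}X,\sigma^2)\bigr) - h(X) \ge 0.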