2019
DOI: 10.1515/advgeom-2018-0030

Variance estimates and almost Euclidean structure

Abstract: We introduce and initiate the study of new parameters associated with any norm and any log-concave measure on $\mathbb{R}^n$, which provide sharp distributional inequalities. In the Gaussian context this investigation sheds light on the importance of the statistical measures of dispersion of the norm in connection with the local structure of the ambient space. As a byproduct of our study, we provide a short proof of Dvoretzky's theorem which not only supports the aforementioned significance but also complements the clas…

Cited by 16 publications (19 citation statements)
References 55 publications (63 reference statements)
“…we get $M \ge \xi$ for all $p \ge 1$. In the range $1 \le p \le \frac{2\log n}{\log(2e)}$, the assertion follows trivially from (20) and the estimate $M \ge \xi$. For $\frac{2\log n}{\log(2e)} \le p \le \xi^2$, we have $2p \ge n^{2/p}\, p/e = M^2$, and since $n^{2/p} > e$, we get $M^2 > p$. In the interval $\xi^2 < p$ the statement follows from the definition of $M$. Lemma 4.6.…”
Section: Upper Bounds for the Variance
confidence: 76%
“…In particular, the Dvoretzky–Rogers lemma implies that for any norm $\|\cdot\|$ whose unit ball is in John's position, a random $c\varepsilon^2 \frac{\log n}{\log(1/\varepsilon)}$-dimensional subspace is $(1+\varepsilon)$-spherical with large probability. We refer to the monographs and surveys [14,22,26,1] for more information, and to the papers [21,18,19,29,20] for some recent developments of the subject. In this text we leave out any discussion of the existential Dvoretzky theorem, which is concerned with finding at least one large almost Euclidean subspace (the best known general result in this direction is due to Schechtman [24]), as well as of the isomorphic Dvoretzky theorem, which deals with the regime where the distortion $\varepsilon$ grows to infinity with $n$ (see, in particular, [15]).…”
Section: Introduction
confidence: 99%
“…This new type of concentration inequality (1.3) exploits the convexity properties of the Gaussian measure, as opposed to (1.1), which can be explained by isoperimetry. Corresponding estimates can therefore be proved for arbitrary log-concave measures; see [PV17b]. All of this suggests that the left and right distributional behaviors near the median should be treated separately.…”
Section: Introduction
confidence: 92%
“…On the parameter β. The following parameter, referred to as the normalized variance, was introduced in [PV18] (see also [PV17]) for the study of sharp Gaussian small deviation inequalities and small ball probabilities for norms. For any normed space $X = (\mathbb{R}^n, \|\cdot\|)$ we define $\beta(X) = \beta(B_X) = \frac{\operatorname{Var}(\|G\|)}{(\mathbb{E}\|G\|)^2}$, where $G \sim N(0, I_n)$.…”
Section: Further Remarks and Questions
confidence: 99%
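As a rough illustration (not from the paper or the quoted sources), the normalized variance $\beta(X) = \operatorname{Var}(\|G\|)/(\mathbb{E}\|G\|)^2$ lends itself to a Monte Carlo estimate. The sketch below assumes NumPy and compares $\beta$ for the $\ell_2$ and $\ell_\infty$ norms on $\mathbb{R}^n$; the function name `normalized_variance` is illustrative, not from any cited work.

```python
import numpy as np

def normalized_variance(norm, n, samples=20000, seed=0):
    """Monte Carlo estimate of beta = Var(||G||) / (E||G||)^2
    for G ~ N(0, I_n); `norm` maps each row of a sample matrix
    to its norm value."""
    rng = np.random.default_rng(seed)
    G = rng.standard_normal((samples, n))  # rows are i.i.d. N(0, I_n) samples
    vals = norm(G)
    return vals.var() / vals.mean() ** 2

n = 1000
# ell_2: Var(||G||_2) stays bounded while (E||G||_2)^2 ~ n, so beta ~ 1/n
beta_2 = normalized_variance(lambda G: np.linalg.norm(G, axis=1), n)
# ell_infty: E||G||_inf ~ sqrt(2 log n), so beta decays only logarithmically
beta_inf = normalized_variance(lambda G: np.abs(G).max(axis=1), n)
print(beta_2, beta_inf)
```

The estimates reflect the dichotomy quoted below: the $\ell_2$ normalized variance is polynomially small in $n$, while the $\ell_\infty$ one is only logarithmically small.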
“…It is easy to see that two equivalent norms have Dvoretzky's number of the same order and therefore, by (1.3), they exhibit the same large deviation estimate. However, one may find two norms on $\mathbb{R}^n$ which are 2-equivalent while the variance of one is polynomially small and the variance of the other only logarithmically small (with respect to the dimension); see for example [PVZ17], [PV17] and [LT17]. We should mention that when the norm under consideration is close to the $\ell_2$-norm, it automatically exhibits the optimal concentration in terms of $\varepsilon$ and $n$ (Section 5, §1).…”
Section: Introduction
confidence: 99%