2019
DOI: 10.1214/19-ejp280
A note on concentration for polynomials in the Ising model

Abstract: We present precise multilevel exponential concentration inequalities for polynomials in Ising models satisfying the Dobrushin condition. The estimates have the same form as the two-sided tail estimates for polynomials in Gaussian variables due to Latała. In particular, for quadratic forms we obtain a Hanson-Wright type inequality. We also prove concentration results for convex functions and estimates for nonnegative definite quadratic forms, analogous to those for quadratic forms in i.i.d. Rademacher variables, for more …
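For orientation, a bound "of Hanson-Wright type" refers to a two-regime tail estimate for quadratic forms. The classical statement below is for a vector $X$ with independent, mean-zero, subgaussian coordinates (it is not the Ising-model version proved in the paper, only the template it follows):

\[
\mathbb{P}\bigl(\lvert X^{\top} A X - \mathbb{E}\, X^{\top} A X\rvert \ge t\bigr)
\;\le\; 2\exp\!\left(-c\,\min\!\left(\frac{t^{2}}{K^{4}\,\lVert A\rVert_{\mathrm{HS}}^{2}},\; \frac{t}{K^{2}\,\lVert A\rVert_{\mathrm{op}}}\right)\right),
\]

where $K$ bounds the subgaussian norms of the coordinates, $\lVert A\rVert_{\mathrm{HS}}$ is the Hilbert-Schmidt norm, and $\lVert A\rVert_{\mathrm{op}}$ the operator norm. The two regimes — Gaussian tail for small $t$, exponential tail for large $t$ — are what "multilevel" concentration generalizes for polynomials of higher degree.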

Cited by 16 publications (38 citation statements); references 40 publications.
“…Remark 2.5. In the case that A = {A} our result implies the upper tail of the recent concentration inequality proved in Adamczak et al (2018a) (see Theorem 2.2 and Example 2.5). To show this fact (denoting σ̄ = σ − Eσ) we observe that…”
Section: Uniform Concentration Results in the Ising Model (supporting; confidence: 67%)
“…Furthermore, we apply our techniques to obtain a uniform concentration result similar to Theorem 1.1 in a particular case of non-independent components. We consider the Ising model under Dobrushin's condition, a setting that has been studied recently by Adamczak et al (2018a) and Götze et al (2018). The question we study was raised by Marton (2003) in a closely related scenario.…”
Section: Introduction (mentioning; confidence: 99%)
“…In particular, these works prove concentration bounds (which are qualitatively stronger than variance bounds) for multilinear functions of arbitrary degree d (rather than just bilinear functions, which are of degree d = 2). Further works sharpen these results both quantitatively (by narrowing the radius of concentration and weight of the tails) and qualitatively (with multilevel concentration results) [GSS18,AKPS18].…”
Section: Identity on Forests (mentioning; confidence: 68%)
“…In particular, these works prove concentration bounds (which are qualitatively stronger than variance bounds) for multilinear functions of arbitrary degree d (rather than just bilinear functions, which are of degree d = 2). Further works sharpen these results both quantitatively (by narrowing the radius of concentration and weight of the tails) and qualitatively (with multilevel concentration results) [GSS18,AKPS19]. Even more recently, a line of study has investigated supervised learning problems under limited dependency between the samples (rather than the usual i.i.d.…”
Section: Testing Problem (mentioning; confidence: 89%)