2014
DOI: 10.1239/jap/1421763330
Uniform Chernoff and Dvoretzky-Kiefer-Wolfowitz-Type Inequalities for Markov Chains and Related Processes

Abstract: We observe that the technique of Markov contraction can be used to establish measure concentration for a broad class of non-contracting chains. In particular, geometric ergodicity provides a simple and versatile framework. This leads to a short, elementary proof of a general concentration inequality for Markov and hidden Markov chains (HMM), which supersedes some of the known results and easily extends to other processes such as Markov trees. As applications, we give a Dvoretzky-Kiefer-Wolfowitz-type inequalit…
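The abstract's DKW-type claim can be probed empirically. Below is a minimal sketch, not the paper's construction: it simulates a two-state geometrically ergodic chain, measures the sup-deviation of the empirical CDF from the stationary one, and compares it with the classical i.i.d. DKW bound, which the Markov version of the inequality inflates by a mixing-dependent constant. All names and parameters here are illustrative.

```python
import math
import random

def simulate_chain(n, p_stay=0.9, seed=0):
    """Two-state Markov chain on {0, 1}: stay with prob p_stay, else flip."""
    rng = random.Random(seed)
    x, path = 0, []
    for _ in range(n):
        if rng.random() > p_stay:
            x = 1 - x
        path.append(x)
    return path

def sup_ecdf_deviation(samples):
    # The stationary distribution is uniform on {0, 1}, so the true CDF
    # is F(0) = 0.5, F(1) = 1.0; the sup-deviation reduces to |F_n(0) - 0.5|.
    f0 = samples.count(0) / len(samples)
    return abs(f0 - 0.5)

n = 20_000
dev = sup_ecdf_deviation(simulate_chain(n))

# Classical i.i.d. DKW bound at confidence 1 - delta; for a geometrically
# ergodic chain the paper's bound is looser by a mixing-dependent factor.
delta = 0.05
iid_bound = math.sqrt(math.log(2 / delta) / (2 * n))
print(f"sup deviation = {dev:.4f}, i.i.d. DKW bound = {iid_bound:.4f}")
```

Because consecutive samples are positively correlated (second eigenvalue 2·p_stay − 1 = 0.8 here), the observed deviation can exceed the i.i.d. bound; that inflation is exactly what the chain-dependent constants in the paper account for.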

Cited by 20 publications (33 citation statements). References 32 publications.
“…Note that a similar bound was obtained in Kontorovich and Weiss (2012). The main advantage of Proposition 2.19 is that the constants in the exponent of our inequality are proportional to the mixing time of the chain.…”
Section: Applications (supporting; confidence: 73%)
“…Now, taking g = I_G we will arrive at (6.6). In fact, in our circumstances we can obtain (6.6) directly from [21] and [20]. Now we conclude the proof of (6.5) and of the whole Lemma 4.1 (in the setup of Theorem 2.7) in the same way as in the previous section.…”
Section: Finite State Space Case (supporting; confidence: 74%)
“…where we have used the trivial bound E_π f^2 ≤ 1 to simplify the inequalities. Note that this yields an improvement over d^2 for j log d. Moreover, the bound (19) can itself be improved, since each f_j is orthogonal to all eigenfunctions other than 1 and g_j, so that the log d factors can all be removed by a more carefully argued form of Lemma 1. It thus follows directly from the bound (18) that if we draw N + T f_j 2 samples, we obtain the tail bound…”
Section: Example: Lazy Random Walk on C_{2d} (mentioning; confidence: 98%)
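The eigenstructure invoked in this excerpt can be checked numerically. A minimal sketch, assuming the standard lazy simple random walk on a cycle rather than the citing paper's exact setup: the lazy walk on C_n has eigenvalues (1 + cos(2πk/n))/2, so its spectral gap, which controls the constants in tail bounds of this kind, is (1 − cos(2π/n))/2 ≈ π²/n².

```python
import math

def lazy_cycle_gap(n):
    # Lazy simple random walk on the cycle C_n: stay with prob 1/2,
    # else step to a uniformly chosen neighbour.  Its eigenvalues are
    # (1 + cos(2*pi*k/n)) / 2 for k = 0, ..., n-1.
    eigs = sorted(((1 + math.cos(2 * math.pi * k / n)) / 2 for k in range(n)),
                  reverse=True)
    return 1 - eigs[1]  # spectral gap = 1 minus the second-largest eigenvalue

n = 100                        # cycle with 100 states, i.e. C_{2d} for d = 50
gap = lazy_cycle_gap(n)
approx = math.pi ** 2 / n ** 2  # leading-order asymptotic for the gap
print(gap, approx)
```

The O(1/n²) gap is why such walks need on the order of n² steps to mix, and why the sample-size requirements in the quoted tail bound scale with the chain size.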
“…The requirement that the chain start in equilibrium can be relaxed by adding a correction for the burn-in time [27]. Extensions of this and related bounds, including bounded-differences-type inequalities and generalizations to continuous Markov chains and non-Markov mixing processes, have also appeared in the literature (e.g., [19,29]). The concentration result has an alternative formulation in terms of the mixing time instead of the spectral gap [4].…”
Section: Related Work (mentioning; confidence: 99%)
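The alternative mixing-time formulation mentioned in this excerpt rests on the standard relation between relaxation time and mixing time for reversible lazy chains, t_mix(ε) ≤ (1/γ)·log(1/(ε·π_min)), where γ is the spectral gap and π_min the minimum stationary probability. A minimal sketch with illustrative parameters, not values from the paper:

```python
import math

def mixing_time_upper_bound(gap, pi_min, eps=0.25):
    # Standard bound for a reversible lazy chain with spectral gap `gap`
    # and minimum stationary probability `pi_min`:
    #     t_mix(eps) <= (1 / gap) * log(1 / (eps * pi_min)).
    return math.log(1 / (eps * pi_min)) / gap

# Example: a two-state symmetric chain with spectral gap 0.2.
t = mixing_time_upper_bound(gap=0.2, pi_min=0.5)
print(t)
```

This conversion is why concentration constants stated in terms of the spectral gap can be restated, up to logarithmic factors, in terms of the mixing time, as in [4].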