“…As in [19], we prove concentration bounds not only for Birkhoff sums, but for a more general class of separately Lipschitz (or separately Hölder) functions on [0, 1]^N; see Theorem 3.11 and Remark 3.3. Theorem 1.2 improves the moment bounds in Nicol, Pereira and Török [33] and Su [41], and implies the following bounds on large and moderate deviations: Corollary 2.1. In the notation of Theorem 1.2, for every p > 2,…”
Section: Discussion
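The route from moment bounds to the large and moderate deviation bounds mentioned in this excerpt is, at its core, Markov's inequality applied to the p-th moment; the following standard sketch (with S_n denoting the centered Birkhoff sum — our notation, not quoted from the paper) shows the step:

```latex
% Markov's inequality for the p-th moment of the centered
% Birkhoff sum S_n:
\mathbb{P}\bigl( |S_n| \ge n\varepsilon \bigr)
  \;\le\; \frac{\mathbb{E}\,|S_n|^{p}}{(n\varepsilon)^{p}} .
% A moment bound of the form \mathbb{E}|S_n|^{p} = O(n^{p/2})
% then yields the deviation estimate
% \mathbb{P}(|S_n| \ge n\varepsilon)
%   = O\bigl(\varepsilon^{-p}\, n^{-p/2}\bigr).
```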
“…Also, we found that for γ* ∈ (0, 1/2), Theorem 1.2 can be proved using memory loss with asymptotics O(n^{−1/γ*+1}) as in Theorem 1.1(a), and close to optimal results can be obtained with the slightly weaker bound O(n^{−1/γ*+1}(log n)^{1/γ*}) from [1], as is done in [33]. For γ* ∈ (1/2, 1) the situation is significantly more complicated.…”
Section: 3
“…for each p > max{1, 1/γ* − 1} from [33, Theorem 4.1]. We remove the logarithmic term, get a better power of ε when γ* ∈ (1/2, 1) and allow the observables v_n to depend on n.…”
Section: Discussion
“…Alternatively, one can use the moment bounds from [33] or [41], but these were not available when we started this project.…”
Section: 3
“…We study limit laws for nonstationary dynamical systems, a topic of very intense recent interest, see [2,3,4,5,9,13,14,15,16,20,22,23,29,33,34,38,40,41] and more.…”
We study nonstationary intermittent dynamical systems, such as compositions of a (deterministic) sequence of Pomeau-Manneville maps. We prove two main results: sharp bounds on memory loss, including the "unexpected" faster rate for a large class of measures, and sharp moment bounds for Birkhoff sums and, more generally, "separately Hölder" observables.
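As a concrete illustration of the setting, here is a minimal numerical sketch of composing a deterministic sequence of such maps and accumulating a Birkhoff sum. The map formula is the standard L-S-V/Pomeau-Manneville family; the function names and the particular parameter sequence are illustrative choices, not taken from the paper.

```python
def lsv(x, gamma):
    """Standard Liverani-Saussol-Vaienti map T_gamma on [0, 1]:
    expanding everywhere except a neutral fixed point at 0, where
    T_gamma(x) = x + 2**gamma * x**(1 + gamma)."""
    if x <= 0.5:
        return x * (1.0 + (2.0 * x) ** gamma)
    return 2.0 * x - 1.0


def birkhoff_sum(x0, gammas, obs):
    """Birkhoff sum of the observable `obs` along the orbit of x0
    under the nonstationary composition T_{g_n} o ... o T_{g_1}."""
    total, x = 0.0, x0
    for g in gammas:
        total += obs(x)
        x = lsv(x, g)
    return total


# A deterministic (here, alternating) parameter sequence, as an
# example of a nonstationary composition.
gammas = [0.1 + 0.3 * (k % 2) for k in range(10_000)]
s = birkhoff_sum(0.3, gammas, lambda x: x)
```

The observable here is simply x itself; the moment bounds in the text control how sums like `s` fluctuate around their mean as the orbit length grows.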
Liverani–Saussol–Vaienti (L–S–V) maps form a family of piecewise differentiable dynamical systems on [0, 1] depending on one parameter ω ∈ ℝ₊. These maps are everywhere expanding apart from a neutral fixed point. It is well known that, depending on the amount of expansion close to the neutral point, they have either an absolutely continuous invariant probability measure and polynomial decay of correlations (ω < 1), or a unique physical measure that is singular and concentrated at the neutral point (ω > 1). In this paper, we study the composition of L–S–V maps whose parameters are randomly sampled from a range in ℝ₊, and where these two contrasting behaviours are mixed. We show that if the parameters ω < 1 are sampled with positive probability, then the stationary measure of the random system is absolutely continuous; the annealed decay rate of correlations is close (or in some cases equal) to the fastest rate of decay among those of the sampled systems; and suitably rescaled Birkhoff averages converge to limit laws. In contrast to previous studies where ω ∈ [0, 1], we allow ω > 1 in our sampling distribution. We also show that one can obtain similar decay of correlation rates for ω ∈ ℝ₊, when sampling is done with respect to a family of smooth, heavy-tailed distributions.
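For reference, the L–S–V family discussed in this abstract is conventionally defined as follows (this is the standard normalization from the literature; the abstract itself does not display a formula):

```latex
% Standard L-S-V map with parameter \omega:
T_\omega(x) =
  \begin{cases}
    x \bigl( 1 + 2^{\omega} x^{\omega} \bigr), & 0 \le x \le 1/2, \\
    2x - 1,                                    & 1/2 < x \le 1,
  \end{cases}
  \qquad \omega \in \mathbb{R}_{+} .
```

Near the origin T_ω(x) = x + 2^ω x^{1+ω}, so the fixed point at 0 is neutral (T_ω′(0) = 1); larger ω means weaker expansion near 0, which is what drives the dichotomy between ω < 1 and ω > 1 described above.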