We establish both uniform and nonuniform error bounds of the Berry-Esseen
type in normal approximation under local dependence. These results are of an
order close to, if not equal to, the best possible, and they are more general or
sharper than many existing results in the literature. The proofs couple Stein's
method with the concentration inequality approach.

Comment: Published by the Institute of Mathematical Statistics
(http://www.imstat.org) in the Annals of Probability
(http://www.imstat.org/aop/) at http://dx.doi.org/10.1214/00911790400000045
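As a toy illustration of a Berry-Esseen-type uniform bound under local dependence (not one of the paper's examples), the sketch below simulates a 1-dependent sequence $X_i = \xi_i + \xi_{i+1}$ with i.i.d. Rademacher $\xi_i$ and estimates the sup-norm distance between the distribution of the standardized sum and the standard normal; the error is expected to shrink as $n$ grows, roughly at rate $n^{-1/2}$.

```python
import numpy as np
from math import erf, sqrt

rng = np.random.default_rng(2)

def sup_norm_error(n, reps=20000):
    """Monte Carlo estimate of sup_x |P(W <= x) - Phi(x)| for the
    standardized sum W of a 1-dependent sequence, a simple instance
    of local dependence."""
    xi = rng.choice([-1.0, 1.0], size=(reps, n + 1))
    x = xi[:, :-1] + xi[:, 1:]          # 1-dependent summands
    w = x.sum(axis=1)
    w = w / w.std()                     # standardize the sums
    ws = np.sort(w)
    ecdf = np.arange(1, reps + 1) / reps
    phi = np.array([0.5 * (1 + erf(v / sqrt(2))) for v in ws])
    return float(np.abs(ecdf - phi).max())

e50, e200 = sup_norm_error(50), sup_norm_error(200)
print(e50, e200)
```

The estimated sup-norm error includes Monte Carlo noise of order $1/\sqrt{\text{reps}}$, so small differences between the two values should not be over-interpreted.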
In statistical analyses the complexity of a chosen model is often related to the size of the available data. One important question is whether the asymptotic distribution of the parameter estimates, normally derived by taking the sample size to infinity for a fixed number of parameters, remains valid when the number of parameters in the model actually increases with the sample size. A number of authors have addressed this question for linear models: the component-wise asymptotic normality of the parameter estimate remains valid if the dimension of the parameter space grows more slowly than some root of the sample size. In this paper, we consider M-estimators of general parametric models. Our results apply not only to linear regression but also to other estimation problems such as multivariate location and generalized linear models. Examples are given to illustrate the applications in different settings.
Academic Press

AMS 1991 subject classifications: primary 62F12, 62J05; secondary 60F15, 62F35.
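The following sketch illustrates the component-wise asymptotic normality claim in the simplest M-estimation setting, least squares, with the dimension growing like $n^{1/3}$ (an assumed rate for illustration only, not the paper's condition). With non-Gaussian errors, the standardized first coordinate of the estimate should still be approximately standard normal.

```python
import numpy as np

rng = np.random.default_rng(1)

def standardized_first_coord(n, reps=1000):
    """Simulate the standardized first coordinate of the least-squares
    estimate in y = X beta + eps, with p growing like n^(1/3) and
    uniform (non-Gaussian) errors scaled to unit variance."""
    p = max(2, round(n ** (1 / 3)))
    a = np.sqrt(3.0)                     # Uniform(-a, a) has variance 1
    z = np.empty(reps)
    for r in range(reps):
        X = rng.normal(size=(n, p))
        y = rng.uniform(-a, a, size=n)   # true beta = 0
        bhat, *_ = np.linalg.lstsq(X, y, rcond=None)
        # conditional sd of bhat_1 given X (unit error variance)
        sd = np.sqrt(np.linalg.inv(X.T @ X)[0, 0])
        z[r] = bhat[0] / sd
    return z

z = standardized_first_coord(400)
print(round(float(z.mean()), 2), round(float(z.std()), 2))
```

If the approximation holds, the simulated mean and standard deviation should be close to 0 and 1, respectively, up to Monte Carlo error of order $1/\sqrt{\text{reps}}$.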
We develop a testing procedure for distinguishing between a long-range
dependent time series and a weakly dependent time series with change-points in
the mean. In the simplest case, under the null hypothesis the time series is
weakly dependent with one change in mean at an unknown point, and under the
alternative it is long-range dependent. We compute the CUSUM statistic $T_n$,
which allows us to construct an estimator $\hat{k}$ of a change-point. We then
compute the statistic $T_{n,1}$ based on the observations up to time $\hat{k}$
and the statistic $T_{n,2}$ based on the observations after time $\hat{k}$. The
statistic $M_n=\max[T_{n,1},T_{n,2}]$ converges to a well-known distribution
under the null, but diverges to infinity if the observations exhibit long-range
dependence. The theory is illustrated by examples and an application to the
returns of the Dow Jones index.

Comment: Published at http://dx.doi.org/10.1214/009053606000000254 in the
Annals of Statistics (http://www.imstat.org/aos/) by the Institute of
Mathematical Statistics (http://www.imstat.org)
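The procedure above can be sketched in a few lines: compute the CUSUM statistic $T_n$, take the maximizing index as the change-point estimate $\hat{k}$, recompute the statistic on the two resulting segments, and form $M_n$. This is a minimal sketch: it uses the plain sample standard deviation where the paper's statistic would use a long-run variance estimator, and the function names are hypothetical.

```python
import numpy as np

def cusum_stat(x):
    """CUSUM statistic T_n = max_k |S_k - (k/n) S_n| / (sigma_hat sqrt(n))
    and the maximizing index k_hat (simplified: sigma_hat is the plain
    sample sd, not a long-run variance estimator)."""
    n = len(x)
    s = np.cumsum(x)
    k = np.arange(1, n + 1)
    dev = np.abs(s - (k / n) * s[-1])
    sigma = x.std(ddof=1)
    return float(dev.max() / (sigma * np.sqrt(n))), int(dev.argmax()) + 1

def m_statistic(x):
    """M_n = max(T_{n,1}, T_{n,2}), computed on the two segments
    obtained by splitting the sample at k_hat."""
    _, k_hat = cusum_stat(x)
    t1, _ = cusum_stat(x[:k_hat])
    t2, _ = cusum_stat(x[k_hat:])
    return max(t1, t2), k_hat

# Null-type sample: weakly dependent (here i.i.d.) noise with one
# mean change at t = 500.
rng = np.random.default_rng(0)
x = rng.normal(size=1000)
x[500:] += 2.0
m, k_hat = m_statistic(x)
print(k_hat, round(m, 3))
```

On a sample with a single pronounced mean shift, $\hat{k}$ lands near the true change-point and $M_n$ stays moderate, whereas a long-range dependent sample would drive $M_n$ upward as $n$ grows.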