A bias correction to the Akaike information criterion, AIC, is derived for regression and autoregressive time series models. The correction is of particular use when the sample size is small, or when the number of fitted parameters is a moderate to large fraction of the sample size. The corrected method, called AICC, is asymptotically efficient if the true model is infinite dimensional. Furthermore, when the true model is of finite dimension, AICC is found to provide better model order choices than any other asymptotically efficient method. Applications to nonstationary autoregressive and mixed autoregressive moving average time series models are also discussed.
Biometrika Trust is collaborating with JSTOR to digitize, preserve and extend access to Biometrika.
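The small-sample correction described above can be sketched numerically. A minimal illustration, assuming a Gaussian model and using the common equivalent form AICc = AIC + 2k(k+1)/(n−k−1), with k counting all estimated parameters including the error variance (the helper names and the toy AR fit below are hypothetical, not from the paper):

```python
import numpy as np

def aic_aicc(rss, n, k):
    """AIC and bias-corrected AICc for a Gaussian model with k estimated
    parameters (including the error variance) fitted to n observations;
    rss is the residual sum of squares.
    AICc = AIC + 2k(k+1)/(n-k-1), the usual small-sample correction."""
    aic = n * np.log(rss / n) + 2 * k
    aicc = aic + 2 * k * (k + 1) / (n - k - 1)
    return aic, aicc

def ar_rss(x, p):
    """Conditional least-squares fit of an AR(p) model; returns the
    residual sum of squares and the effective sample size."""
    n = len(x)
    X = np.column_stack([x[p - j - 1:n - j - 1] for j in range(p)])
    y = x[p:]
    beta, _, _, _ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return resid @ resid, len(y)

# Toy example: choose an AR order for a short series by minimizing AICc.
rng = np.random.default_rng(0)
x = np.zeros(30)
for t in range(1, 30):
    x[t] = 0.6 * x[t - 1] + rng.standard_normal()

scores = {}
for p in range(1, 5):
    rss, n_eff = ar_rss(x, p)
    _, scores[p] = aic_aicc(rss, n_eff, p + 1)  # +1 for the variance
best_p = min(scores, key=scores.get)
```

With n = 30 and up to 5 parameters, the correction term 2k(k+1)/(n−k−1) is far from negligible, which is exactly the small-sample regime the abstract targets.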
Many different methods have been proposed to construct nonparametric estimates of a smooth regression function, including local polynomial, (convolution) kernel and smoothing spline estimators. Each of these estimators uses a smoothing parameter to control the amount of smoothing performed on a given data set. In this paper an improved version of a criterion based on the Akaike information criterion (AIC), termed AICC, is derived and examined as a way to choose the smoothing parameter. Unlike plug-in methods, AICC can be used to choose smoothing parameters for any linear smoother, including local quadratic and smoothing spline estimators. The use of AICC avoids the large variability and tendency to undersmooth (compared with the actual minimizer of average squared error) seen when other `classical' approaches (such as generalized cross-validation or the AIC) are used to choose the smoothing parameter. Monte Carlo simulations demonstrate that the AICC-based smoothing parameter is competitive with a plug-in method (assuming that one exists) when the plug-in method works well but also performs well when the plug-in approach fails or is unavailable.
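For a linear smoother with hat matrix H (so that the fit is Hy), the improved criterion of this abstract takes the form AICC = log(σ̂²) + 1 + 2(tr(H)+1)/(n − tr(H) − 2). A sketch, assuming a Nadaraya–Watson smoother with a Gaussian kernel as the linear smoother and a simple bandwidth grid (function names and the example data are hypothetical):

```python
import numpy as np

def nw_hat_matrix(x, h):
    """Hat matrix of a Nadaraya-Watson (Gaussian kernel) smoother:
    row i holds the weights producing the fitted value at x[i]."""
    d = (x[:, None] - x[None, :]) / h
    w = np.exp(-0.5 * d ** 2)
    return w / w.sum(axis=1, keepdims=True)

def aicc_linear_smoother(y, H):
    """AICc for any linear smoother y_hat = H y:
    log(sigma2_hat) + 1 + 2*(tr(H)+1)/(n - tr(H) - 2)."""
    n = len(y)
    resid = y - H @ y
    sigma2 = resid @ resid / n
    tr = np.trace(H)
    return np.log(sigma2) + 1 + 2 * (tr + 1) / (n - tr - 2)

# Choose the bandwidth on a grid by minimizing AICc.
rng = np.random.default_rng(1)
x = np.sort(rng.uniform(0, 1, 80))
y = np.sin(2 * np.pi * x) + 0.3 * rng.standard_normal(80)
grid = np.linspace(0.02, 0.3, 30)
scores = [aicc_linear_smoother(y, nw_hat_matrix(x, h)) for h in grid]
h_best = grid[int(np.argmin(scores))]
```

Note that only the hat matrix enters the criterion, which is why the same code applies unchanged to local polynomial or smoothing spline fits: swap in their hat matrices.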
We establish some asymptotic properties of a log-periodogram regression estimator for the memory parameter of a long-memory time series. We consider the estimator originally proposed by Geweke and Porter-Hudak (The estimation and application of long memory time series models. J. Time Ser. Anal. 4 (1983), 221–37). In particular, we do not omit any of the low frequency periodogram ordinates from the regression. We derive expressions for the estimator's asymptotic bias, variance and mean squared error as functions of the number of periodogram ordinates, m, used in the regression. Consistency of the estimator is obtained as long as m → ∞ and n → ∞ with (m log m)/n → 0, where n is the sample size. Under these and the additional conditions assumed in this paper, the optimal m, minimizing the mean squared error, is of order O(n^{4/5}). We also establish the asymptotic normality of the estimator. In a simulation study, we assess the accuracy of our asymptotic theory on mean squared error for finite sample sizes. One finding is that the choice m = n^{1/2}, originally suggested by Geweke and Porter-Hudak (1983), can lead to performance which is markedly inferior to that of the optimal choice, even in reasonably small samples.
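The log-periodogram regression itself is short to state: regress log I(λ_j) on log(4 sin²(λ_j/2)) over the first m Fourier frequencies; the slope estimates −d. A minimal sketch of this estimator (the function name is hypothetical; the white-noise check, where the true d is 0, is only illustrative):

```python
import numpy as np

def gph_estimate(x, m):
    """Geweke-Porter-Hudak log-periodogram regression estimate of the
    long-memory parameter d, using the first m Fourier frequencies."""
    n = len(x)
    # Periodogram at Fourier frequencies 2*pi*j/n, j = 1..m.
    lam = 2 * np.pi * np.arange(1, m + 1) / n
    dft = np.fft.fft(x - x.mean())[1:m + 1]
    I = np.abs(dft) ** 2 / (2 * np.pi * n)
    # Regress log I(lam_j) on log(4 sin^2(lam_j/2)); the slope is -d.
    X = np.log(4 * np.sin(lam / 2) ** 2)
    slope = np.polyfit(X, np.log(I), 1)[0]
    return -slope

# Illustration on white noise (true d = 0), with m of order n^{4/5},
# the MSE-optimal rate derived above, rather than the classical n^{1/2}.
rng = np.random.default_rng(2)
x = rng.standard_normal(1000)
m = int(len(x) ** 0.8)
d_hat = gph_estimate(x, m)
```

The choice of m trades bias (large m picks up short-memory contamination away from frequency zero) against variance (small m uses few ordinates), which is what drives the O(n^{4/5}) result.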
Standard predictive regressions produce biased coefficient estimates in small samples when the regressors are Gaussian first-order autoregressive with errors that are correlated with the error series of the dependent variable. See Stambaugh (1999) for the single regressor model. This paper proposes a direct and convenient method to obtain reduced-bias estimators for single and multiple regressor models by employing an augmented regression, adding a proxy for the errors in the autoregressive model. We derive bias expressions for both the ordinary least-squares and our reduced-bias estimated coefficients. For the standard errors of the estimated predictive coefficients, we develop a heuristic estimator that performs well in simulations, for both the single predictor model and an important specification of the multiple predictor model. The effectiveness of our method is demonstrated by simulations and empirical estimates of common predictive models in finance. Our empirical results show that some of the predictive variables that were significant under ordinary least squares become insignificant under our estimation procedure.
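The augmented-regression idea can be made concrete: fit an AR(1) to the predictor, bias-correct its slope, and add the corrected AR residuals as an extra regressor in the predictive regression. A sketch under those assumptions, using the simple first-order correction ρ_c = ρ̂ + (1 + 3ρ̂)/n (the function name, the toy simulation, and the stationarity cap are illustrative, not the paper's exact procedure):

```python
import numpy as np

def reduced_bias_predictive(y, x):
    """Augmented-regression sketch: fit AR(1) to the predictor x,
    bias-correct the slope, form corrected residuals, and include them
    as an extra regressor when predicting y from lagged x. Returns the
    reduced-bias coefficient on the lagged predictor."""
    n = len(y)
    # AR(1) fit for the predictor: x_t = theta + rho * x_{t-1} + v_t.
    X1 = np.column_stack([np.ones(n - 1), x[:-1]])
    rho_hat = np.linalg.lstsq(X1, x[1:], rcond=None)[0][1]
    rho_c = min(rho_hat + (1 + 3 * rho_hat) / n, 0.9999)  # cap: stationarity
    theta_c = x[1:].mean() - rho_c * x[:-1].mean()
    v_c = x[1:] - theta_c - rho_c * x[:-1]  # corrected residual proxy
    # Augmented predictive regression: y_t on x_{t-1} and v_c_t.
    Z = np.column_stack([np.ones(n - 1), x[:-1], v_c])
    coef = np.linalg.lstsq(Z, y[1:], rcond=None)[0]
    return coef[1]

# Toy simulation: persistent predictor, errors correlated with the AR
# innovations, and no true predictability (beta = 0).
rng = np.random.default_rng(3)
n, rho, phi = 120, 0.9, -0.8
v = rng.standard_normal(n)
x = np.zeros(n)
for t in range(1, n):
    x[t] = rho * x[t - 1] + v[t]
y = np.zeros(n)
y[1:] = phi * v[1:] + 0.5 * rng.standard_normal(n - 1)
beta_rb = reduced_bias_predictive(y, x)
```

The added residual regressor absorbs the component of the dependent variable's error that is correlated with the AR(1) innovations, which is the source of the small-sample bias Stambaugh (1999) identifies.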