This article shows that the relationship between kurtosis, persistence of shocks to volatility, and first-order autocorrelation of squares is different in GARCH and ARSV models. This difference can explain why, when these models are fitted to the same series, the estimated persistence is usually higher in GARCH than in ARSV models, and why Gaussian ARSV models seem to be adequate, whereas GARCH models often require leptokurtic conditional distributions. We also show that introducing the asymmetric response of volatility to positive and negative returns does not change these conclusions. The results are illustrated with the analysis of daily financial returns.
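The relationship the abstract refers to can be illustrated with the standard closed-form moments of a Gaussian GARCH(1,1) model, where persistence is α + β. The sketch below (my own illustration, not code from the article) uses the well-known kurtosis and squared-return autocorrelation formulas for that model:

```python
# Kurtosis and first-order autocorrelation of squared returns for a
# Gaussian GARCH(1,1): sigma_t^2 = omega + alpha*eps_{t-1}^2 + beta*sigma_{t-1}^2.
# Persistence of shocks to volatility is alpha + beta.

def garch11_moments(alpha, beta):
    p = alpha + beta                          # persistence
    denom = 1.0 - p**2 - 2.0 * alpha**2       # fourth moment exists only if > 0
    if denom <= 0:
        raise ValueError("fourth moment does not exist for these parameters")
    kurtosis = 3.0 * (1.0 - p**2) / denom
    rho1 = alpha * (1.0 - alpha * beta - beta**2) / (1.0 - 2.0 * alpha * beta - beta**2)
    return kurtosis, rho1

# High persistence with a small ARCH coefficient: kurtosis barely above the
# Gaussian value 3, and low autocorrelation of squares.
k, r = garch11_moments(alpha=0.05, beta=0.90)
print(k, r)   # kurtosis ≈ 3.162, rho1 = 0.0725
```

This makes the trade-off concrete: for a GARCH model to match the high kurtosis seen in real returns at a given persistence, it needs either a larger α (raising the autocorrelation of squares) or a leptokurtic conditional distribution, consistent with the comparison drawn in the abstract.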
In this paper we propose bootstrap methods for constructing nonparametric prediction intervals for a general class of linear processes. Our approach uses the AR(∞)-sieve bootstrap procedure based on residual resampling from an autoregressive approximation to the given process. We present a Monte Carlo study comparing the finite sample properties of the sieve bootstrap with those of alternative methods. Finally, we illustrate the performance of the proposed method with a real data example.
MSC: 62M10; 62M20; 62G09
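The residual-resampling step of the sieve bootstrap can be sketched as follows. This is a minimal illustration, not the authors' implementation: it fits an AR(p) approximation by least squares, resamples the centred residuals, and regenerates bootstrap series from the fitted recursion.

```python
import numpy as np

def sieve_bootstrap(x, p, n_boot, rng=None):
    """AR(p)-sieve bootstrap: resample centred residuals of an AR fit."""
    rng = np.random.default_rng(rng)
    x = np.asarray(x, dtype=float)
    n = len(x)
    xc = x - x.mean()
    # Least-squares fit of the AR(p) approximation: column j holds lag j+1
    X = np.column_stack([xc[p - j - 1:n - j - 1] for j in range(p)])
    y = xc[p:]
    phi, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ phi
    resid = resid - resid.mean()              # centre before resampling
    boot = np.empty((n_boot, n))
    for b in range(n_boot):
        e = rng.choice(resid, size=n + p, replace=True)
        xb = np.zeros(n + p)
        for t in range(p, n + p):
            # xb[t-1], ..., xb[t-p] dotted with phi (lag 1 first)
            xb[t] = phi @ xb[t - p:t][::-1] + e[t]
        boot[b] = xb[p:] + x.mean()           # drop burn-in, restore mean
    return boot
```

In the AR(∞)-sieve version, the order p would be chosen to grow with the sample size (e.g., by AIC); prediction intervals are then read off as quantiles of the bootstrap predictive distribution.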
We propose a procedure for computing a fast approximation to regression estimates based on the minimization of a robust scale. The procedure can be applied with a large number of independent variables, where the usual algorithms require infeasible or extremely costly computation time. It can also be incorporated into any high-breakdown estimation method and may improve it at only a small additional computational cost. The procedure minimizes the robust scale over a set of tentative parameter vectors estimated by least squares after eliminating a set of possible outliers, which are obtained as follows. We represent each observation by the vector of changes in the least squares forecasts of the observation when each of the data points is deleted. We then obtain the sets of possible outliers as the extreme points in the principal components of these vectors, or as the set of points with large residuals. The good performance of the procedure allows identification of multiple outliers, avoiding masking effects. We investigate the procedure's efficiency for robust estimation and its power as an outlier detection tool in a large real dataset and in a simulation study.
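The "large residuals" route described in the abstract can be sketched in simplified form. This is not the authors' full procedure (it omits the delete-one forecast-change vectors and their principal components): it forms tentative candidate-outlier sets from the largest least-squares residuals, refits after deleting each set, and keeps the coefficient vector whose residuals have the smallest robust scale, here a normalized MAD.

```python
import numpy as np

def mad_scale(r):
    """Normalized median absolute deviation, a simple robust scale."""
    return 1.4826 * np.median(np.abs(r - np.median(r)))

def fast_robust_fit(X, y, fractions=(0.0, 0.1, 0.2, 0.3)):
    X = np.column_stack([np.ones(len(y)), X])      # add intercept column
    beta_ls, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta_ls
    order = np.argsort(-np.abs(resid))             # most suspicious first
    best_beta, best_scale = beta_ls, mad_scale(resid)
    for f in fractions[1:]:
        keep = np.sort(order[int(f * len(y)):])    # delete the top fraction f
        b, *_ = np.linalg.lstsq(X[keep], y[keep], rcond=None)
        s = mad_scale(y - X @ b)                   # robust scale on ALL points
        if s < best_scale:
            best_beta, best_scale = b, s
    return best_beta, best_scale
```

Because each tentative fit is an ordinary least-squares solve, the cost stays low even with many independent variables; the robust scale is only evaluated, never directly optimized, which is the source of the speed-up the abstract describes.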
This review article looks at a small part of the picture of the interrelationship between statistical theory and computational algorithms, especially the Gibbs sampler and the Accept-Reject algorithm. We pay particular attention to how the methodologies affect and complement each other.
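Of the two algorithms named above, the Accept-Reject algorithm is the simpler to state. A minimal sketch (my own illustration, not from the review): to sample from a target density f using a proposal g with envelope f(x) ≤ M·g(x), draw from g and accept with probability f(x)/(M·g(x)). Here the target is the Beta(2, 2) density and the proposal is Uniform(0, 1), so M = 1.5 is the maximum of f on [0, 1].

```python
import numpy as np

def accept_reject(f, sample_g, g, M, n, rng=None):
    """Draw n samples from density f via rejection from proposal g."""
    rng = np.random.default_rng(rng)
    out = []
    while len(out) < n:
        x = sample_g(rng)
        u = rng.uniform()
        if u <= f(x) / (M * g(x)):        # accept with probability f/(M*g)
            out.append(x)
    return np.array(out)

beta22 = lambda x: 6.0 * x * (1.0 - x)    # Beta(2,2) density, max 1.5 at x=0.5
draws = accept_reject(beta22,
                      sample_g=lambda r: r.uniform(),
                      g=lambda x: 1.0,
                      M=1.5, n=5000, rng=0)
print(draws.mean())                       # close to the Beta(2,2) mean, 0.5
```

The expected acceptance rate is 1/M, so a tight envelope constant matters; the interplay between such algorithmic constraints and statistical theory is the kind of complementarity the review examines.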