This paper presents a distribution-free multivariate Kolmogorov-Smirnov goodness-of-fit test. The test uses a statistic built from Rosenblatt's transformation, and an algorithm is developed to compute it in the bivariate case. An approximate test, which can be easily computed in any dimension, is also presented. The power of these multivariate tests is investigated in a simulation study.
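Under a fully specified null hypothesis, Rosenblatt's transformation maps the data to independent Uniform(0,1) variables, to which a Kolmogorov-Smirnov-type statistic can then be applied. As a minimal sketch for one concrete null (a standard bivariate normal with known correlation rho; this particular case is an illustrative assumption, not the paper's general algorithm):

```python
import math

def rosenblatt_bivariate_normal(x1, x2, rho):
    """Rosenblatt transform of an observation (x1, x2) from a standard
    bivariate normal with correlation rho into two independent
    Uniform(0,1) values:
        u1 = Phi(x1),
        u2 = Phi((x2 - rho*x1) / sqrt(1 - rho^2)),
    where Phi is the standard normal CDF (via math.erf).
    Illustrative sketch for this specific null distribution only."""
    phi = lambda z: 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))
    u1 = phi(x1)
    # Conditional distribution of X2 given X1 = x1 is
    # N(rho * x1, 1 - rho^2), so standardize before applying Phi.
    u2 = phi((x2 - rho * x1) / math.sqrt(1.0 - rho * rho))
    return u1, u2
```

A multivariate KS statistic can then be formed by comparing the empirical distribution of the transformed pairs with the independent-uniform distribution.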
A new portmanteau test for time series, more powerful than the tests of Ljung and Box (1978) and Monti (1994), is proposed. The test is based on the pth root of the determinant of the pth autocorrelation matrix. It is shown that this statistic can be interpreted as the geometric mean of the squared multiple correlation coefficients with m lagged values when m goes from 1 to p. It can also be interpreted as a geometric mean of the partial autocorrelation coefficients. The asymptotic distribution of the test statistic is obtained. This distribution is a linear combination of chi-squared distributions, and it is shown that it can be approximated by a gamma distribution. The power of the test is compared with that of the Ljung-Box and Monti tests, and it is shown that the proposed test can be up to 50% more powerful, depending on the model and sample size. The test is applied to the detection of nonlinearity by using the same matrix but with coefficients that are now the autocorrelations of the squared residuals. The new test is more powerful than the test of McLeod and Li (1983) for nonlinearity. An example is presented in which this test detects nonlinearity in the residuals of the sunspot series.
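As a hedged sketch of the statistic described above (the function names, and the assumption that the "pth autocorrelation matrix" is the (p+1)x(p+1) Toeplitz matrix of sample autocorrelations, are illustrative, not the authors' code):

```python
import numpy as np

def sample_acf(x, p):
    """Biased sample autocorrelations r_0 = 1, r_1, ..., r_p.
    The biased (divide-by-n) form keeps the Toeplitz matrix below
    positive semidefinite."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    denom = np.dot(x, x)
    return np.array([1.0] + [np.dot(x[k:], x[:-k]) / denom
                             for k in range(1, p + 1)])

def det_root_statistic(x, p):
    """pth root of the determinant of the Toeplitz matrix of sample
    autocorrelations up to lag p (illustrative sketch of the statistic
    described in the abstract). Values lie in [0, 1]: near 1 for white
    noise, near 0 for strongly autocorrelated series."""
    r = sample_acf(x, p)
    R = np.array([[r[abs(i - j)] for j in range(p + 1)]
                  for i in range(p + 1)])
    return np.linalg.det(R) ** (1.0 / p)
```

In use, x would be the series of model residuals; for the nonlinearity version described above, one would apply the same computation to the squared residuals.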
In this article, we present a simple multivariate outlier-detection procedure and a robust estimator for the covariance matrix, based on the use of information obtained from projections onto the directions that maximize and minimize the kurtosis coefficient of the projected data. The properties of this estimator (computational cost, bias) are analyzed and compared with those of other robust estimators described in the literature through simulation studies. The performance of the outlier-detection procedure is analyzed by applying it to a set of well-known examples.
KEY WORDS: Kurtosis; Linear projection; Multivariate statistics.
The detection of outliers in multivariate data is recognized to be an important and difficult problem in the physical, chemical, and engineering sciences. Whenever multiple measurements are obtained, there is always the possibility that changes in the measurement process will generate clusters of outliers. Most standard multivariate analysis techniques rely on the assumption of normality and require the use of estimates for both the location and scale parameters of the distribution. The presence of outliers may arbitrarily distort the values of these estimators and render meaningless the results of applying these techniques. According to Rocke and Woodruff (1996), the problem of the joint estimation of location and shape is one of the most difficult in robust statistics. Wilks (1963) proposed identifying sets of outliers of size j in normal multivariate data by checking the minimum values of the ratios |A(I)|/|A|, where |A(I)| is the internal scatter of a modified sample in which the set of observations I of size j has been deleted and |A| is the internal scatter of the complete sample. The internal scatter is proportional to the determinant of the covariance matrix, and the ratios are computed for all possible sets of size j. Wilks computed the distribution of the statistic for j equal to 1 and 2.
It is well known that this procedure is a likelihood ratio test and that for j = 1 the method is equivalent to selecting the observation with the largest Mahalanobis distance from the center of the data. Because a direct extension of this idea to sets of outliers larger than 2 or 3 is not practical, Gnanadesikan and Kettenring (1972) proposed reducing the multivariate detection problem to a set of univariate problems by looking at projections of the data onto some direction. They chose the direction of maximum variability of the data and, therefore, proposed obtaining the principal components of the data and searching for outliers along these directions. Although this method provides the correct solution when the outliers are located close to the directions of the principal components, it may fail to identify outliers in the general case. An alternative approach is to use robust location and scale estimators. Maronna (1976) studied affinely equivariant M estimators for covariance matrices, and Campbell (1980) proposed using the Mahalanobis distance computed using M estimators for the mean...
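The j = 1 case mentioned above can be sketched directly: flag the observation with the largest squared Mahalanobis distance from the sample mean, computed with the classical (non-robust) covariance estimate. The function name is illustrative, not the authors' code.

```python
import numpy as np

def mahalanobis_distances(X):
    """Squared Mahalanobis distance of each row of X (n x p) from the
    sample mean, using the classical sample covariance matrix.
    For j = 1, Wilks's likelihood-ratio procedure is equivalent to
    flagging the row with the largest such distance."""
    X = np.asarray(X, dtype=float)
    mu = X.mean(axis=0)
    S_inv = np.linalg.inv(np.cov(X, rowvar=False))
    diff = X - mu
    # d2[i] = (x_i - mu)^T S^{-1} (x_i - mu)
    return np.einsum('ij,jk,ik->i', diff, S_inv, diff)
```

Note that this classical version is exactly what the presence of outliers can break: a cluster of outliers inflates the mean and covariance and can mask itself, which is the motivation for the robust, projection-based estimator presented in the article.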