Both principal components analysis (PCA) and orthogonal regression deal with finding a p-dimensional linear manifold that minimizes a scale of the orthogonal distances of the m-dimensional data points to the manifold. The main conceptual difference is that in PCA, p is estimated from the data to attain a small proportion of unexplained variability, whereas in orthogonal regression p equals m − 1. The two main approaches to robust PCA are using the eigenvectors of a robust covariance matrix and searching for the projections that maximize or minimize a robust (univariate) dispersion measure. This article is more akin to the second approach, but rather than finding the components one by one, we directly undertake the problem of finding, for a given p, a p-dimensional linear manifold that minimizes a robust scale of the orthogonal distances of the data points to the manifold. The scale may be either a smooth M-scale or a "trimmed" scale. An iterative algorithm is developed and shown to converge to a local minimum. A strategy based on random search is used to approximate a global minimum. The procedure is shown to be faster than other high-breakdown-point competitors, especially for large m. The case p = m − 1 yields orthogonal regression. For PCA, a computationally efficient method to choose p is given. Comparisons based on both simulated and real data show that the proposed procedure is more robust than its competitors.

KEY WORDS: High breakdown point; M-scale.
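To make the fitting criterion concrete, the following Python sketch fits a p-dimensional affine manifold by iterative reweighting so that an M-scale of the orthogonal distances is locally decreased. It is an illustration of the general idea only, not the article's exact algorithm: the function name robust_pca_mscale, the bisquare rho, the fixed-point scale iteration, and the weighted-eigendecomposition update are all assumed choices; the article additionally uses random search over starting manifolds to approximate the global minimum, which is omitted here for brevity.

```python
import numpy as np

def robust_pca_mscale(X, p, delta=0.5, n_iter=50, tol=1e-8):
    """Illustrative sketch: fit a p-dimensional affine manifold to the rows
    of X (n x m) by iterative reweighting, locally decreasing a bisquare
    M-scale of the orthogonal point-to-manifold distances."""
    n, m = X.shape

    def rho(t):
        # Bisquare rho, bounded at 1 so large residuals have limited influence.
        t = np.minimum(np.abs(t), 1.0)
        return 1.0 - (1.0 - t ** 2) ** 3

    def m_scale(d):
        # Fixed-point iteration for the M-scale s solving mean(rho(d/s)) = delta.
        s = np.median(d) / 0.6745 + 1e-12
        for _ in range(30):
            s *= np.sqrt(np.mean(rho(d / s)) / delta)
        return s

    # Start from a median center and the classical PCA directions (an
    # assumed starting rule; the article uses random starts instead).
    mu = np.median(X, axis=0)
    _, _, Vt = np.linalg.svd(X - mu, full_matrices=False)
    B = Vt[:p].T                               # m x p orthonormal basis

    for _ in range(n_iter):
        R = (X - mu) - (X - mu) @ B @ B.T      # orthogonal residuals
        d = np.linalg.norm(R, axis=1)
        s = m_scale(d)
        t = d / s
        w = np.where(t < 1.0, (1.0 - t ** 2) ** 2, 0.0)  # bisquare weights
        w = np.maximum(w, 1e-12)               # guard against all-zero weights
        mu_new = np.average(X, axis=0, weights=w)
        Xc = X - mu_new
        C = (w[:, None] * Xc).T @ Xc           # weighted scatter matrix
        _, vecs = np.linalg.eigh(C)
        B_new = vecs[:, -p:]                   # top-p weighted directions
        done = np.linalg.norm(B_new @ B_new.T - B @ B.T) < tol
        mu, B = mu_new, B_new
        if done:
            break
    return mu, B, s

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 5))
    X[:20] += 8.0                              # 10% of rows shifted as outliers
    mu, B, s = robust_pca_mscale(X, p=2)
    print("robust M-scale of orthogonal distances:", round(s, 3))
```

Each iteration here is a weighted eigendecomposition, mirroring the usual iteratively reweighted structure of M-estimators; the choice delta = 0.5 corresponds to the 50% breakdown point referred to in the abstract.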