2017
DOI: 10.48550/arxiv.1702.05571
Preprint

Thresholding based Efficient Outlier Robust PCA

Abstract: We consider the problem of outlier robust PCA (OR-PCA), where the goal is to recover principal directions despite the presence of outlier data points. That is, given a data matrix M*, where a (1 − α) fraction of the points are noisy samples from a low-dimensional subspace while an α fraction of the points can be arbitrary outliers, the goal is to recover the subspace accurately. Existing results for OR-PCA have serious drawbacks: while some results are quite weak in the presence of noise, other results have runtim…
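The thresholding idea named in the title can be sketched as a simple alternating procedure: fit a rank-d subspace to the points currently treated as inliers, then threshold out the αn points with the largest residuals to that subspace. The NumPy sketch below is only an illustration of this idea under assumed conditions (well-separated outliers, known α and d); the function name, iteration count, and all magnitudes are assumptions, not the paper's exact algorithm.

```python
import numpy as np

def torp_sketch(M, d, alpha, n_iters=20):
    """Illustrative sketch of thresholding-based outlier-robust PCA.

    M     : (D, n) data matrix whose columns are the data points
    d     : target subspace dimension (assumed known)
    alpha : assumed fraction of outlier columns
    """
    D, n = M.shape
    n_out = int(alpha * n)
    keep = np.arange(n)                      # start by keeping all points
    U = None
    for _ in range(n_iters):
        # fit: top-d left singular vectors of the currently kept columns
        U = np.linalg.svd(M[:, keep], full_matrices=False)[0][:, :d]
        # residual of every point to the current subspace estimate
        resid = np.linalg.norm(M - U @ (U.T @ M), axis=0)
        # threshold: drop the n_out points with the largest residuals
        keep = np.argsort(resid)[: n - n_out]
    return U
```

On synthetic data whose inliers lie near a low-dimensional subspace and whose outliers are generic, the first thresholding pass typically removes the outliers, after which the SVD on the kept columns recovers the subspace up to the inlier noise level.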

Cited by 7 publications (12 citation statements)
References 6 publications
“…Closing the gap in robust subspace estimation. [74,28,75,19] study robust PCA under the Gaussian assumption. For the reasons explained in §2.2, the rate is sub-optimal in α in comparison to an information-theoretic lower bound with a multiplicative factor of (1 − Θ(α)).…”
Section: Discussion
confidence: 99%
“…Closing the gap in Outlier-Robust PCA (ORPCA). [74,28,75,19] study robust PCA under the assumption that each sample z_i = A x_i + v_i, where x_i, v_i are drawn from isotropic Gaussian distributions, and the goal is to learn the top-k eigenspace of AA^T. When n samples are observed, an α fraction of which are corrupted by an adversary, [74] introduces a filtering algorithm to find a subspace U achieving:…”
Section: Proof Sketch of the Adaptation to the Exponential Tail Setting
confidence: 99%
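The generative model quoted above (z_i = A x_i + v_i with isotropic Gaussian x_i, v_i, and an adversarially corrupted α fraction of the n samples) can be made concrete with a small simulation. The dimensions, noise level, seed, and the particular corruption used here are illustrative assumptions, not settings from the cited papers.

```python
import numpy as np

rng = np.random.default_rng(1)
D, k, n, alpha = 20, 3, 1000, 0.05     # assumed illustrative sizes

A = rng.standard_normal((D, k))          # signal directions
X = rng.standard_normal((k, n))          # isotropic Gaussian latent factors x_i
V = 0.1 * rng.standard_normal((D, n))    # isotropic Gaussian noise v_i
Z = A @ X + V                            # clean samples z_i as columns

# an adversary replaces an alpha fraction of the samples with arbitrary points
n_bad = int(alpha * n)
Z[:, :n_bad] = 10 * rng.standard_normal((D, n_bad))

# the target of ORPCA: the top-k eigenspace of A A^T,
# i.e. the left singular vectors of A
target = np.linalg.svd(A, full_matrices=False)[0]
```

On the clean portion of such data, ordinary PCA already recovers the target eigenspace; the difficulty the quoted works address is achieving this from the corrupted matrix Z without knowing which columns are adversarial.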
“…We now compare with the existing methods that are provably tolerant to outliers. The methods are covered by a recent review in (Lerman and Maunu, 2018, Table I), including Geodesic Gradient Descent (GGD), Fast Median Subspace (FMS), REAPER (Lerman et al, 2015a), Geometric Median Subspace (GMS), ℓ2,1-RPCA (Xu et al, 2010) (called Outlier Pursuit (OP) in (Lerman and Maunu, 2018, Table I)), the Tyler M-Estimator (TME) (Zhang, 2016), Thresholding-based Outlier Robust PCA (TORP) (Cherapanamjeri et al, 2017), and Coherence Pursuit (CoP) (Rahmani and Atia, 2016). However, we note that the comparison may not be entirely fair, since the results summarized in (Lerman and Maunu, 2018, Table I) are established for random Gaussian models in which the columns of O and X are drawn independently and uniformly at random from the distributions N(0, (1/D) I) and N(0, (1/d) SS^T), with S ∈ R^{D×d} an orthonormal basis of the inlier subspace S. Nevertheless, these two random models are closely related, since each column of O or X in the random Gaussian model is also concentrated around the sphere S^{D−1}, especially when d is large.…”
Section: Comparison With Existing Results
confidence: 99%
“…We note that the literature on this subject is vast and the above-mentioned related work is far from exhaustive; other methods include the Fast Median Subspace (FMS), the Geometric Median Subspace (GMS), the Tyler M-Estimator (TME) (Zhang, 2016), Thresholding-based Outlier Robust PCA (TORP) (Cherapanamjeri et al, 2017), expressing each data point as a sparse linear combination of other data points (Soltanolkotabi et al, 2014; You et al, 2017), online subspace learning (Balzano et al, 2010), etc. For many other related methods, we refer to a recent review article (Lerman and Maunu, 2018) that thoroughly summarizes this body of work on robust subspace recovery.…”
Section: Related Work
confidence: 99%
“…Recent works have attempted to develop simple non-iterative algorithms for robust PCA with the outlier model [5]. Other methods that aim at solving robust PCA through this model include [26], [27], and works based on thresholding such as [28], [25]. Most of the proposed algorithms are either iterative and complex, and/or require knowledge of either the outlier fraction or the dimension of the low-rank subspace, or have free parameters that need to be set according to the data statistics.…”
Section: Introduction
confidence: 99%