2016
DOI: 10.48550/arxiv.1606.07315
Preprint

Nearly-optimal Robust Matrix Completion

Abstract: In this paper, we consider the problem of Robust Matrix Completion (RMC) where the goal is to recover a low-rank matrix by observing a small number of its entries out of which a few can be arbitrarily corrupted. We propose a simple projected gradient descent method to estimate the low-rank matrix that alternately performs a projected gradient descent step and cleans up a few of the corrupted entries using hard-thresholding. Our algorithm solves RMC using nearly optimal number of observations as well as nearly …
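To make the recipe in the abstract concrete, below is a minimal NumPy sketch of that alternation: a gradient step on the observed entries, projection back to rank r via truncated SVD, and hard-thresholding of the largest residuals as suspected corruptions. The function name, step-size heuristic, and per-iteration corruption budget `n_corrupt` are illustrative assumptions, not the authors' exact algorithm or parameter choices.

```python
import numpy as np

def rmc_pgd_sketch(M_obs, mask, r, n_corrupt, n_iters=100, step=None):
    """Minimal sketch of alternating projected gradient descent with
    hard-thresholding for Robust Matrix Completion (RMC).

    M_obs     : observed matrix (zeros outside the sampled entries)
    mask      : boolean array, True where an entry was observed
    r         : target rank of the low-rank component
    n_corrupt : budget of observed entries treated as corruptions per iteration

    Illustrative only: it follows the high-level description in the abstract
    (gradient step -> rank-r projection -> hard-threshold suspected
    corruptions), not the paper's exact algorithm or step-size schedule.
    """
    p = mask.mean()                       # empirical sampling probability
    if step is None:
        step = 1.0 / p                    # heuristic rescaling of the sampled gradient (assumption)
    L = np.zeros_like(M_obs)              # low-rank estimate
    S = np.zeros_like(M_obs)              # sparse-corruption estimate
    for _ in range(n_iters):
        # Gradient step on the squared loss over observed, "cleaned" entries.
        residual = mask * (L + S - M_obs)
        L_new = L - step * residual
        # Project onto the set of rank-r matrices via truncated SVD.
        U, sig, Vt = np.linalg.svd(L_new, full_matrices=False)
        L = (U[:, :r] * sig[:r]) @ Vt[:r, :]
        # Hard-threshold: flag the largest observed residuals as corruptions.
        res = mask * (M_obs - L)
        S = np.zeros_like(S)
        if n_corrupt > 0:
            idx = np.argpartition(np.abs(res), -n_corrupt, axis=None)[-n_corrupt:]
            S.flat[idx] = res.flat[idx]
    return L, S
```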

Cited by 7 publications (23 citation statements)
References 8 publications
“…When considering the noiseless case, only the optimization error term exists. It is worth noting that our robustness guarantee required for the gradient descent phase matches the best-known results O(1/r) in Hsu et al (2011); Chen et al (2013); Cherapanamjeri et al (2016).…”
Section: Results For the Generic Model (supporting)
confidence: 61%
“…• The gradient descent phase of our proposed algorithm matches the best-known robustness guarantee O(1/r) (Hsu et al, 2011; Chen et al, 2013). Compared with existing robust PCA algorithms (Yi et al, 2016; Cherapanamjeri et al, 2016), our algorithm achieves improved computational complexity O(r^3 d log d log(1/ε)), while matching the optimal sample complexity O(r^2 d log d) for Burer-Monteiro factorization-based low-rank matrix recovery (Zheng and Lafferty, 2016) under the incoherence condition.…”
Section: Introduction (mentioning)
confidence: 68%
“…The remainder of this paper is organized as follows: in Section 3, we review the problem formulation in detail. We present the algorithm in Section 4, and the main theory in Section …. … robust low-rank matrix completion (Agarwal et al, 2012; Goldfarb and Qin, 2014; Klopp et al, 2014; Cherapanamjeri et al, 2016), robust tensor decomposition (Gu et al, 2014; Anandkumar et al, 2015). The key idea of these methods is to estimate the unknown low-rank matrix/tensor and the sparse corruption matrix/tensor simultaneously.…”
Section: Introductionmentioning
confidence: 99%
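One common way to make "estimate both simultaneously" concrete is the convex formulation from the robust PCA / robust completion literature shown below, where the nuclear norm promotes a low-rank L, the ℓ1 norm promotes a sparse corruption term S, P_Ω keeps only the observed entries, and λ trades off the two terms. This is a standard surrogate given as an illustration, not necessarily the exact objective solved in each cited paper.

```latex
\min_{L,\,S}\;\; \|L\|_{*} + \lambda \|S\|_{1}
\quad \text{subject to} \quad
\mathcal{P}_{\Omega}(L + S) = \mathcal{P}_{\Omega}(M)
```

Non-convex alternatives, such as the projected gradient / hard-thresholding scheme sketched after the abstract above, target the same decomposition but enforce the rank and sparsity constraints directly rather than through these norms.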