2019
DOI: 10.1093/imaiai/iaz009
Non-convex low-rank matrix recovery with arbitrary outliers via median-truncated gradient descent

Abstract: Recent work has demonstrated the effectiveness of gradient descent for directly recovering the factors of low-rank matrices from random linear measurements in a globally convergent manner when initialized properly. However, the performance of existing algorithms is highly sensitive in the presence of outliers that may take arbitrary values. In this paper, we propose a truncated gradient descent algorithm to improve the robustness against outliers, where the truncation is performed to rule out the contributions…

Cited by 34 publications (42 citation statements); references 42 publications.
“…in the robust phase retrieval problem (89), the spectral method may fail even in the presence of a single outlier, whose magnitude can be arbitrarily large and can thus perturb the leading eigenvector of Y. To mitigate this issue, a median-truncation scheme was proposed in [24,88], where…”
Section: Truncated Spectral Methods for Removing Sparse Outliers
confidence: 99%
“…There are two common purposes for enforcing a truncation step: (1) to remove samples whose associated design vectors are too coherent with the current iterate [20,86,87], in order to accelerate convergence and improve sample complexity; (2) to remove samples that may be adversarial outliers, in the hope of improving robustness of the algorithm [24,64,88].…”
Section: Truncated Gradient Descent
confidence: 99%
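The second truncation purpose above — dropping samples that look like adversarial outliers — can be illustrated on a toy robust linear regression problem: at each step, samples whose absolute residual exceeds a multiple of the median absolute residual are excluded from the gradient. This is a minimal sketch of the median-truncation idea, not the paper's algorithm for the low-rank setting; the threshold `tau`, step size, and iteration count are illustrative assumptions.

```python
import numpy as np

def median_truncated_gd(A, y, step=0.5, iters=500, tau=3.0):
    """Gradient descent for y ≈ A x where a fraction of the entries of y
    are arbitrary outliers. Each iteration keeps only the samples whose
    absolute residual is at most tau * median(|residuals|), so a few
    arbitrarily large outliers cannot bias the update."""
    m, n = A.shape
    x = np.zeros(n)
    for _ in range(iters):
        r = A @ x - y
        keep = np.abs(r) <= tau * np.median(np.abs(r))
        x -= (step / m) * (A[keep].T @ r[keep])
    return x

# Demo: 10% of the measurements are grossly corrupted.
rng = np.random.default_rng(1)
m, n = 400, 20
A = rng.standard_normal((m, n))
x_true = rng.standard_normal(n)
y = A @ x_true
bad = rng.choice(m, m // 10, replace=False)
y[bad] += 50.0 * rng.standard_normal(m // 10)   # arbitrary outliers
x_hat = median_truncated_gd(A, y)
err = np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true)
print(f"relative error: {err:.2e}")
```

The median is the key design choice: unlike the mean, it is insensitive to a small fraction of arbitrarily large residuals, so the threshold stays calibrated to the inliers even before the iterate is accurate.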
“…However, these methods typically require performing an SVD in each iteration to project onto the set of low-rank matrices. Recently, a median-truncated gradient descent method was proposed in [31] to tackle (1.2), where the gradient is modified to alleviate the effect of outliers. Median-truncated gradient descent is shown to have a local linear convergence rate [31], but this guarantee requires m ≳ nr log n measurements.…”
Section: Related
confidence: 99%
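The contrast drawn in the last quote — SVD-based projection versus descending directly on the factors — can be sketched for the symmetric PSD case M = U Uᵀ, where the rank constraint is enforced by the factorization itself and no SVD is ever computed. This is a minimal illustration under assumed noiseless symmetric Gaussian measurements and a small random initialization (the analyses cited above use a spectral initialization instead); the step size and iteration count are assumptions, not tuned constants.

```python
import numpy as np

def factored_gd(As, y, n, r, step=0.005, iters=3000, seed=0):
    """Gradient descent on the factor U for the loss
        f(U) = (1/2m) * sum_i (<A_i, U U^T> - y_i)^2,
    with symmetric sensing matrices A_i. The rank-r constraint is carried
    by the parameterization M = U U^T, so no per-iteration SVD is needed."""
    rng = np.random.default_rng(seed)
    U = 0.1 * rng.standard_normal((n, r))      # small random init (assumption)
    m = len(y)
    for _ in range(iters):
        res = np.einsum('kij,ij->k', As, U @ U.T) - y   # <A_i, UU^T> - y_i
        # For symmetric A_i, grad f(U) = (2/m) * sum_i res_i * A_i @ U.
        U -= (step * 2.0 / m) * np.einsum('k,kij->ij', res, As) @ U
    return U

# Demo: recover a rank-2 PSD matrix from m random symmetric measurements.
rng = np.random.default_rng(3)
n, r, m = 8, 2, 120
U_true = rng.standard_normal((n, r))
M_true = U_true @ U_true.T
B = rng.standard_normal((m, n, n))
As = (B + B.transpose(0, 2, 1)) / 2            # symmetrize sensing matrices
y = np.einsum('kij,ij->k', As, M_true)
U_hat = factored_gd(As, y, n, r)
err = np.linalg.norm(U_hat @ U_hat.T - M_true) / np.linalg.norm(M_true)
print(f"relative error: {err:.2e}")
```

The per-iteration cost here is a handful of matrix multiplies in the factor U, which is the efficiency argument for factored methods over projected ones; the median-truncated variant discussed above additionally screens the per-sample residuals before forming the gradient.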