2018
DOI: 10.1016/bs.hna.2018.09.001
Exploiting the Structure Effectively and Efficiently in Low-Rank Matrix Recovery

Abstract: Low-rank models arise in a wide range of applications, including machine learning, signal processing, computer algebra, computer vision, and imaging science. Low-rank matrix recovery is about reconstructing a low-rank matrix from incomplete measurements. In this survey we review recent developments in low-rank matrix recovery, focusing on three typical scenarios: matrix sensing, matrix completion and phase retrieval. An overview of effective and efficient approaches to the problem is given, including nuclea…
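The recovery problem described in the abstract can be made concrete with a small synthetic example. The following is a hypothetical sketch, not code from the survey: it runs gradient descent on a two-factor parameterization of a matrix-completion instance, one of the nonconvex approaches the survey discusses. All names, dimensions, and step sizes are illustrative.

```python
# Hypothetical sketch: low-rank matrix completion by gradient descent on the
# factorization X = L @ R.T.  Sizes, step size, and iteration count are
# illustrative choices, not parameters from the survey.
import numpy as np

rng = np.random.default_rng(0)
n, r = 20, 2
M = rng.standard_normal((n, r)) @ rng.standard_normal((r, n))  # rank-r ground truth
mask = rng.random((n, n)) < 0.6                                # ~60% entries observed

# Small random initialization of the two factors.
L = 0.1 * rng.standard_normal((n, r))
R = 0.1 * rng.standard_normal((n, r))
step = 0.01
for _ in range(3000):
    residual = mask * (L @ R.T - M)   # misfit on the observed entries only
    # Simultaneous gradient step on both factors.
    L, R = L - step * residual @ R, R - step * residual.T @ L

rel_err = np.linalg.norm(L @ R.T - M) / np.linalg.norm(M)
print(rel_err)
```

With enough observed entries relative to the degrees of freedom of a rank-r matrix, this simple scheme typically drives the reconstruction error down, which is the phenomenon the guarantees surveyed in the paper make precise.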

Cited by 15 publications (8 citation statements)
References 101 publications (196 reference statements)
“…For more general low rank matrix recovery, a variety of nonconvex algorithms have been developed and analyzed, including those based on matrix factorization [38,50] and those based on the embedded manifold of low rank matrices [45,44]. The reader can refer to the review paper [8] for more details. The geometric landscape of related loss functions for low rank matrix recovery has been investigated in [18,19,28,4,32].…”
Section: Introduction
confidence: 99%
“…Much progress on this topic has been made for low-rank matrix estimation (Keshavan et al, 2009; Boumal and Absil, 2011, 2015; Wei et al, 2016; Meyer et al, 2011; Mishra et al, 2014; Vandereycken, 2013; Huang and Hand, 2018; Cherian and Sra, 2016; Luo et al, 2020). See the recent surveys of this line of work by Cai and Wei (2018) and Uschmajew and Vandereycken (2020). Moreover, Riemannian manifold optimization methods have been applied to various problems in low-rank tensor estimation, such as tensor regression (Cai et al, 2020; Kressner et al, 2016), tensor completion (Rauhut et al, 2015; Kasai and Mishra, 2016; Dong et al, 2021; Kressner et al, 2014; Heidel and Schulz, 2018; Xia and Yuan, 2017; Steinlechner, 2016; Da Silva and Herrmann, 2015), and robust tensor PCA (Cai et al, 2021).…”
Section: Related Literature
confidence: 99%
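The Riemannian manifold methods mentioned in the statements above keep iterates on (or near) the set of rank-r matrices. A minimal sketch of that idea, assuming the simplest possible retraction (a Euclidean gradient step followed by truncated-SVD projection, as in singular value projection), is below; it is an illustration of the general pattern, not any specific cited algorithm, and all names and parameters are invented for the example.

```python
# Hypothetical sketch: a gradient step followed by projection back onto the
# rank-r matrices via truncated SVD -- the simplest retraction-style update.
import numpy as np

def truncate_rank(X, r):
    """Project X onto the matrices of rank at most r (best rank-r SVD approximation)."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return (U[:, :r] * s[:r]) @ Vt[:r]

rng = np.random.default_rng(1)
n, r, p = 20, 2, 0.6
M = rng.standard_normal((n, r)) @ rng.standard_normal((r, n))  # rank-r target
mask = rng.random((n, n)) < p                                  # observed entries

X = np.zeros((n, n))
for _ in range(300):
    grad = mask * (X - M)                       # gradient of the observed-entry loss
    X = truncate_rank(X - (1.0 / p) * grad, r)  # step, then retract to rank r

err = np.linalg.norm(X - M) / np.linalg.norm(M)
print(err)
```

The 1/p step size compensates for only a fraction p of entries being observed; more sophisticated Riemannian methods replace the full SVD with updates restricted to the tangent space of the manifold, which is where much of their efficiency comes from.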
“…First, from an algorithmic perspective, a number of algorithms, including penalty approaches, gradient descent, alternating minimization, and Gauss-Newton, have been developed either for the manifold formulation [BPS20, GS10, BA11, MMBS14, MBS11, Van13, HH18, LHLZ20] or for the factorization formulation [CLS15, JNS13, SL15, TD21, TBS+16, WYZ12, BNZ21]. We refer readers to [CLC19, CW18a] for recent algorithmic developments under the two formulations. Many algorithms developed under the manifold formulation involve Riemannian optimization techniques and can be more complex than those developed under the factorization formulation.…”
Section: Related Literature
confidence: 99%
“…Note that (3), (4) and (5) are unconstrained, and can thus be tackled by unconstrained optimization algorithms. Indeed, under proper assumptions, a number of algorithms with theoretical guarantees have been proposed for both the manifold and the factorization formulations [CLC19, CW18a]. See Section 1.2 for a review of existing results.…”
Section: Introduction
confidence: 99%