2018
DOI: 10.5705/ss.202016.0205
Asymptotic theory for estimating the singular vectors and values of a partially-observed low rank matrix with noise

Abstract: Matrix completion algorithms recover a low rank matrix from a small fraction of the entries, each entry contaminated with additive errors. In practice, the singular vectors and singular values of the low rank matrix play a pivotal role for statistical analyses and inferences. This paper proposes estimators of these quantities and studies their asymptotic behavior. Under the setting where the dimensions of the matrix increase to infinity and the probability of observing each entry is identical, Theorem 1 gives …
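The estimators the abstract alludes to can be illustrated by the standard spectral approach: rescale the zero-filled observation matrix by the inverse of the estimated observation probability, then take a truncated SVD. This is a minimal sketch, not the paper's exact procedure; `spectral_estimate` and all variable names are illustrative.

```python
import numpy as np

def spectral_estimate(M_obs, mask, rank):
    """Sketch: estimate the top singular vectors/values of a low rank
    matrix from partially observed, noisy entries.

    M_obs : observed matrix with unobserved entries set to zero
    mask  : boolean array, True where an entry was observed
    rank  : number of singular pairs to return
    """
    p_hat = mask.mean()                      # estimated observation probability
    U, s, Vt = np.linalg.svd(M_obs / p_hat, full_matrices=False)
    return U[:, :rank], s[:rank], Vt[:rank, :]

# Toy check: rank-1 signal, half the entries observed, small additive noise
rng = np.random.default_rng(0)
n = 200
u = rng.standard_normal(n)
v = rng.standard_normal(n)
M = np.outer(u, v)
mask = rng.random((n, n)) < 0.5
M_obs = np.where(mask, M + 0.1 * rng.standard_normal((n, n)), 0.0)
U, s, Vt = spectral_estimate(M_obs, mask, rank=1)
```

Since singular vectors are defined only up to sign, estimated and true vectors should be compared through the absolute value of their inner product.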

Cited by 14 publications (27 citation statements)
References 30 publications
“…Matrix completion, whose central goal is to recover a large low-rank matrix from a limited number of observed entries, has been widely studied in the last decade. Among the various methods for matrix completion, the spectral method is fast, easy to implement, and achieves good performance (Keshavan et al., 2010; Chatterjee, 2014; Cho et al., 2015). The new perturbation bounds can potentially be used for singular space estimation in the matrix completion setting to yield better results.…”
Section: Discussion
confidence: 99%
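The one-step spectral completion described in these works recovers the full matrix, not just its singular pairs, by keeping only the top-rank part of the rescaled observations. The sketch below uses illustrative names and constants; see the cited papers for the exact algorithms.

```python
import numpy as np

def spectral_complete(M_obs, mask, rank):
    """Sketch of one-step spectral matrix completion: rescale the
    zero-filled observations by 1/p and keep the top-`rank` SVD part."""
    p_hat = max(mask.mean(), 1e-12)          # guard against an empty mask
    U, s, Vt = np.linalg.svd(M_obs / p_hat, full_matrices=False)
    return (U[:, :rank] * s[:rank]) @ Vt[:rank, :]

# Toy example: rank-2 matrix with 70% of entries observed
rng = np.random.default_rng(1)
n, r = 150, 2
M = rng.standard_normal((n, r)) @ rng.standard_normal((r, n))
mask = rng.random((n, n)) < 0.7
M_hat = spectral_complete(np.where(mask, M, 0.0), mask, rank=r)
rel_err = np.linalg.norm(M_hat - M) / np.linalg.norm(M)
```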
“…This means that a non-vanishing proportion of the entries of M 0 contains non-vanishing signals as the dimensionality grows (see Fan et al. (2013)). For more discussion, see Remark 2 in Cho et al. (2016).…”
Section: Initialization
confidence: 99%
“…Lemma 2 in Cho et al. (2016) justifies using the scree plot and the singular value gap to choose the rank. For the thresholding parameters of softImpute-type algorithms, we chose the optimal values, i.e., those yielding the smallest test errors.…”
Section: A Real Data Example
confidence: 99%
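The scree-plot/singular-value-gap idea amounts to picking the rank where consecutive singular values drop most sharply. The sketch below uses a simple ratio criterion with an illustrative function name; it is not the exact criterion of Lemma 2 in Cho et al. (2016).

```python
import numpy as np

def rank_by_gap(M, max_rank=10):
    """Pick the rank at the largest ratio of consecutive singular
    values (a simple scree-plot heuristic; illustrative only)."""
    s = np.linalg.svd(M, compute_uv=False)
    k = min(max_rank, len(s) - 1)
    ratios = s[:k] / s[1:k + 1]      # size of each drop in the scree plot
    return int(np.argmax(ratios)) + 1

# Toy example: rank-3 signal plus small noise
rng = np.random.default_rng(2)
n, r = 100, 3
M = rng.standard_normal((n, r)) @ rng.standard_normal((r, n))
M += 0.05 * rng.standard_normal((n, n))
```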
“…Inspired by the unobserved-entry filling strategy 21,22 and matrix-completion-based domain adaptation [23][24][25], we propose to combine a deep learning network with the matrix completion algorithm 15 to develop our DLMC method for efficient NV spectrum map reconstruction. DL can learn a very complex non-linear mapping from a partially filled spectrum map to its full-resolution map, with the DL network trained on simulation data, while the traditional matrix completion (MC) method is used to post-process the DL output map to preserve its low-rank property, thus further alleviating the domain shift problem.…”
Section: Introduction
confidence: 99%