2009
DOI: 10.1007/s10208-009-9045-5
Exact Matrix Completion via Convex Optimization

Abstract: We consider a problem of considerable practical interest: the recovery of a data matrix from a sampling of its entries. Suppose that we observe m entries selected uniformly at random from a matrix M. Can we complete the matrix and recover the entries that we have not seen? We show that one can perfectly recover most low-rank matrices from what appears to be an incomplete set of entries. We prove that if the number m of sampled entries obeys m ≥ C n^1.2 r log n for some positive numerical constant C, then with v…
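The recovery problem described in the abstract can be illustrated with a simple iterative scheme in the spirit of nuclear-norm minimization. The sketch below is a soft-impute-style iteration (related to the Mazumder, Hastie, and Tibshirani work cited later on this page), not the paper's exact semidefinite program; the problem sizes, the shrinkage threshold `tau`, and the iteration count are all illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(0)

# Build a random rank-2 ground-truth matrix M (sizes are illustrative).
n, r = 50, 2
M = rng.standard_normal((n, r)) @ rng.standard_normal((r, n))

# Observe entries uniformly at random (about 40% of them).
mask = rng.random((n, n)) < 0.4

# Soft-impute iteration: alternate between filling the missing entries with
# the current low-rank estimate and shrinking the singular values (the
# proximal step of the nuclear norm).
X = np.where(mask, M, 0.0)
tau = 0.5  # shrinkage threshold (an ad hoc tuning parameter)
for _ in range(200):
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    s = np.maximum(s - tau, 0.0)   # soft-threshold the spectrum
    Z = (U * s) @ Vt               # current low-rank estimate
    X = np.where(mask, M, Z)       # keep the observed entries fixed

rel_err = np.linalg.norm(Z - M) / np.linalg.norm(M)
```

With enough observed entries relative to the rank, the relative error `rel_err` is small, matching the abstract's claim that most low-rank matrices can be recovered from an apparently incomplete set of entries.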



Cited by 3,921 publications (2,048 citation statements). References 34 publications.
“…In [20], Candès et al. show that once the number of sampled entries exceeds a certain threshold, the low-rank matrix can be perfectly recovered with high probability. Thus, in [2], for each unlabeled pixel, monochrome intensity affinities are used to find all of its neighboring labeled pixels, and the pixel is then labeled with the weighted sum of its neighbors' labels.…”
Section: Local Depth Consistency Interpolation
confidence: 99%
“…Examples include matrix completion, regression with matrix covariates, and multivariate response regression. Matrix completion (Candès and Recht 2009; Mazumder, Hastie, and Tibshirani 2010) aims to recover a large matrix of which only a small fraction of entries are observed. The problem has sparked intensive research in recent years and enjoys a broad range of applications, such as personalized recommendation systems (ACM SIGKDD and Netflix 2007) and imputation of massive genomics data (Chi, Zhou, Chen, Del Vecchyo, and Lange 2013).…”
Section: Introduction
confidence: 99%
“…For matrix models, another powerful non-convex constraint is the rank constraint [41]. By restricting matrices to have rank at most K, we in effect constrain the sparsity of the singular values.…”
Section: (C) Rank Constraints
confidence: 99%
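The rank constraint quoted above has a concrete interpretation via the singular value decomposition: projecting a matrix onto the set of rank-at-most-K matrices keeps its K largest singular values and zeroes the rest (the Eckart–Young theorem). A minimal numpy sketch, not code from any of the cited papers; `rank_project` is a hypothetical helper name:

```python
import numpy as np

def rank_project(A, K):
    """Project A onto the set of matrices with rank at most K by
    keeping the K largest singular values (Eckart-Young)."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    s[K:] = 0.0   # enforce sparsity of the singular values
    return (U * s) @ Vt

rng = np.random.default_rng(1)
A = rng.standard_normal((8, 8))   # full-rank with probability 1
B = rank_project(A, 3)
print(np.linalg.matrix_rank(B))   # 3
```

Zeroing all but K singular values is exactly the "sparsity of the singular values" the excerpt refers to: the rank of a matrix equals the number of nonzero singular values.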