2019
DOI: 10.1007/s10208-018-09408-6
On Polynomial Time Methods for Exact Low-Rank Tensor Completion

Abstract: In this paper, we investigate the sample size requirement for exact recovery of a high order tensor of low rank from a subset of its entries. We show that a gradient descent algorithm with initial value obtained from a spectral method can, in particular, reconstruct a d × d × d tensor of multilinear ranks (r, r, r) with high probability from as few as O(r^{7/2} d^{3/2} log^{7/2} d + r^7 d log^6 d) entries. In the case when the ranks r = O(1), our sample size requirement matches those for nuclear norm minimization (Y…
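The abstract describes a two-stage recipe: spectral initialization from the mode unfoldings of the rescaled, zero-filled observations, followed by gradient descent on the Tucker factors to fit the revealed entries. Below is a minimal NumPy sketch of that generic recipe, not the paper's actual algorithm (which runs on Grassmannians with calibrated rotations and carefully chosen step sizes); the helper names, the fixed step size lr, the iteration count, and the plain QR retraction are all illustrative choices of my own.

    import numpy as np

    def spectral_init(Y, mask, r):
        # Top-r left singular vectors of each rescaled mode unfolding.
        Yhat = Y / mask.mean()  # unbiased estimate under uniform sampling
        U = []
        for mode in range(3):
            M = np.moveaxis(Yhat, mode, 0).reshape(Yhat.shape[mode], -1)
            u, _, _ = np.linalg.svd(M, full_matrices=False)
            U.append(u[:, :r])
        return U

    def complete(Y, mask, r, steps=500, lr=1e-3):
        # Y: zero-filled d x d x d observations; mask: boolean revealed entries.
        U1, U2, U3 = spectral_init(Y, mask, r)
        Yhat = Y / mask.mean()
        G = np.einsum('ijk,ia,jb,kc->abc', Yhat, U1, U2, U3)  # initial core
        for _ in range(steps):
            X = np.einsum('abc,ia,jb,kc->ijk', G, U1, U2, U3)
            R = np.where(mask, X - Y, 0.0)  # residual on observed entries
            # Gradients of 0.5 * ||P_Omega(X - Y)||_F^2 in each factor
            gU1 = np.einsum('ijk,jb,kc,abc->ia', R, U2, U3, G)
            gU2 = np.einsum('ijk,ia,kc,abc->jb', R, U1, U3, G)
            gU3 = np.einsum('ijk,ia,jb,abc->kc', R, U1, U2, G)
            gG = np.einsum('ijk,ia,jb,kc->abc', R, U1, U2, U3)
            U1, _ = np.linalg.qr(U1 - lr * gU1)  # retract to orthonormal columns
            U2, _ = np.linalg.qr(U2 - lr * gU2)
            U3, _ = np.linalg.qr(U3 - lr * gU3)
            G -= lr * gG
        return np.einsum('abc,ia,jb,kc->ijk', G, U1, U2, U3)

The QR retraction here stands in for the paper's Grassmannian updates only as a convenient way to keep the factors orthonormal between steps.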

Cited by 54 publications (56 citation statements) · References 33 publications
“…After the present paper was accepted for publication, and independently from our work, two groups posted related results on the tensor completion problem [19,27,28]. We then state our main results on tensor completion: Section 3 presents the unfolding-based algorithm, and Section 4 presents the more specialized algorithm for overcomplete 3-tensors.…”
Section: Organization of the Paper
confidence: 99%
“…As noted above, for our unfolding algorithm we study the column spaces of partially revealed matrices with large aspect ratio; our results on this are presented in Section 6. After the present paper was accepted for publication, and independently from our work, two groups posted related results on the tensor completion problem [19,27,28].…”
Section: Organization of the Paper
confidence: 99%
“…The algorithm presented here is similar in spirit to those developed earlier by Keshavan et al. (2010a, 2010b) and Xia and Yuan (2019). A key difference is that we introduce an explicit rule of gradient descent update where each iteration on Grassmannians is calibrated with orthogonal rotations.…”
Section: Initial Estimate
confidence: 88%
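One standard way to realize the "orthogonal rotations" this statement mentions is a Procrustes alignment: before comparing or updating successive orthonormal iterates, rotate the new factor so that it is as close as possible to the previous one, which makes two bases for nearby subspaces directly comparable. The helper below is a generic sketch of that idea, not code from any of the cited papers.

    import numpy as np

    def calibrate(U_new, U_old):
        # Solve min_O ||U_new @ O - U_old||_F over orthogonal O (Procrustes).
        # If U_new.T @ U_old = A @ diag(s) @ Bt, the minimizer is O = A @ Bt.
        A, _, Bt = np.linalg.svd(U_new.T @ U_old)
        return U_new @ (A @ Bt)

Because U_new @ O spans the same subspace as U_new, the calibration changes only the basis, not the point on the Grassmannian.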
“…As discussed in several related work (e.g., [14,59,84,106,107]), once we obtain reliable estimates of the subspace spanned by the tensor factors, we can further exploit the tensor structure to estimate the unknown tensor. Indeed, in many tensor completion algorithms, subspace estimation serves as a crucial initial step for tensor completion.…”
Section: Corollary 4.2 (Symmetric Tensor Completion)
confidence: 99%
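To make the quoted point concrete: once estimates U1, U2, U3 of the factor subspaces are in hand, one simple way to "exploit the tensor structure" is to project the rescaled, zero-filled observations onto those subspaces. This is only a schematic sketch under a uniform-sampling assumption; the cited algorithms refine such an estimate with least squares or further iterations.

    import numpy as np

    def tensor_from_subspaces(Y, mask, U1, U2, U3):
        # Project the rescaled observations onto the estimated factor
        # subspaces to obtain a low-multilinear-rank tensor estimate.
        Yhat = Y / mask.mean()  # inverse of the empirical sampling rate
        G = np.einsum('ijk,ia,jb,kc->abc', Yhat, U1, U2, U3)  # projected core
        return np.einsum('abc,ia,jb,kc->ijk', G, U1, U2, U3)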
“…However, the above-mentioned approach might lead to suboptimal performance when the row dimension and the column dimension of the matrix differ dramatically. This issue has already been recognized in multiple contexts, including but not limited to unfolding-based spectral methods for tensor estimation [57,84,106,108,114] and spectral methods for biclustering [47]. Motivated by this suboptimality issue, an alternative is to look at the "sample Gram matrix" which, as one expects, shares the same leading left singular space as the original observed data matrix.…”
Section: Further Related Work
confidence: 99%
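A minimal sketch of the Gram-matrix alternative described above, for a matrix whose row dimension is far smaller than its column dimension: instead of an SVD of Y itself, take the leading eigenvectors of Y @ Y.T, which shares Y's leading left singular space. Zeroing the diagonal is the "diagonal deletion" device used in parts of this literature to remove the bias that entrywise noise adds to the diagonal; whether any particular cited method deletes the diagonal is an assumption on my part.

    import numpy as np

    def left_subspace_via_gram(Y, r):
        # Leading left singular subspace of a short, very wide matrix Y,
        # estimated from the sample Gram matrix Y @ Y.T.
        G = Y @ Y.T
        np.fill_diagonal(G, 0.0)  # diagonal deletion: drop the noise-inflated diagonal
        w, V = np.linalg.eigh(G)  # eigenvalues in ascending order
        return V[:, np.argsort(w)[::-1][:r]]  # top-r eigenvectors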