2018
DOI: 10.1002/cpa.21748
Spectral Algorithms for Tensor Completion

Abstract: In the tensor completion problem, one seeks to estimate a low-rank tensor based on a random sample of revealed entries. In terms of the required sample size, earlier work revealed a large gap between estimation with unbounded computational resources (using, for instance, tensor nuclear norm minimization) and polynomial-time algorithms. Among the latter, the best statistical guarantees have been proved, for third-order tensors, using the sixth level of the sum-of-squares (SOS) semidefinite programming hierarchy.…
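To make the setting concrete, here is a minimal NumPy sketch of a generic unfolding-based spectral estimator for tensor completion. This is an illustrative toy, not the paper's exact algorithm (which operates on a d × d matrix built from contractions of the tensor); the side length d, sampling probability p, and rank-1 structure are all hypothetical choices for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy instance: a rank-1 third-order tensor of side length d,
# with each entry revealed independently with probability p.
d, p = 30, 0.3
u = rng.standard_normal(d)
T = np.einsum('i,j,k->ijk', u, u, u)   # ground-truth low-rank tensor

mask = rng.random((d, d, d)) < p
Y = np.where(mask, T, 0.0) / p         # rescaling makes E[Y] = T entrywise

# Spectral step: unfold the observed tensor to a d x d^2 matrix and take
# the top left singular vector as an estimate of the leading factor.
M = Y.reshape(d, d * d)
U, s, Vt = np.linalg.svd(M, full_matrices=False)
u_hat = U[:, 0]

# Correlation of the estimate with the true factor (up to sign)
corr = abs(u_hat @ u) / np.linalg.norm(u)
```

With enough revealed entries, the leading singular vector of the rescaled unfolding correlates well with the true factor, which is the kind of "warm start" the citing works discuss using to initialize exact-recovery procedures.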

Cited by 68 publications (72 citation statements)
References 28 publications
“…In contrast, our approach is essentially based on the spectral decomposition of a d × d matrix and can be computed fairly efficiently. Very recently, in independent work and under further restrictions on the tensor ranks, Montanari and Sun (2016) showed that a spectral method different from ours can also achieve consistency with O(d^{3/2} polylog(r, log d)) observed entries. The rate of concentration for their estimate, however, is slower than ours and as a result, it is unclear if it provides a sufficiently accurate initial value for the exact recovery with the said sample size.…”
Section: Introduction
confidence: 80%
“…A large gap between known polynomial-time algorithms and statistical limits arises in the tensor completion problem, which shares many similarities with the spiked tensor model [GRY11,YZ15,MS16]. In the setting of tensor completion, hardness under Feige's hypothesis was proven in [BM16] for a certain regime of the number of observed entries.…”
Section: Introduction
confidence: 99%
“…Instead of exact recovery, their main result shows that O(I^{3/2} r^2 log^4 I) observations can guarantee an approximation with an explicit upper bound on the error. However, this sum-of-squares approach is pointed out not to scale well to large tensors, and is superseded by a spectral approach proposed by Montanari et al. [137] with a matching statistical guarantee (i.e., the same required sample size). Recently, Yuan and Zhang [217] explore the connections between tensor norms and different coherence conditions.…”
Section: Number of Observed Entries
confidence: 99%