2016 IEEE 57th Annual Symposium on Foundations of Computer Science (FOCS)
DOI: 10.1109/focs.2016.54
Polynomial-Time Tensor Decompositions with Sum-of-Squares

Abstract: We give new algorithms based on the sum-of-squares method for tensor decomposition. Our results improve the best known running times from quasi-polynomial to polynomial for several problems, including decomposing random overcomplete 3-tensors and learning overcomplete dictionaries with constant relative sparsity. We also give the first robust analysis for decomposing overcomplete 4-tensors in the smoothed analysis model. A key ingredient of our analysis is to establish small spectral gaps in moment matrices der…

Cited by 62 publications (104 citation statements). References 27 publications (60 reference statements).
“…Initiated by the work of [BBH+12], who showed that polynomial-time algorithms in the hierarchy solve all known integrality gap instances for Unique Games and related problems, a steady stream of works has developed efficient algorithms for both worst-case [BKS14, BKS15, BKS17, BGG+16] and average-case problems [HSS15, GM15, BM16, RRS16, BGL16, MSS16a, PS17]. The insights from these works extend beyond individual algorithms to characterizations of broad classes of algorithmic techniques.…”
Section: Introduction
confidence: 99%
“…The remarkable result of Kruskal [Kru77] shows that for s > 2, the decomposition is "typically" unique, as long as R is small enough. Several works [Har70, Car91, AGH+12, MSS16] have designed efficient recovery algorithms in different regimes of R, and under different assumptions on {A i }. The other important question is whether the {A i } can be recovered assuming that we only have access to M s + Err, for some noise tensor Err.…”
Section: Overcomplete Tensor Decompositions
confidence: 99%
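The classical recovery algorithm alluded to by the [Har70] citation above is simultaneous diagonalization (Jennrich/Harshman), which handles the undercomplete case where the components are linearly independent. The sketch below is our own minimal illustration of that technique for a symmetric 3-tensor, not code from the cited works; the function name and parameters are ours.

```python
import numpy as np

def jennrich(T, r, rng):
    """Sketch of Jennrich's simultaneous-diagonalization algorithm for an
    undercomplete symmetric 3-tensor T = sum_i a_i (x) a_i (x) a_i,
    assuming the r components a_i are linearly independent."""
    p = T.shape[0]
    x, y = rng.standard_normal(p), rng.standard_normal(p)
    # Random contractions along the third mode:
    # M_x = A diag(A^T x) A^T and likewise M_y, where A has columns a_i.
    Mx = np.einsum('ijk,k->ij', T, x)
    My = np.einsum('ijk,k->ij', T, y)
    # Mx @ pinv(My) = A diag(A^T x / A^T y) A^+, so its eigenvectors with
    # nonzero eigenvalues are the components a_i (up to scaling).
    vals, vecs = np.linalg.eig(Mx @ np.linalg.pinv(My))
    idx = np.argsort(-np.abs(vals))[:r]   # keep the r dominant eigenvalues
    return np.real(vecs[:, idx])          # recovered directions as columns

rng = np.random.default_rng(0)
p, r = 8, 5                                # undercomplete: r <= p
A = rng.standard_normal((p, r))            # ground-truth components
T = np.einsum('ir,jr,kr->ijk', A, A, A)    # T = sum_i a_i (x) a_i (x) a_i
A_hat = jennrich(T, r, rng)
```

Generic random contractions make the eigenvalues distinct with probability 1, which is what makes the eigendecomposition identify the individual components; this breaks down in the overcomplete regime r > p that the sum-of-squares methods above target.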
“…Similarly, parameter estimation for basic latent variable models like mixtures of spherical Gaussians has exponential sample complexity in the worst case [MV10]; yet, polynomial-time guarantees can be obtained using smoothed analysis, where the parameters (e.g., means for Gaussians) are randomly perturbed in high dimensions [HK12, BCMV14, ABG+14, GHK15]. Smoothed analysis results have also been obtained for other problems like overcomplete ICA [GVX14], learning mixtures of general Gaussians [GHK15], fourth-order tensor decompositions [MSS16], and recovering assemblies of neurons [ADM+18].…”
Section: Introduction
confidence: 99%
“…where the a i ∈ R p are drawn independently from N (0, I/p). The state-of-the-art results for this problem are a close-to-linear-time spectral method that succeeds when r ≲ p^{4/3} [HSSS16] and a polynomial-time sum-of-squares method that succeeds when r ≲ p^{3/2} [MSS16]. (It seems likely that no efficient algorithm can succeed when r exceeds p^{3/2}.)…”
Section: Summary Of Techniques
confidence: 99%
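To make the random overcomplete model in the excerpt concrete, here is a minimal sketch (our own illustration, not code from the cited works) that constructs such an instance at the r ≈ p^{3/2} threshold: components drawn i.i.d. from N(0, I/p), summed into a symmetric 3-tensor. It builds the input only; recovering the a i from T is the hard part addressed by the cited algorithms.

```python
import numpy as np

rng = np.random.default_rng(0)
p = 16
r = int(p ** 1.5)   # overcomplete regime: r ~ p^{3/2}, the conjectured limit

# Components a_i drawn i.i.d. from N(0, I/p), so E||a_i||^2 = 1.
A = rng.standard_normal((p, r)) / np.sqrt(p)

# The input tensor T = sum_i a_i (x) a_i (x) a_i, of shape (p, p, p).
T = np.einsum('ir,jr,kr->ijk', A, A, A)
```

Note that r = 64 components live in only p = 16 dimensions, so the linear-independence assumption behind simultaneous diagonalization fails and more powerful (e.g., sum-of-squares) methods are needed.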
“…In the overcomplete case, a line of work has culminated in a polynomial-time algorithm that works when the vectors a i are random (i.i.d. Gaussian) and r ≲ p^{3/2} [GM15, HSSS16, MSS16]. In applications, the vectors a 1 , .…”
Section: Introduction
confidence: 99%