2021
DOI: 10.48550/arxiv.2104.14526
Preprint

Scaling and Scalability: Provable Nonconvex Low-Rank Tensor Estimation from Incomplete Measurements

Abstract: Tensors, which provide a powerful and flexible model for representing multi-attribute data and multiway interactions, play an indispensable role in modern data science across various fields in science and engineering. A fundamental task is to faithfully recover the tensor from highly incomplete measurements in a statistically and computationally efficient manner. Harnessing the low-rank structure of tensors in the Tucker decomposition, this paper develops a scaled gradient descent (ScaledGD) algorithm to direc…
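The core idea behind ScaledGD — preconditioning each factor's gradient step by the inverse Gram matrix of the other factor, so that convergence no longer depends on the condition number — is easiest to see in the low-rank *matrix* completion analogue. The sketch below is illustrative only: the function name, step size, and iteration count are assumptions, and the paper's actual algorithm operates on Tucker-decomposed tensors, not matrices.

```python
import numpy as np

def scaled_gd_completion(Y, mask, r, eta=0.5, iters=300):
    """Illustrative ScaledGD sketch for low-rank matrix completion
    (matrix analogue of the paper's Tucker-tensor algorithm; names and
    defaults here are assumptions, not the authors' code).

    Minimizes 0.5 * ||mask * (L @ R.T - Y)||_F^2 over rank-r factors,
    preconditioning each gradient by the inverse Gram matrix of the
    opposite factor, which removes the condition number from the rate.
    """
    p_hat = mask.mean()                       # estimated sampling rate
    # Spectral initialization from the rescaled zero-filled observations.
    U, s, Vt = np.linalg.svd(mask * Y / p_hat, full_matrices=False)
    L = U[:, :r] * np.sqrt(s[:r])
    R = Vt[:r, :].T * np.sqrt(s[:r])
    for _ in range(iters):
        E = mask * (L @ R.T - Y)              # residual on observed entries
        # Scaled (preconditioned) gradient updates, computed jointly
        # from the current factors before either is overwritten.
        L_new = L - eta * (E @ R) @ np.linalg.inv(R.T @ R)
        R_new = R - eta * (E.T @ L) @ np.linalg.inv(L.T @ L)
        L, R = L_new, R_new
    return L @ R.T
```

A vanilla gradient step would omit the `inv(R.T @ R)` and `inv(L.T @ L)` factors; it is exactly those inverse-Gram preconditioners that make the per-iteration contraction independent of how ill-conditioned the ground truth is.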

Cited by 5 publications (7 citation statements)
References: 82 publications
“…The contraction rate is free of the tensor condition number, improving the iteration complexity over prior works (Jain and Oh, 2014; Cai et al., 2021a; Xia and Yuan, 2019) for completing an ill-conditioned tensor. The attained iteration complexity matches the best one achieved by a recently proposed scaled gradient descent algorithm (Tong et al., 2021). Finally, inspired by the idea in Xia and Yuan (2019), we propose a novel initialization, based on a sequential second-order moment method, as the input of the RGrad algorithm for tensor completion.…”
Section: Our Contributions (supporting)
confidence: 58%
“…The recursive nature of computing a TT decomposition substantially complicates the theoretical analysis, involving repeated appearances of the incoherence parameters and TT rank. Interestingly, the required number of iterations l max is free of the condition number, which often appears in decomposition-based algorithms (Cai et al, 2021a;Han et al, 2020), except the recently proposed scaled gradient descent algorithm (Tong et al, 2021).…”
Section: Exact Recovery and Convergence Analysis (mentioning)
confidence: 99%
“…It is also worth mentioning that there are several recent results that directly solve the tabular tensor decomposition by (accelerated) gradient method with convergence guarantee (Cai et al, 2019;Han et al, 2021;Tong et al, 2021). Such methods usually have an advantage over power iteration as they directly aim at the optimization of likelihood, and can successfully remove the effect of incoherence from the final estimation error bound.…”
Section: Discussion (mentioning)
confidence: 99%
“…Motivated by real data applications in practice, related statistical models were further studied under non-Gaussian or missing-data scenarios. For example, Hong et al (2020); Han et al (2021) considered the likelihood-based generalized low-rank tensor decomposition under general exponential family; Cai et al (2019); Tong et al (2021) studied the low-rank tensor estimation under incomplete measurements.…”
Section: Related Work (mentioning)
confidence: 99%