2021 | Preprint
DOI: 10.48550/arxiv.2103.08895

Generalized Low-rank plus Sparse Tensor Estimation by Fast Riemannian Optimization

Abstract: We investigate a generalized framework for estimating a latent low-rank plus sparse tensor, where the low-rank tensor often captures the multi-way principal components and the sparse tensor accounts for potential model mis-specifications or heterogeneous signals that the low-rank part cannot explain. The framework is flexible, covering both linear and non-linear models, and can easily handle continuous or categorical variables. We propose a fast algorithm by integrating the Riemannian gradient descent and a …
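As a rough sketch of the observation model the abstract describes (illustrative only; the shapes, ranks, and sparsity level below are our assumptions, not values from the paper):

```python
# Minimal sketch (not the authors' code): an observed tensor Y = L + S + Z,
# where L is Tucker low-rank, S is entrywise sparse, and Z is dense noise.
import numpy as np

rng = np.random.default_rng(0)
dims, ranks = (30, 30, 30), (3, 3, 3)      # illustrative sizes

# Low-rank part L: a small core contracted with one factor per mode.
core = rng.standard_normal(ranks)
factors = [rng.standard_normal((d, r)) for d, r in zip(dims, ranks)]
L = np.einsum('abc,ia,jb,kc->ijk', core, *factors)

# Sparse part S: a few large, randomly placed outliers (~1% of entries).
S = np.zeros(dims)
mask = rng.random(dims) < 0.01
S[mask] = 10.0 * rng.standard_normal(mask.sum())

Y = L + S + 0.1 * rng.standard_normal(dims)   # noisy observation
```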

Cited by 6 publications (17 citation statements; published 2021–2023). References 63 publications.
“…Second, throughout the applications, we assume the noise is Gaussian distributed. In scenarios where the noise is heavy-tailed or the data contain outliers (Cai et al., 2021), we would like to consider using a robust loss (e.g., the ℓ1 loss or Huber loss) in (18) instead of the ℓ2 loss, or to consider quantile tensor regression (Lu et al., 2020). It would be interesting to see whether RGN works in those settings and whether we can give theoretical guarantees there.…”
Section: Conclusion and Discussion (mentioning, confidence: 99%)
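For concreteness, a minimal sketch of the Huber loss this citation suggests as a robust alternative to the ℓ2 loss; the threshold `delta` is an illustrative parameter, not a value from the cited papers:

```python
# Huber loss: quadratic near zero (like l2), linear in the tails (like l1),
# so large residuals from heavy-tailed noise or outliers have bounded influence.
import numpy as np

def huber_loss(residual: np.ndarray, delta: float = 1.0) -> float:
    """Sum of Huber penalties over all residual entries."""
    a = np.abs(residual)
    quad = 0.5 * residual**2                # where |r| <= delta
    lin = delta * (a - 0.5 * delta)         # where |r| >  delta
    return float(np.where(a <= delta, quad, lin).sum())

def huber_grad(residual: np.ndarray, delta: float = 1.0) -> np.ndarray:
    """Gradient: r where |r| <= delta, delta*sign(r) otherwise (a clip)."""
    return np.clip(residual, -delta, delta)
```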
“…See the recent surveys on this line of work by Cai and Wei (2018) and Uschmajew and Vandereycken (2020). Moreover, Riemannian manifold optimization methods have been applied to various low-rank tensor estimation problems, such as tensor regression (Cai et al., 2020; Kressner et al., 2016), tensor completion (Rauhut et al., 2015; Kasai and Mishra, 2016; Dong et al., 2021; Kressner et al., 2014; Heidel and Schulz, 2018; Xia and Yuan, 2017; Steinlechner, 2016; Da Silva and Herrmann, 2015), and robust tensor PCA (Cai et al., 2021). These papers mostly focus on first-order Riemannian optimization methods, possibly due to the difficulty of deriving exact expressions for the Riemannian Hessian.…”
Section: Related Literature (mentioning, confidence: 99%)
“…While this paper focuses on deterministic RPCA approaches, stochastic RPCA approaches, e.g., partialGD [8], PG-RMC [9], and IRCUR [11], have shown a promising speed advantage, especially for large-scale problems. Another future direction is to explore robust tensor decomposition combined with deep learning, as some preliminary studies have shown the advantages of tensor structure in certain machine learning tasks [45,46].…”
Section: Discussion (mentioning, confidence: 99%)
“…To resolve this issue, it is usually assumed that the information T* carries spreads fairly evenly among almost all of its entries. One concept characterizing this spread is the spikiness of T* (Yuan and Zhang, 2016; Cai et al., 2021b), defined by…”
Section: TT-format Tensor Completion and Incoherence Condition (mentioning, confidence: 99%)
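The quote truncates before the formula. Under the definition commonly used in this literature (our reading of Yuan and Zhang, 2016, not recovered from the quote itself), spikiness compares the largest entry magnitude to the root-mean-square entry; a minimal sketch:

```python
# Assumed standard spikiness: spik(T) = sqrt(prod(dims)) * ||T||_max / ||T||_F.
# It equals 1 for a perfectly "flat" tensor and grows when the tensor's
# mass concentrates on a few entries.
import numpy as np

def spikiness(T: np.ndarray) -> float:
    return float(np.sqrt(T.size) * np.abs(T).max() / np.linalg.norm(T))
```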
“…The gist of these algorithms is to view the tensor of interest as a point on a Riemannian manifold (Holtz et al., 2012), for example, the collection of tensors with bounded Tucker-rank or TT-rank, and then to adapt the Riemannian gradient descent algorithm (RGrad) to minimize the associated objective function. An incomplete list of representative works on RGrad for matrix and tensor applications includes Steinlechner (2016); Wei et al. (2016b,a); Kressner et al. (2014); Cai et al. (2021b). Similarly, TT-format tensor completion can be recast as an unconstrained problem over the Riemannian manifold and solved numerically via RGrad (Wang et al., 2019).…”
Section: Introduction (mentioning, confidence: 99%)
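A minimal sketch of the RGrad template this citation describes, written for the fixed-rank matrix manifold for brevity; the tensor variants swap the truncated-SVD retraction for a HOSVD or TT-rounding step. The step size, rank, and iteration count are illustrative assumptions, not values from the cited papers:

```python
# RGrad template: take a gradient step projected onto the tangent space of
# the fixed-rank manifold at the current iterate, then retract back onto
# the manifold with a truncated SVD.
import numpy as np

def truncate(X: np.ndarray, r: int) -> np.ndarray:
    """Retraction: truncated SVD back onto the rank-r manifold."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return (U[:, :r] * s[:r]) @ Vt[:r]

def rgrad_completion(Y, mask, r, step=1.0, iters=200):
    """Complete a rank-r matrix from the observed entries Y[mask]."""
    X = truncate(np.where(mask, Y, 0.0), r)        # spectral initialization
    for _ in range(iters):
        G = np.where(mask, X - Y, 0.0)             # Euclidean gradient
        U, _, Vt = np.linalg.svd(X, full_matrices=False)
        U, Vt = U[:, :r], Vt[:r]
        # Tangent-space projection at X, then retraction.
        PG = U @ (U.T @ G) + (G @ Vt.T) @ Vt - U @ (U.T @ G @ Vt.T) @ Vt
        X = truncate(X - step * PG, r)
    return X
```

The same loop structure carries over to Tucker- or TT-rank manifolds, with the mode-wise factor matrices playing the role of U and V in the tangent-space projection.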