2016
DOI: 10.1137/15m1010506

Riemannian Optimization for High-Dimensional Tensor Completion

Abstract: Tensor completion aims to reconstruct a high-dimensional data set where the vast majority of entries are missing. The assumption of a low-rank structure in the underlying original data allows us to cast the completion problem into an optimization problem restricted to the manifold of fixed-rank tensors. Elements of this smooth embedded submanifold can be efficiently represented in the tensor train (TT) or matrix product states (MPS) format, with storage complexity scaling linearly with the number of dimensions. We…
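As a rough illustration of the storage claim in the abstract, the sketch below (the mode size, rank, dimension, and NumPy realization are our own choices, not taken from the paper) stores a d-dimensional tensor as TT cores and evaluates a single entry as a product of core slices:

```python
import numpy as np

# Sketch of the TT/MPS format from the abstract: a d-dimensional tensor is
# held as d cores of shape (r_{k-1}, n_k, r_k), so storage is linear in d
# rather than exponential. Sizes below are arbitrary illustration choices.
rng = np.random.default_rng(0)
d, n, r = 10, 4, 3
ranks = [1] + [r] * (d - 1) + [1]
cores = [rng.standard_normal((ranks[k], n, ranks[k + 1])) for k in range(d)]

def tt_entry(cores, idx):
    """Evaluate one tensor entry as a product of matrix slices of the cores."""
    v = np.ones((1, 1))
    for core, i in zip(cores, idx):
        v = v @ core[:, i, :]
    return v[0, 0]

print(tt_entry(cores, [0] * d))
print(sum(c.size for c in cores), "stored numbers vs", n**d, "full entries")
```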

Cited by 76 publications (92 citation statements)
References 36 publications
“…Also the approaches which themselves retreat to alternating least squares [20] treat the iterations as a necessity for the minimization of an objective function with a regularization term. The penalty term is, as usual, based on the singular values of the output of a microstep (a posteriori), as is also the case for the work [40] on tensor completion through Riemannian optimization. Although their term may appear similar to the term we derive (cf.…”
Section: Relation To Other Matrix and Tensor Methods
confidence: 99%
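For context, one generic way such a singular-value-based penalty acts a posteriori (a minimal sketch of the common pattern, not the exact scheme of [20] or [40]) is to soft-threshold the singular values of the matrix a microstep produces:

```python
import numpy as np

# Generic illustration: apply a singular-value penalty a posteriori by
# soft-thresholding the singular values of a microstep's output, i.e. the
# proximal step of tau * (nuclear norm).
def soft_threshold_svd(Y, tau):
    U, s, Vt = np.linalg.svd(Y, full_matrices=False)
    return (U * np.maximum(s - tau, 0.0)) @ Vt

rng = np.random.default_rng(0)
Y = rng.standard_normal((6, 5))          # stand-in for a microstep output
print(np.linalg.matrix_rank(soft_threshold_svd(Y, tau=1.0)))
```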
“…• SALSA (Algorithm 4, the algorithm proposed in this work) • RTTC (Riemannian CG for tensor train completion [40]). We explain how ranks are adapted for ALS in Section 9.1, briefly present the idea behind RTTC in Section 9.2, and give details of data acquisition and measurements in Section 9.3 as well as tuning parameters in Section 9.4. We analyze the results in Section 9.9.…”
Section: Numerical Experiments
confidence: 99%
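To convey the flavor of Riemannian completion solvers such as RTTC without reproducing its tensor train machinery, here is a self-contained sketch of the d = 2 special case: Riemannian gradient descent on the manifold of fixed-rank matrices, with a tangent-space projection and an SVD-based retraction. All sizes, the rank, and the step size are our own illustrative choices; RTTC additionally uses conjugate directions and works on TT cores.

```python
import numpy as np

# Minimal sketch (our construction, not RTTC itself): Riemannian gradient
# descent for matrix completion on the rank-r manifold. The geometric steps
# mirror the TT case: project the Euclidean gradient onto the tangent
# space, take a step, and retract back onto the manifold.
rng = np.random.default_rng(0)
m, n, r = 60, 50, 3

A = rng.standard_normal((m, r)) @ rng.standard_normal((r, n))  # ground truth
mask = rng.random((m, n)) < 0.35                               # known entries

def svd_retract(Y, r):
    """Retraction: map back onto the rank-r manifold via truncated SVD."""
    U, s, Vt = np.linalg.svd(Y, full_matrices=False)
    return (U[:, :r] * s[:r]) @ Vt[:r, :]

X = svd_retract(mask * A, r)            # simple feasible starting point
for _ in range(300):
    G = mask * (X - A)                  # Euclidean gradient of 0.5*||P(X-A)||^2
    U, _, Vt = np.linalg.svd(X, full_matrices=False)
    U, Vt = U[:, :r], Vt[:r, :]
    # Tangent-space projection at X: P_U G + G P_V - P_U G P_V.
    G = U @ (U.T @ G) + (G @ Vt.T) @ Vt - U @ (U.T @ G @ Vt.T) @ Vt
    X = svd_retract(X - 0.5 * G, r)     # fixed step size + retraction

print("relative error:", np.linalg.norm(X - A) / np.linalg.norm(A))
```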
“…, A_d} is the solution of (21) with rank R ≥ 1, we consider one rank higher, i.e., R + 1. While an analogous approach for tensor completion using the tensor train decomposition can be found in [29], we here focus on the CP decomposition and give the following rank-one update scheme. The initial factor matrices for rank R + 1 are set to the rank-one updates of the factor matrices of X^(R), i.e.,…”
Section: Rank Adaptive Tensor Recovery (RATR)
confidence: 99%
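A minimal sketch of such a rank-one warm start for CP rank adaptation (the function and variable names are ours, and the appended columns are random placeholders standing in for an actual rank-one fit of the current residual):

```python
import numpy as np

# Hypothetical helper: warm-start a rank-(R+1) CP model by keeping the
# rank-R factor matrices and appending one new column to each factor.
rng = np.random.default_rng(0)

def bump_cp_rank(factors, new_columns):
    """Return rank-(R+1) factors: old columns kept, one column appended."""
    return [np.column_stack([A, v]) for A, v in zip(factors, new_columns)]

shape, R = (4, 5, 6), 2                  # a small 3-way example
factors = [rng.standard_normal((n, R)) for n in shape]
factors = bump_cp_rank(factors, [rng.standard_normal(n) for n in shape])
print([A.shape for A in factors])        # -> [(4, 3), (5, 3), (6, 3)]
```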