2018
DOI: 10.1002/nla.2175

A Riemannian trust‐region method for low‐rank tensor completion

Abstract: The goal of tensor completion is to fill in missing entries of a partially known tensor (possibly including some noise) under a low‐rank constraint. This may be formulated as a least‐squares problem. The set of tensors of a given multilinear rank is known to admit a Riemannian manifold structure; thus, methods of Riemannian optimization are applicable. In our work, we derive the Riemannian Hessian of an objective function on the low‐rank tensor manifolds using the Weingarten map, a concept from differential geometry.
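The least-squares formulation the abstract refers to can be illustrated with a minimal sketch. This is not the paper's implementation; all names are illustrative, and the low-rank (fixed multilinear rank) constraint is only noted in comments: a Riemannian method would additionally project the Euclidean gradient onto the tangent space of the rank manifold and apply a retraction.

```python
import numpy as np

# Partially observed 3-way tensor A with observation mask Omega (True = known entry).
rng = np.random.default_rng(0)
A = rng.standard_normal((4, 5, 6))
Omega = rng.random((4, 5, 6)) < 0.3  # roughly 30% of entries observed

def objective(X):
    """Least-squares misfit on the known entries: f(X) = 0.5 * ||P_Omega(X - A)||_F^2."""
    residual = np.where(Omega, X - A, 0.0)
    return 0.5 * np.sum(residual**2)

def euclidean_gradient(X):
    """Euclidean gradient P_Omega(X - A).

    A Riemannian trust-region method would project this onto the tangent
    space of the fixed-multilinear-rank manifold at X before building the
    model; that projection is omitted in this sketch.
    """
    return np.where(Omega, X - A, 0.0)

# Any tensor matching A on the observed entries attains the minimum value 0.
print(objective(A))  # 0.0
```

The mask `Omega` plays the role of the sampling operator P_Omega; in the unconstrained problem the minimizers are exactly the tensors agreeing with the data on observed entries, and the low-rank constraint is what makes the completion of the unobserved entries well posed.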


Cited by 22 publications (22 citation statements)
References 29 publications (84 reference statements)
“…Implicit temporal integrators can mitigate this problem, but they require the development of linear solvers on tensor manifolds with constant rank. This can be achieved, e.g., by utilizing Riemannian optimization algorithms [49,44,45,23] or alternating least squares [14,29,43,6]. Let us discretize the spatial derivatives in (29) with second-order centered finite differences on a tensor-product evenly spaced grid in each variable.…”
Section: Stiffness in High-Dimensional PDEs
confidence: 99%
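The discretization mentioned in the quoted passage can be sketched as follows. This is a generic illustration of second-order centered finite differences on an evenly spaced grid, shown in one dimension for simplicity (the cited work applies the same stencil along each variable of a tensor-product grid); the test function and grid size are arbitrary choices.

```python
import numpy as np

# Evenly spaced grid and a smooth test function u(x) = sin(x).
n = 100
x = np.linspace(0.0, 2 * np.pi, n)
h = x[1] - x[0]
u = np.sin(x)

# Second-order centered stencil for the second derivative at interior points:
# u''(x_i) ≈ (u[i-1] - 2*u[i] + u[i+1]) / h^2
u_xx = (u[:-2] - 2 * u[1:-1] + u[2:]) / h**2

# Compare against the exact second derivative -sin(x) at the interior points;
# the truncation error is O(h^2).
max_err = np.max(np.abs(u_xx - (-np.sin(x[1:-1]))))
print(max_err)
```

On a tensor-product grid in d variables, applying this stencil along each mode yields a discrete Laplacian whose stiffness grows like 1/h^2, which is the source of the stiffness the quoted section discusses.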
“…In [16] we called this type of problem a parameter identification problem (PIP). Application areas where PIPs originate are data completion with low-rank matrix and tensor decompositions [15,24,39,45,62,66], geometric modeling [19,49,63], computer vision [33,38,51], and phase retrieval problems in signal processing [9,12].…”
Section: Introduction
confidence: 99%
“…The concept of Riemannian optimization, i.e., optimizing over parameters that live on a smooth manifold, is well studied and has gained increasing interest in the domain of Data Science, for example for tensor completion problems (Heidel and Schulz 2018). However, the idea is quite new for Gaussian Mixture Models and Sra (2015, 2020) showed promising results with a Riemannian LBFGS and Riemannian Stochastic Gradient Descent Algorithm.…”
Section: Introduction
confidence: 99%