A Riemannian approach to low-rank tensor learning
Tensors for Data Processing, 2022. DOI: 10.1016/b978-0-12-824447-0.00010-8

Cited by 10 publications (23 citation statements). References 24 publications.

“…For previous work on SGD on Riemannian manifolds, we refer the reader to [18][19][20][21][22]. The presented technique is completely intrinsic to the manifold N and involves following (approximate) geodesics in the direction of the (negative) gradient of L. To explain this idea in more detail, we first briefly recall the notion of geodesics and refer the reader to [27,28] for a more comprehensive introduction to differential geometry.…”
Section: Stochastic Gradient Descent (citation type: mentioning; confidence: 99%)
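The technique described in this excerpt can be made concrete with a minimal sketch (not the chapter's own implementation): Riemannian SGD on the unit sphere, where a normalization retraction stands in for the approximate geodesic and the cost function, noise model, and step sizes are illustrative assumptions.

```python
# Minimal sketch of Riemannian SGD on the unit sphere S^{n-1}: follow an
# approximate geodesic (here, a retraction) in the direction of the negative
# Riemannian gradient.  The loss, noise, and step sizes are illustrative
# assumptions, not taken from the cited works.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic problem: minimize L(x) = -x^T A x over the unit sphere,
# i.e., find the leading eigenvector of A.
A = rng.standard_normal((20, 20))
A = A @ A.T / 20.0

def stochastic_egrad(x):
    """Noisy estimate of the Euclidean gradient of L(x) = -x^T A x."""
    noise = 0.01 * rng.standard_normal(A.shape)
    return -2.0 * (A + noise) @ x

def project_to_tangent(x, g):
    """Project a Euclidean gradient onto the tangent space at x (unit x)."""
    return g - (x @ g) * x

def retract(x, v):
    """Retraction (first-order approximation of the geodesic) on the sphere."""
    y = x + v
    return y / np.linalg.norm(y)

x = rng.standard_normal(20)
x /= np.linalg.norm(x)
for k in range(500):
    step = 0.1 / (1 + k)                      # diminishing step size
    rgrad = project_to_tangent(x, stochastic_egrad(x))
    x = retract(x, -step * rgrad)             # move along the approximate geodesic

print("Rayleigh quotient:", x @ A @ x,
      "vs top eigenvalue:", np.linalg.eigvalsh(A)[-1])
```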
“…In particular, we explain in detail how to perform SGD on Riemannian manifolds arising from a finite-dimensional system of equations. Performing SGD on Riemannian manifolds has been studied before, e.g., [18][19][20][21][22]. Our method, in particular, heavily relies on the Implicit Function Theorem, which is used to construct explicit charts amenable to numerical computations.…”
Citation type: mentioning (confidence: 99%)
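The chart-based approach described in this excerpt can be illustrated on a toy constraint. The sketch below assumes the unit sphere, where the implicit function has a closed form; the cited work constructs such charts numerically for general finite-dimensional systems of equations, so the objective, dimensions, and step size here are illustrative assumptions.

```python
# Toy illustration of chart-based gradient descent on {x : F(x) = 0} via the
# Implicit Function Theorem.  Here F(x) = ||x||^2 - 1 (the unit sphere), so the
# implicit function v = phi(u) = sqrt(1 - ||u||^2) is known in closed form on
# the upper hemisphere; the objective is an illustrative assumption.
import numpy as np

def phi(u):
    """Implicit function giving the last coordinate on the upper hemisphere."""
    return np.sqrt(1.0 - u @ u)

def loss(x, a):
    return 0.5 * np.sum((x - a) ** 2)        # squared distance to a fixed point a

def chart_grad(u, a):
    """Gradient of u -> loss((u, phi(u)), a) via the chain rule (IFT)."""
    v = phi(u)
    x = np.append(u, v)
    g = x - a                                # Euclidean gradient of the loss
    dphi = -u / v                            # d phi / d u from the IFT
    return g[:-1] + g[-1] * dphi

a = np.array([0.3, 0.2, 0.9])
u = np.array([0.5, 0.5])                     # chart coordinates (first n-1 entries)
for _ in range(200):
    u = u - 0.1 * chart_grad(u, a)           # gradient step in chart coordinates
x = np.append(u, phi(u))
print("point on the sphere:", x, "constraint value:", x @ x)
```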
“…The construction of this metric aims to approximate the Hessian of the cost function by its "diagonal blocks". These algorithms improve the performance of Euclidean-based algorithms, and are successfully applied to matrix and tensor completion (e.g., [25,19,12,9]). However, the extension to TR is not straightforward since it involves large matrix computation and formulation.…”
Section: Preliminaries (citation type: mentioning; confidence: 99%)
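The "diagonal block" Hessian approximation mentioned in this excerpt can be illustrated in the simplest matrix-completion setting (not the tensor-ring case the excerpt is about). In the sketch below, with the factorization X = L R^T, the block gradients are scaled by the Gram matrices (R^T R)^{-1} and (L^T L)^{-1}, one standard way of approximating the diagonal blocks of the Hessian of the least-squares cost; sizes, rank, initialization, and step size are illustrative assumptions rather than the cited algorithms.

```python
# Minimal sketch of a preconditioned (scaled) gradient step for matrix
# completion with X = L @ R.T.  Scaling the block gradients by (R^T R)^{-1}
# and (L^T L)^{-1} approximates the diagonal blocks of the Hessian of the
# least-squares cost.  All problem sizes are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(1)
m, n, r = 60, 50, 3
M = rng.standard_normal((m, r)) @ rng.standard_normal((r, n))   # ground truth
mask = rng.random((m, n)) < 0.4                                  # observed entries
p = mask.mean()

# Spectral initialization from the rescaled observed matrix.
U, s, Vt = np.linalg.svd((mask * M) / p, full_matrices=False)
L = U[:, :r] * np.sqrt(s[:r])
R = Vt[:r, :].T * np.sqrt(s[:r])

step = 0.5
for _ in range(300):
    S = mask * (L @ R.T - M)                 # residual on observed entries
    grad_L = S @ R
    grad_R = S.T @ L
    # Scaled step: block-diagonal Hessian approximation via Gram matrices.
    pre_L = np.linalg.inv(R.T @ R)
    pre_R = np.linalg.inv(L.T @ L)
    L, R = L - step * grad_L @ pre_L, R - step * grad_R @ pre_R

print("relative completion error:",
      np.linalg.norm(L @ R.T - M) / np.linalg.norm(M))
```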
“…Therefore, low-rank matrix decompositions are widely used in matrix completion, which can save the computational cost and storage. In the same spirit, low-rank tensor decompositions play a significant role in tensor completion; applications can be found across various fields, e.g., recommendation systems [19,12], image processing [24], and interpolation of high-dimensional functions [32,21].…”
Citation type: mentioning (confidence: 99%)
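As a simple illustration of the storage saving referred to in this excerpt (an illustrative calculation, not from the cited works): an m × n matrix of rank r can be stored through its factors with r(m + n) numbers instead of mn; for m = n = 10,000 and r = 10 that is 2 × 10^5 values rather than 10^8.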