Handbook of Variational Methods for Nonlinear Geometric Data 2020
DOI: 10.1007/978-3-030-31351-7_9

Geometric Methods on Low-Rank Matrix and Tensor Manifolds

Abstract: In this chapter we present numerical methods for low-rank matrix and tensor problems that explicitly make use of the geometry of rank-constrained matrix and tensor spaces. We focus on two types of problems: the first are optimization problems, like matrix and tensor completion, solving linear systems and eigenvalue problems. Such problems can be solved by numerical optimization for manifolds, called Riemannian optimization methods. We will explain the basic elements of differential geometry in order to apply s…
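The Riemannian optimization approach described in the abstract can be illustrated on its simplest instance, fixed-rank matrix completion. The following is a minimal sketch, not the chapter's own code; function names and the step-size choice are my own. It combines the three standard ingredients: the Euclidean gradient of the completion objective, its projection onto the tangent space of the rank-r manifold, and a retraction by truncated SVD.

```python
import numpy as np

def truncated_svd(X, r):
    """Best rank-r approximation via truncated SVD (the standard retraction)."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return U[:, :r] @ np.diag(s[:r]) @ Vt[:r, :]

def tangent_project(X, Z, r):
    """Project Z onto the tangent space of the rank-r manifold at X."""
    U, _, Vt = np.linalg.svd(X, full_matrices=False)
    U, V = U[:, :r], Vt[:r, :].T
    PU, PV = U @ U.T, V @ V.T
    return PU @ Z + Z @ PV - PU @ Z @ PV

def complete(A, mask, r, step=1.0, iters=300):
    """Riemannian gradient descent for matrix completion:
    minimize 0.5 * ||mask * (X - A)||_F^2 over rank-r matrices X."""
    X = truncated_svd(mask * A, r)               # feasible starting point
    for _ in range(iters):
        egrad = mask * (X - A)                   # Euclidean gradient
        rgrad = tangent_project(X, egrad, r)     # Riemannian gradient
        X = truncated_svd(X - step * rgrad, r)   # gradient step + retraction
    return X

# small demo: fit a random rank-2 matrix from 60% of its entries
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 2)) @ rng.standard_normal((2, 20))
mask = rng.random(A.shape) < 0.6
X = complete(A, mask, r=2)
```

In practice the tangent-space projection and the retraction are implemented in factored form (never materializing full matrices), which is what makes these methods scale; the dense version above is only meant to show the structure of one iteration.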

Cited by 26 publications (22 citation statements). References 109 publications.
“…Although the value of the new rank-adaptive method is application dependent, our observations provide insight on the kind of data matrices for which rank-adaptive mechanisms play a valuable role. This suggests that the proposed method might also perform well on other low-rank optimization problems, such as those mentioned in [5,12,16,19].…”
Section: Discussion
confidence: 86%
“…Figure 11 shows the singular values of the initial point (17), namely the rank-10 approximation of the zero-filled MovieLens dataset. It is observed that the largest gap is detected between the first two singular values by the rank reduction (16) for both examples. According to the pre-process (line 3 of Algorithm 1), RRAM-RBB will …”
Section: Test on Real-world Datasets
confidence: 91%
“…A closely related perspective builds upon a geometric fact that the set … [11,12]. This means that the problem…”
Section: Coherence and Restricted Isometry Property
confidence: 99%
“…Lastly, tensors of fixed Tucker and TT ranks form smooth embedded submanifolds M_r of R^{n_1 × ⋯ × n_d} [12]. An iteration of Riemannian gradient descent for Tucker recovery can be written with the help of notation we introduced above:…”
Section: Tensor Completion
confidence: 99%
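The rank-truncation step inside such a Tucker iteration is typically realized by the truncated higher-order SVD (HOSVD), which gives a quasi-optimal projection back onto the manifold of fixed Tucker rank. A minimal sketch of that truncation (helper names are my own, not from the cited work):

```python
import numpy as np

def unfold(T, mode):
    """Mode-k unfolding: move axis `mode` to the front and flatten the rest."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def mode_mult(T, M, mode):
    """Mode-k product: apply matrix M along axis `mode` of tensor T."""
    return np.moveaxis(np.tensordot(M, np.moveaxis(T, mode, 0), axes=1), 0, mode)

def hosvd_truncate(T, ranks):
    """Truncated HOSVD: quasi-optimal Tucker rank-(r_1,...,r_d) approximation.
    Usable as the rank-truncation step after a gradient update."""
    # leading left singular vectors of each unfolding give the factor matrices
    factors = [np.linalg.svd(unfold(T, k), full_matrices=False)[0][:, :r]
               for k, r in enumerate(ranks)]
    core = T
    for k, U in enumerate(factors):
        core = mode_mult(core, U.T, k)     # compress into the small core
    X = core
    for k, U in enumerate(factors):
        X = mode_mult(X, U, k)             # expand back to full size
    return X

# demo: a tensor of exact multilinear rank (2,2,2) is reproduced exactly
rng = np.random.default_rng(0)
G = rng.standard_normal((2, 2, 2))
T = G
for k, n in enumerate((4, 5, 6)):
    T = mode_mult(T, rng.standard_normal((n, 2)), k)
X = hosvd_truncate(T, (2, 2, 2))
```

For TT ranks the analogous truncation is done by sweeping SVDs over the train cores (TT rounding), with the same role in the iteration.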
“…The main contribution of this paper is a modified IHT algorithm that makes use of the manifold properties (of the smooth part) of the set M_k by applying a tangent space projection to the search direction of IHT. This approach is inspired by Riemannian low-rank optimization, which has been shown to be efficient in several applications, including matrix completion and matrix equations; see [30] for an overview. We demonstrate that in the important case of rank-one measurements, which includes the problem of blind deconvolution, the additional tangent space projection allows for a significant reduction of computational cost since the projection of the gradient onto the tangent space can be efficiently realized even for large low-rank matrices.…”
Section: Contribution and Outline
confidence: 99%
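The modification described in this citation statement — projecting the IHT search direction onto the tangent space before hard thresholding — can be sketched for generic linear measurements as follows. This is my own illustrative version, not the cited paper's algorithm: it uses dense Gaussian measurement matrices rather than the rank-one measurements where the cost savings actually arise, and all names and parameters are assumptions.

```python
import numpy as np

def hard_threshold(X, r):
    """H_r: best rank-r approximation (truncated SVD)."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return U[:, :r] @ np.diag(s[:r]) @ Vt[:r, :]

def tangent_project(X, Z, r):
    """Projection of Z onto the tangent space of the rank-r manifold at X."""
    U, _, Vt = np.linalg.svd(X, full_matrices=False)
    U, V = U[:, :r], Vt[:r, :].T
    PU, PV = U @ U.T, V @ V.T
    return PU @ Z + Z @ PV - PU @ Z @ PV

def riemannian_iht(Ms, y, r, step=0.5, iters=200):
    """IHT with an extra tangent-space projection on the search direction:
    X <- H_r( X + step * P_T( A*(y - A(X)) ) )."""
    X = hard_threshold(np.tensordot(y, Ms, axes=1), r)   # spectral initialization
    for _ in range(iters):
        resid = y - np.tensordot(Ms, X, axes=2)          # y - A(X)
        G = np.tensordot(resid, Ms, axes=1)              # adjoint A*(resid)
        X = hard_threshold(X + step * tangent_project(X, G, r), r)
    return X

# demo: recover a rank-2 10x10 matrix from 500 Gaussian measurements
rng = np.random.default_rng(1)
n, r, m = 10, 2, 500
Xtrue = rng.standard_normal((n, r)) @ rng.standard_normal((r, n))
Ms = rng.standard_normal((m, n, n)) / np.sqrt(m)         # measurement matrices A_i
y = np.tensordot(Ms, Xtrue, axes=2)                      # y_i = <A_i, Xtrue>
Xhat = riemannian_iht(Ms, y, r)
```

The point of the tangent projection is that it restricts the update to a low-dimensional subspace, so for structured (e.g. rank-one) measurements the projected gradient can be formed without ever building the full gradient matrix.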