2018
DOI: 10.1137/17m1148712
Alternating Least Squares as Moving Subspace Correction

Abstract: In this note we take a new look at the local convergence of alternating optimization methods for low-rank matrices and tensors. Our abstract interpretation as sequential optimization on moving subspaces yields insightful reformulations of some known convergence conditions that focus on the interplay between the contractivity of classical multiplicative Schwarz methods with overlapping subspaces and the curvature of low-rank matrix and tensor manifolds. While the verification of the abstract conditions in concr…

Cited by 15 publications (18 citation statements). References 15 publications.
“…The argument for Ŝ₂ is analogous. The above invariance of the AO method allows us to interpret it as a method X_{ℓ+1} = S(X_ℓ) in the full matrix space ℝ^{m×n}, or more precisely on the subset of matrices of rank at most k. This viewpoint has been taken in [13] and will be helpful in this work, too. From an algorithmic perspective, the AO viewpoint (2.2) is more useful since it operates on the smaller matrices U and V instead of the full matrix X.…”
Section: Standard AO Methods (mentioning; confidence: 99%)
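The two viewpoints in this excerpt are easy to see side by side. Below is a minimal numpy sketch (my own illustration under assumed notation, not code from the paper or from [13]): `ao_sweep` is the factor iteration the excerpt calls viewpoint (2.2), operating on the small matrices U and V, while `S` is the induced map X_{ℓ+1} = S(X_ℓ) on matrices of rank at most k, obtained by factoring X, sweeping, and multiplying back out.

```python
import numpy as np

def ao_sweep(A, U, V):
    """One AO sweep on the factors of min ||A - U V^T||_F (the factor viewpoint)."""
    # Half-step 1: minimize over U with V fixed (a linear least-squares solve).
    U = np.linalg.lstsq(V, A.T, rcond=None)[0].T
    # Half-step 2: minimize over V with the updated U fixed.
    V = np.linalg.lstsq(U, A, rcond=None)[0].T
    return U, V

def S(A, X, k):
    """The induced map X -> S(X) on matrices of rank at most k."""
    # Factor X (any rank-k factorization works; a truncated SVD is used here),
    # run one factor sweep, and multiply back out.
    P, s, Qt = np.linalg.svd(X, full_matrices=False)
    U, V = ao_sweep(A, P[:, :k] * s[:k], Qt[:k].T)
    return U @ V.T

rng = np.random.default_rng(0)
A = rng.standard_normal((30, 20))
X = S(A, A, k=3)   # one full-space step toward a rank-3 approximation of A
```

As the excerpt points out, the factor form is the computationally useful one, since it touches only the small matrices U and V per sweep; the map S serves mainly as an analytical device.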
“…While the study of global convergence of the AO method (1.3) is usually difficult, its local convergence properties are well understood [16,14,13]. The local analysis is based on the fact that the linearized version of the method at a critical point (U*, V*) takes the form of a block Gauss-Seidel method for the Hessian ∇²F(U*, V*).…”
Section: Introduction (mentioning; confidence: 99%)
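To make the linear-algebra statement concrete, here is a small numpy sketch (an illustration under an assumed 2×2 block partition, not the paper's code) of one block Gauss-Seidel sweep for a Hessian split into its U- and V-blocks, together with the error iteration matrix whose spectral radius gives the local convergence rate of the AO method.

```python
import numpy as np

# One block Gauss-Seidel sweep for the 2x2 block system
#     [H_UU  H_UV] [x_U]   [b_U]
#     [H_VU  H_VV] [x_V] = [b_V],
# mirroring the two half-steps of the AO method.
def block_gauss_seidel_sweep(H_UU, H_UV, H_VU, H_VV, b_U, b_V, x_U, x_V):
    x_U = np.linalg.solve(H_UU, b_U - H_UV @ x_V)  # U-block solve, V frozen
    x_V = np.linalg.solve(H_VV, b_V - H_VU @ x_U)  # V-block solve, with new U
    return x_U, x_V

# The asymptotic rate is the spectral radius of G = I - M^{-1} H, where M is
# the block lower-triangular part of H (a synthetic SPD "Hessian" here).
n = 3
rng = np.random.default_rng(0)
B = rng.standard_normal((2 * n, 2 * n))
H = B @ B.T + 2 * n * np.eye(2 * n)
M = H.copy()
M[:n, n:] = 0.0                                # zero the upper off-diagonal block
G = np.eye(2 * n) - np.linalg.solve(M, H)
rate = max(abs(np.linalg.eigvals(G)))          # local contraction factor
print(f"block Gauss-Seidel rate: {rate:.3f}")
```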
“…We focus here on its accuracy in comparison with the two other methods via numerical experiments. We emphasize that for tensors of order at least 3, convergence can be shown for the (inner) iteration (see [11,25,29,30]). This limit, however, is not a global minimum in general.…”
Section: Method 2: An Alternating Least Squares Type Iteration (mentioning; confidence: 99%)
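For readers who want the order-3 iteration spelled out, here is a minimal numpy sketch of one ALS sweep for a rank-r CP approximation. This is my own generic illustration of an "alternating least squares type iteration", not the specific method of [11,25,29,30]: each half-step is a linear least-squares solve in one factor, so the fit error decreases monotonically, and the limit of the iteration need not be a global minimum.

```python
import numpy as np

def khatri_rao(B, C):
    """Column-wise Kronecker (Khatri-Rao) product of B (J x r) and C (K x r)."""
    return (B[:, None, :] * C[None, :, :]).reshape(-1, B.shape[1])

def cp_als_sweep(T, A, B, C):
    """One ALS sweep over the three factors of T ~ sum_r a_r (o) b_r (o) c_r."""
    # Each unfolding turns the update into an ordinary least-squares solve.
    T1 = T.reshape(T.shape[0], -1)                           # mode-1 unfolding
    A = np.linalg.lstsq(khatri_rao(B, C), T1.T, rcond=None)[0].T
    T2 = np.transpose(T, (1, 0, 2)).reshape(T.shape[1], -1)  # mode-2 unfolding
    B = np.linalg.lstsq(khatri_rao(A, C), T2.T, rcond=None)[0].T
    T3 = np.transpose(T, (2, 0, 1)).reshape(T.shape[2], -1)  # mode-3 unfolding
    C = np.linalg.lstsq(khatri_rao(A, B), T3.T, rcond=None)[0].T
    return A, B, C

# Usage: recover a synthetic rank-2 tensor from a random initialization.
rng = np.random.default_rng(2)
I, J, K, r = 8, 7, 6, 2
A0, B0, C0 = (rng.standard_normal((d, r)) for d in (I, J, K))
T = np.einsum('ir,jr,kr->ijk', A0, B0, C0)
A, B, C = (rng.standard_normal((d, r)) for d in (I, J, K))
for _ in range(50):
    A, B, C = cp_als_sweep(T, A, B, C)
```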
“…Step 1: Given [45]. The modified ALS iteration has recently been considered in Oseledets et al. [38] under the name "simultaneous orthogonal iterations". It is shown there that the modified version is equivalent to ALS and, therefore, has the same rate of convergence.…”
Section: The Alternating Least Squares (ALS) Method (mentioning; confidence: 99%)
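The orthogonalized variant is easy to sketch in the matrix case. The numpy code below is my own guess at the flavor of the modification, not code from [38] or [45]: each half-step's least-squares solution is replaced by an orthonormal basis of its column span. Because each ALS half-step depends on the other factor only through its span, the orthogonalized sweep reproduces the plain-ALS iterates, which illustrates the equivalence (and hence the identical convergence rate) that the excerpt refers to.

```python
import numpy as np

def modified_als_sweep(A, U, V):
    """One ALS sweep for min ||A - U V^T||_F keeping the factors orthonormal."""
    # U half-step: the ALS update U <- A V (V^T V)^{-1} enters the next
    # half-step only through span(U), so keep an orthonormal basis instead.
    U, _ = np.linalg.qr(np.linalg.lstsq(V, A.T, rcond=None)[0].T)
    # V half-step: with U orthonormal the least-squares solution is A^T U.
    V, _ = np.linalg.qr(A.T @ U)
    # The rank-k iterate is recovered from the two subspaces; this equals
    # P_U A, i.e. the iterate plain ALS would produce after the same sweep.
    X = U @ (U.T @ A @ V) @ V.T
    return U, V, X
```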