2018
DOI: 10.1002/nla.2202
Nonlinearly preconditioned L‐BFGS as an acceleration mechanism for alternating least squares with application to tensor decomposition

Abstract: We derive nonlinear acceleration methods based on the limited‐memory Broyden–Fletcher–Goldfarb–Shanno (L‐BFGS) update formula for accelerating iterative optimization methods of alternating least squares (ALS) type applied to canonical polyadic and Tucker tensor decompositions. Our approach starts from linear preconditioning ideas that use linear transformations encoded by matrix multiplications and extends these ideas to the case of genuinely nonlinear preconditioning, where the preconditioning operati…
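For readers unfamiliar with the baseline method being accelerated, the sketch below shows one plain ALS sweep for a rank-R canonical polyadic (CP) decomposition of a three-way tensor. It is a minimal NumPy illustration, not the paper's implementation; the names khatri_rao and als_sweep and the unfolding conventions are our own assumptions.

```python
import numpy as np

def khatri_rao(A, B):
    """Column-wise Khatri-Rao product of A (I x R) and B (J x R) -> (I*J x R)."""
    R = A.shape[1]
    return np.einsum('ir,jr->ijr', A, B).reshape(-1, R)

def als_sweep(T, A, B, C):
    """One ALS sweep for a rank-R CP model T[i,j,k] ~ sum_r A[i,r]*B[j,r]*C[k,r]."""
    I, J, K = T.shape
    # Mode-0 update: T.reshape(I, -1) ~= A @ khatri_rao(B, C).T
    A = np.linalg.lstsq(khatri_rao(B, C), T.reshape(I, -1).T, rcond=None)[0].T
    # Mode-1 update: move mode 1 to the front, columns ordered (i, k)
    T1 = np.moveaxis(T, 1, 0).reshape(J, -1)
    B = np.linalg.lstsq(khatri_rao(A, C), T1.T, rcond=None)[0].T
    # Mode-2 update: move mode 2 to the front, columns ordered (i, j)
    T2 = np.moveaxis(T, 2, 0).reshape(K, -1)
    C = np.linalg.lstsq(khatri_rao(A, B), T2.T, rcond=None)[0].T
    return A, B, C
```

Repeating als_sweep until the fit stagnates gives the baseline ALS iteration that the L‐BFGS‐based scheme of the paper is designed to accelerate.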

Cited by 18 publications (28 citation statements). References 58 publications (201 reference statements).
“…The convergence theory of Nesterov's accelerated gradient method for convex problems does not apply in our case due to the non-convex setting of the CP problem, and because we accelerate ALS steps instead of gradient steps. In fact, in the context of nonlinear convergence acceleration for ALS, few theoretical results on convergence are available [2,3,4]. We will, however, demonstrate numerically, for representative synthetic and real-world test problems, that our Nesterov-accelerated ALS methods are competitive with or outperform existing acceleration methods for ALS.…”
Section: 3 (mentioning)
confidence: 89%
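A minimal sketch of the quoted idea, assuming an als_step routine (such as the ALS sweep shown after the abstract) applied to factor matrices flattened into a single vector; the momentum weight and the restart test below are generic choices for illustration, not the citing paper's exact scheme.

```python
def nesterov_als(T, x0, als_step, objective, iters=100):
    """Accelerate ALS with a Nesterov-style extrapolation and a simple restart."""
    x_prev = x0
    y = x0
    for k in range(1, iters + 1):
        x = als_step(T, y)                     # one ALS sweep in place of a gradient step
        beta = (k - 1.0) / (k + 2.0)           # generic momentum weight (an assumption)
        y = x + beta * (x - x_prev)            # extrapolated point for the next sweep
        if objective(T, y) > objective(T, x):  # restart heuristic: extrapolation made things worse
            y = x
        x_prev = x
    return x_prev
```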
“…(1.4), by an ALS step. Replacing gradient directions by update directions provided by ALS is essentially also the approach taken in [2,3,4] to obtain nonlinear acceleration of ALS by NGMRES, NCG and LBFGS; in the case of Nesterov's method the procedure is extremely simple and easy to implement. However, applying this procedure directly fails for several reasons.…”
Section: 3 (mentioning)
confidence: 99%
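One way to illustrate the "replace gradient directions by ALS update directions" idea is to use the ALS correction q(x) = ALS(x) − x as the search direction inside a plain backtracking line search on the CP objective f. This is only a toy substitution under our own naming, not the NGMRES, NCG, or L-BFGS accelerations of [2,3,4].

```python
def als_direction_descent(T, x, als_step, f, iters=50):
    """Toy loop: the ALS correction replaces -grad f as the search direction."""
    for _ in range(iters):
        d = als_step(T, x) - x            # q(x) = ALS(x) - x, the "preconditioned" direction
        alpha, fx = 1.0, f(T, x)
        while f(T, x + alpha * d) > fx and alpha > 1e-8:
            alpha *= 0.5                  # backtrack until the objective does not increase
        x = x + alpha * d
    return x
```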
“…In (32), for each iteration step k, one is free to pick H_k^0. In the original implementation of the algorithm, in order to reduce the condition numbers of H_k, the diagonal is scaled with the Cholesky factor δ_k [58], …”
Section: Preconditioning (mentioning)
confidence: 99%
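For context on picking H_k^0, the textbook limited-memory two-loop recursion builds the quasi-Newton direction implicitly from a few stored (s, y) pairs. The sketch below uses the common scaling H_k^0 = γ_k I with γ_k = sᵀy / yᵀy; the Cholesky-factor scaling δ_k mentioned in the quote is a different, paper-specific choice.

```python
def lbfgs_direction(grad, s_list, y_list):
    """Return -H_k @ grad via the two-loop recursion; s_list/y_list are oldest-first."""
    q = grad.copy()
    alphas = []
    rhos = [1.0 / (y @ s) for s, y in zip(s_list, y_list)]
    # First loop: newest pair to oldest
    for s, y, rho in zip(reversed(s_list), reversed(y_list), reversed(rhos)):
        a = rho * (s @ q)
        alphas.append(a)
        q -= a * y
    # Scaled initial Hessian approximation H_k^0 = gamma_k * I
    if s_list:
        gamma = (s_list[-1] @ y_list[-1]) / (y_list[-1] @ y_list[-1])
    else:
        gamma = 1.0
    r = gamma * q
    # Second loop: oldest pair to newest
    for (s, y, rho), a in zip(zip(s_list, y_list, rhos), reversed(alphas)):
        b = rho * (y @ r)
        r += (a - b) * s
    return -r
```

Because only the last few (s, y) pairs are stored, the cost per iteration stays linear in the number of unknowns, which is what makes the limited-memory variant attractive for very large systems.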
“…In the limited-memory (L-BFGS) variant [22,23], the approximation to H is constructed from a small number of vectors by a rank-one update formula; see Eqn. (32) below. The resulting algorithm is still considered the state-of-the-art method when huge systems of equations with a very large number of unknowns need to be solved.…”
Section: Introduction (mentioning)
confidence: 98%