ICASSP 2020 - 2020 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
DOI: 10.1109/icassp40776.2020.9053849
Extrapolated Alternating Algorithms for Approximate Canonical Polyadic Decomposition

Abstract: Tensor decompositions have become a central tool in machine learning to extract interpretable patterns from multiway arrays of data. However, computing the approximate Canonical Polyadic Decomposition (aCPD), one of the most important tensor decomposition models, remains a challenge. In this work, we propose several algorithms based on extrapolation that improve over existing alternating methods for aCPD. We show on several simulated and real data sets that carefully designed extrapolation can significantly improve…
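The idea the abstract describes, extrapolating between alternating least-squares (ALS) sweeps when fitting a CP model, can be illustrated with a short sketch. The code below is a minimal illustration and not the authors' exact algorithm: plain ALS for a rank-R CP approximation of a 3-way NumPy array, with a heuristic Nesterov-style extrapolation of the factor matrices after each update and a restart whenever the relative error increases. All names and parameters (khatri_rao, cp_als_extrapolated, beta, and so on) are illustrative assumptions, not identifiers from the paper.

```python
import numpy as np


def khatri_rao(B, C):
    """Column-wise Khatri-Rao product of B (J x R) and C (K x R) -> (J*K x R)."""
    R = B.shape[1]
    return (B[:, None, :] * C[None, :, :]).reshape(-1, R)


def unfold(X, mode):
    """Mode-n unfolding of a 3-way array: rows index the chosen mode."""
    return np.moveaxis(X, mode, 0).reshape(X.shape[mode], -1)


def cp_als_extrapolated(X, rank, n_iter=200, beta=0.5, seed=0):
    """Rank-`rank` CP approximation of a 3-way tensor X by ALS, with a
    heuristic Nesterov-style extrapolation of the factors and a restart
    whenever the relative error increases (a sketch, not the paper's scheme)."""
    rng = np.random.default_rng(seed)
    factors = [rng.standard_normal((d, rank)) for d in X.shape]   # current iterates
    factors_ex = [F.copy() for F in factors]                      # extrapolated copies
    unfoldings = [unfold(X, n) for n in range(3)]
    norm_x = np.linalg.norm(X)
    prev_err, errors = np.inf, []

    for _ in range(n_iter):
        # One ALS sweep: each factor is updated using the extrapolated copies
        # of the other two, then its own extrapolated copy is refreshed.
        for n in range(3):
            o1, o2 = [factors_ex[m] for m in range(3) if m != n]
            kr = khatri_rao(o1, o2)               # column order matches the unfolding
            gram = (o1.T @ o1) * (o2.T @ o2)      # Hadamard product of Gram matrices
            new = np.linalg.solve(gram, (unfoldings[n] @ kr).T).T
            factors_ex[n] = new + beta * (new - factors[n])
            factors[n] = new
        # Relative error of the plain (non-extrapolated) iterate.
        approx = np.einsum('ir,jr,kr->ijk', *factors)
        err = np.linalg.norm(X - approx) / norm_x
        errors.append(err)
        if err > prev_err:
            # Restart: drop the momentum and continue from the plain iterate.
            factors_ex = [F.copy() for F in factors]
        prev_err = err
    return factors, errors


# Example: fit a random rank-5 tensor and report the final relative error.
rng = np.random.default_rng(1)
true = [rng.standard_normal((d, 5)) for d in (20, 30, 40)]
X = np.einsum('ir,jr,kr->ijk', *true)
factors, errors = cp_als_extrapolated(X, rank=5, n_iter=100)
print(f"final relative error: {errors[-1]:.2e}")
```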

Cited by 7 publications (16 citation statements)
References 16 publications (19 reference statements)
“…Another possible extension is support for other loss functions, such as the KL-divergence for count data or weighted least squares for missing data. It may also be beneficial to use Nesterov-type extrapolation, which has been successfully used to speed up ALS schemes for CP [36,44] as well as ALS and flexible coupling schemes for PARAFAC2 [50].…”
Section: Discussion (mentioning)
confidence: 99%
“…Furthermore, it can be worthwhile to consider Nesterov-type acceleration to increase the efficiency of the AO-ADMM framework. Accelerations of this type have previously been applied to ALS for fitting CP decompositions [74,75].…”
Section: Discussion (mentioning)
confidence: 99%
“…[13][14][15][16] Recently, several accelerated versions of ALS exploit the idea of extrapolation to speed up convergence, and they have demonstrated strong performance against plain ALS [17,18]. A key to these algorithms is the use of sophisticated adaptive extrapolation, but they lack a mechanism that directly addresses the convergence bottlenecks. Our work proposes an adaptive random perturbation mechanism to explicitly overcome the convergence bottlenecks.…”
Section: Motivation (mentioning)
confidence: 99%
“…[1] The alternating least squares (ALS) algorithm is often used for computing the CP decomposition, but it suffers from slow convergence in many cases. Following a number of previous works, [13][14][15][16][17] we propose a new accelerated ALS algorithm.…”
Section: Introduction (mentioning)
confidence: 99%