2020
DOI: 10.1088/2632-2153/abad87

Randomized algorithms for fast computation of low rank tensor ring model

Abstract: This work deals with developing two fast randomized algorithms for computing the generalized tensor singular value decomposition (GTSVD) based on the tubal product (t-product). The random projection method is used to capture the important actions of the underlying data tensors and to form small sketches of the original data tensors, which are easier to handle. Due to the small size of the sketch tensors, deterministic approaches are applied to them to compute their GTSVDs. Then, from the GTSVD of t…
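The two-stage scheme described in the abstract (a random projection that yields a small sketch, then a deterministic decomposition of that sketch) can be illustrated with a minimal NumPy sketch of the random-projection stage under the t-product. The helper names, tensor sizes, and the omission of the final GTSVD step are illustrative assumptions, not the paper's implementation:

import numpy as np

def t_product(A, B):
    # t-product of third-order tensors A (m x p x n) and B (p x q x n):
    # FFT along the third mode, frontal-slice matrix products, inverse FFT.
    Af = np.fft.fft(A, axis=2)
    Bf = np.fft.fft(B, axis=2)
    Cf = np.einsum('ijk,jlk->ilk', Af, Bf)
    return np.real(np.fft.ifft(Cf, axis=2))

def t_transpose(A):
    # Transpose under the t-product: transpose each frontal slice and
    # reverse the order of slices 2..n.
    At = np.transpose(A, (1, 0, 2))
    return np.concatenate((At[:, :, :1], At[:, :, :0:-1]), axis=2)

def randomized_range(A, sketch_size, rng):
    # Random projection step: capture the dominant action of A in a small
    # orthonormal tensor Q, computed slice by slice in the Fourier domain.
    m, p, n = A.shape
    Omega = rng.standard_normal((p, sketch_size, n))
    Y = t_product(A, Omega)                      # m x sketch_size x n
    Yf = np.fft.fft(Y, axis=2)
    Qf = np.empty_like(Yf)
    for k in range(n):                           # facewise QR per Fourier slice
        Qf[:, :, k], _ = np.linalg.qr(Yf[:, :, k])
    return np.real(np.fft.ifft(Qf, axis=2))

# Usage: compress A to the small sketch B = Q^T * A; a deterministic
# decomposition (the GTSVD step of the paper) would then be applied to B.
rng = np.random.default_rng(0)
A = rng.standard_normal((200, 150, 8))
Q = randomized_range(A, sketch_size=20, rng=rng)
B = t_product(t_transpose(Q), A)                 # 20 x 150 x 8 sketch
print(B.shape)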

Cited by 14 publications (12 citation statements); references 102 publications (46 reference statements).
“…Another advantage of the STHOSVD over the THOSVD is that in the STHOSVD algorithm the core tensor is computed automatically in the last step, while in the THOSVD algorithm all factor matrices are computed first and the core tensor is then obtained through formulation (17) or (19). This may lead to the intermediate data explosion phenomenon [15].…”
Section: A Sequentially Truncated HOSVD (STHOSVD) Algorithm (citation type: mentioning)
confidence: 99%
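To make the sequential truncation in this statement concrete, here is a minimal NumPy sketch of the STHOSVD loop, in which the working tensor shrinks after every mode so the core is available at the end without forming the large intermediate of the THOSVD. The function name and ranks are illustrative, and formulations (17)/(19) of the cited paper are not reproduced here:

import numpy as np

def sthosvd(X, ranks):
    # Sequentially truncated HOSVD: each factor matrix immediately compresses
    # the working tensor, so the final `core` is the Tucker core.
    core = X
    factors = []
    for mode, r in enumerate(ranks):
        # unfold the current (already partially truncated) tensor along `mode`
        unfolding = np.moveaxis(core, mode, 0).reshape(core.shape[mode], -1)
        U, _, _ = np.linalg.svd(unfolding, full_matrices=False)
        U = U[:, :r]
        factors.append(U)
        # project: the truncated factor shrinks the working tensor right away
        compressed = U.T @ unfolding
        new_shape = (r,) + tuple(np.delete(core.shape, mode))
        core = np.moveaxis(compressed.reshape(new_shape), 0, mode)
    return core, factors

# Usage on a random third-order tensor
rng = np.random.default_rng(0)
X = rng.standard_normal((40, 50, 60))
core, factors = sthosvd(X, ranks=(5, 6, 7))
print(core.shape)          # (5, 6, 7)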
“…
• Maxvol-based low-rank matrix approximation [98], [99]
• Cross2D matrix approximation [100], [101]
• Discrete Empirical Interpolation Method (DEIM) [103], [104]
• Pivoted QR decomposition [105]
It is known that the quality of the cross approximation depends strongly on the modulus of the determinant of the intersection matrix, which is called the matrix volume. More precisely, a set of columns and rows should be selected whose intersection matrix has as large a volume as possible.…”
Section: B Randomized Sampling Tucker Decomposition (citation type: mentioning)
confidence: 99%
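To make the volume criterion concrete, the following is a minimal NumPy sketch of a cross (skeleton) approximation in which the row/column pivots are chosen greedily from the residual, a simple surrogate for maxvol-style selection of an intersection matrix with large volume. The function and the test matrix are illustrative, not the constructions of references [98]-[105]:

import numpy as np

def cross_approximation(A, rank):
    # Greedy cross approximation: repeatedly pick the largest-magnitude entry
    # of the residual as the next pivot, then deflate with a rank-one update.
    R = A.astype(float).copy()
    rows, cols = [], []
    for _ in range(rank):
        i, j = np.unravel_index(np.argmax(np.abs(R)), R.shape)
        if R[i, j] == 0:              # residual exhausted
            break
        rows.append(i)
        cols.append(j)
        R -= np.outer(R[:, j], R[i, :]) / R[i, j]
    C = A[:, cols]                    # selected columns
    Rr = A[rows, :]                   # selected rows
    W = A[np.ix_(rows, cols)]         # intersection matrix; |det W| is its volume
    return C @ np.linalg.solve(W, Rr)

# Usage: recover a numerically low-rank matrix from a few rows and columns
rng = np.random.default_rng(0)
A = rng.standard_normal((300, 10)) @ rng.standard_normal((10, 200))
A_hat = cross_approximation(A, rank=10)
print(np.linalg.norm(A - A_hat) / np.linalg.norm(A))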