2021
DOI: 10.48550/arxiv.2111.10448
Preprint
Parallel algorithms for computing the tensor-train decomposition

Abstract: The tensor-train (TT) decomposition expresses a tensor in a data-sparse format used in molecular simulations, high-order correlation functions, and optimization. In this paper, we propose four parallelizable algorithms that compute the TT format from various tensor inputs: (1) Parallel-TTSVD for traditional format, (2) PSTT and its variants for streaming data, (3) Tucker2TT for Tucker format, and (4) TT-fADI for solutions of Sylvester tensor equations. We provide theoretical guarantees of accuracy, paralleliza…
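For context, the classical sequential TT-SVD of Oseledets is the baseline that the paper's parallel algorithms improve upon. Below is a minimal NumPy sketch of that sequential algorithm (not of the parallel variants proposed in the paper); the function names and the truncation threshold `eps` are illustrative choices, not from the paper.

```python
import numpy as np

def tt_svd(tensor, eps=1e-10):
    """Sequential TT-SVD sketch: factor a d-way array into a list of
    3-way TT cores via repeated truncated SVDs of unfoldings.
    `eps` is a relative singular-value cutoff (an illustrative choice)."""
    shape = tensor.shape
    d = len(shape)
    cores = []
    r_prev = 1
    # First unfolding: rows indexed by the leading mode, columns by the rest.
    mat = tensor.reshape(r_prev * shape[0], -1)
    for k in range(d - 1):
        U, s, Vt = np.linalg.svd(mat, full_matrices=False)
        r = max(1, int(np.sum(s > eps * s[0])))  # truncation rank
        cores.append(U[:, :r].reshape(r_prev, shape[k], r))
        # Carry the remainder forward and fold in the next mode.
        mat = (s[:r, None] * Vt[:r]).reshape(r * shape[k + 1], -1)
        r_prev = r
    cores.append(mat.reshape(r_prev, shape[-1], 1))
    return cores

def tt_reconstruct(cores):
    """Contract TT cores back into a full tensor (for checking accuracy)."""
    full = cores[0]
    for core in cores[1:]:
        full = np.tensordot(full, core, axes=([-1], [0]))
    return full.squeeze(axis=(0, -1))
```

With `eps` near machine precision the reconstruction is exact up to rounding; larger values of `eps` trade accuracy for smaller TT ranks. The d-1 SVDs here are inherently sequential, which is precisely the dependency the paper's Parallel-TTSVD and PSTT algorithms break.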

Cited by 3 publications (11 citation statements)
References 44 publications (64 reference statements)
“…Theorem 7 below. We again remark that this independent set of equations is similar to the ones presented in a recent work [17]. However the motivation there is to determine the cores in a parallel fashion and not for improving statistical estimation.…”
Section: Main Idea of the Algorithm
confidence: 60%
“…Throughout this section, we assume that the input p of TT-RS (Algorithm 1) is a Markov model, that is, p is a probability density function and satisfies (17) p(x_1, …”
Section: Application of TT-RS to Markov Model
confidence: 99%