2012
DOI: 10.1007/s00791-014-0218-7
A note on tensor chain approximation

Cited by 22 publications (21 citation statements)
References 24 publications
“…The partial trace operator can be used for describing the tensor chain (TC) decomposition [12,26] simply by slightly modifying the suggested TT representations. Properties of TC decomposition should be more investigated in the future work.…”
Section: Discussion (mentioning; confidence: 99%)
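As background for the quoted statement: a tensor chain (TC, also called tensor ring) representation stores d cores G_k of shape r_{k-1} × n_k × r_k with r_0 = r_d, and recovers an entry of the tensor as the trace of a product of lateral core slices. A minimal numpy sketch of this evaluation (illustrative only; not code from the cited papers):

```python
import numpy as np

def tc_entry(cores, index):
    """Entry T[i_1, ..., i_d] of a tensor chain representation.

    Each core in `cores` has shape (r_{k-1}, n_k, r_k) with r_0 = r_d;
    the entry is trace(G_1[:, i_1, :] @ ... @ G_d[:, i_d, :]).
    Illustrative helper -- name and interface are not from the paper.
    """
    M = np.eye(cores[0].shape[0])
    for G, i in zip(cores, index):
        M = M @ G[:, i, :]      # multiply the selected lateral slices
    return np.trace(M)          # the trace closes the chain into a ring

# Tiny random example: 3 cores, mode sizes (2, 3, 2), all ring ranks 2.
rng = np.random.default_rng(0)
cores = [rng.standard_normal((2, n, 2)) for n in (2, 3, 2)]
val = tc_entry(cores, (1, 2, 0))
```

The trace is exactly the "partial trace" contraction that closes a TT-like chain into a cycle, which is what the excerpt alludes to.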
“…Algorithm 1 computes the quantity δ := ε‖T‖_F/√d and requires a manual input of a divisor r₀ of rank_δ T⟨1⟩. The choice of r₀ by Zhao et al (and also for a related algorithm based on the skeleton/cross approximation) is to minimize |r₀ − rank_δ(T⟨1⟩)/r₀|, but examples (see Section 5.1) show that this can lead to suboptimal compression ratios.…”
Section: Conversion From Full Format To TR-format (mentioning; confidence: 99%)
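The divisor rule criticized above can be made concrete: if R = rank_δ T⟨1⟩ must be split into two ring ranks r₀ and R/r₀, the "balanced" choice picks the divisor of R minimizing |r₀ − R/r₀|. A short illustrative sketch (the helper name `balanced_divisor` is hypothetical, not from the paper):

```python
def balanced_divisor(R):
    """Among the divisors r0 of R, return the one minimizing |r0 - R/r0|.

    Illustrative of the 'balanced' choice discussed in the excerpt:
    splitting the delta-rank R of the first unfolding into the two
    adjacent ring ranks r0 and R/r0 as evenly as possible.
    """
    divisors = [r for r in range(1, R + 1) if R % r == 0]
    return min(divisors, key=lambda r: abs(r - R / r))

# For R = 12 the divisors are 1, 2, 3, 4, 6, 12; both 3 and 4 give
# |r0 - R/r0| = 1, and min() returns the first minimizer, 3.
print(balanced_divisor(12))  # 3
```

As the excerpt notes, this rule only balances the two ranks adjacent to the first split; it need not yield the best overall compression ratio.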
“…We convert a tensor given in full format into a TR-representation and compute its storage cost. We compare the TT-representation with r₀ = 1, Algorithm 1 using a balanced representation with r₀ = argmin |r₀ − rank_δ(T⟨1⟩)/r₀|, and Algorithm 2, to Algorithm 3. We do not compare to other algorithms for TR-decompositions, since these have already been compared to the TR-SVD algorithm in the literature.…”
Section: Computational Experiments (mentioning; confidence: 99%)
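For the storage-cost comparison described above, the memory of a TR-representation is simply the total number of core entries, Σ_k r_{k−1} n_k r_k with r_d = r_0; the TT-format is the special case r_0 = 1. A minimal sketch (the formula is the standard one from the TR literature, not quoted from the paper, and the helper name is hypothetical):

```python
def tr_storage(mode_sizes, ranks):
    """Total entries of TR cores G_k of shape (r_{k-1}, n_k, r_k).

    `ranks` lists (r_0, ..., r_{d-1}); the chain closes with r_d = r_0.
    Illustrative helper -- name and interface are not from the paper.
    """
    d = len(mode_sizes)
    return sum(ranks[k] * mode_sizes[k] * ranks[(k + 1) % d] for k in range(d))

# TT-format as the special case r_0 = 1: cores 1x4x2, 2x4x2, 2x4x1.
print(tr_storage((4, 4, 4), (1, 2, 2)))  # 8 + 16 + 8 = 32 entries
```

Comparing such counts across r₀ choices is what drives the compression-ratio experiments the excerpt refers to.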
“…The canonical polyadic decomposition (CPD) [2,15,16] and the Tucker decomposition [2,27] both generalize the notion of the matrix singular value decomposition (SVD) to higher-order tensors and have, therefore, received a lot of attention. More recent tensor decompositions are the tensor train (TT) [8,9,18,21] and the hierarchical Tucker decomposition [12,13]. It turns out that the latter two decompositions were already known in the quantum mechanics and condensed matter physics communities as the matrix product state (MPS) [23] and the tensor tree network (TTN) [25], respectively.…”
Section: Introduction (mentioning; confidence: 99%)