2021
DOI: 10.1137/20m1321838

Time Integration of Tree Tensor Networks

Abstract: Dynamical low-rank approximation by tree tensor networks is studied for the data-sparse approximation to large time-dependent data tensors and unknown solutions of tensor differential equations. A time integration method for tree tensor networks of prescribed tree rank is presented and analyzed. It extends the known projector-splitting integrators for dynamical low-rank approximation by matrices and Tucker tensors and is shown to inherit their favorable properties. The integrator is based on recursively applying…

Cited by 30 publications (24 citation statements). References 30 publications (79 reference statements).
“…Thus, we can conclude that DLRA provides an opportunity to battle the curse of dimensionality, since the required memory to achieve a satisfactory solution approximation grows moderately with dimension. When employing tensor approximations as presented in [48], we expect linear instead of exponential growth with respect to the dimension. However, we leave an extension to higher-dimensional uncertain domains, in which this strategy becomes crucial, to future work.…”
Section: Numerical Results (mentioning)
confidence: 99%
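The claimed linear-versus-exponential memory growth can be made concrete with a parameter count. The sketch below (function names are illustrative, not from the paper) compares a full d-way tensor with mode size n against a tensor train — a maximally unbalanced tree tensor network — with all ranks equal to r, whose storage grows linearly in d:

```python
def full_params(d, n):
    """Number of entries in a full d-way tensor with mode size n: n**d."""
    return n ** d

def tt_params(d, n, r):
    """Parameter count of a tensor train with d cores (d >= 2), mode
    size n, and uniform rank r: two boundary cores of size n*r plus
    d-2 interior cores of size r*n*r -- linear in d."""
    return 2 * n * r + (d - 2) * n * r * r
```

For d = 10, n = 10, r = 5 this gives 2,100 parameters for the tensor train against 10^10 for the full tensor, and adding modes increases the count by a fixed n·r² per mode.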
“…In order to further increase the uncertain dimension efficiently, we aim to perform further splitting of the random domain according to [48]. Here, the unconventional integrator will be of high interest, since it allows for parallel solves of all spatial and uncertain basis functions.…”
Section: Discussion (mentioning)
confidence: 99%
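The "unconventional integrator" referred to here is the basis-update & Galerkin (BUG) scheme, whose appeal for the cited application is that its two basis-update substeps are independent and can run in parallel. The following is a minimal fixed-rank matrix sketch in NumPy, assuming a generic right-hand side F; each substep ODE is advanced by a single explicit Euler step for brevity, whereas a practical implementation would integrate each substep accurately:

```python
import numpy as np

def bug_step(U0, S0, V0, F, h):
    """One step of the 'unconventional' basis-update & Galerkin (BUG)
    integrator for dY/dt = F(Y) at fixed rank r, with Y = U S V^T.
    Sketch only: each substep uses a single explicit Euler step."""
    Y0 = U0 @ S0 @ V0.T
    # K-step: update the left basis. Independent of the L-step below,
    # so the two could be solved in parallel.
    K = U0 @ S0 + h * F(Y0) @ V0
    U1, _ = np.linalg.qr(K)
    M = U1.T @ U0
    # L-step: update the right basis (parallel to the K-step).
    L = V0 @ S0.T + h * F(Y0).T @ U0
    V1, _ = np.linalg.qr(L)
    N = V1.T @ V0
    # Galerkin step for the core S in the updated bases.
    S = M @ S0 @ N.T
    S1 = S + h * U1.T @ F(U1 @ S @ V1.T) @ V1
    return U1, S1, V1
```

With F = 0 the step reproduces Y0 exactly (the projections onto the updated bases leave Y0 unchanged), which makes a convenient sanity check.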
“…The PSI has been shown to be stable with respect to almost or fully rank-deficient wavefunction parametrizations in its MCTDH 36 and ML-MCTDH formulations, 31,33,34,39 including the important subcategory of matrix product states (maximally layered trees with physical degrees of freedom at each layer). 32,40,41 The stability of the PSI with respect to nearly rank-deficient wavefunction parametrizations can be attributed to the fact that the EOMs solved are always well-conditioned.…”
Section: E. Discussion of the PSI (mentioning)
confidence: 99%
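The substructure of the projector-splitting integrator (PSI) can be illustrated by its matrix-case prototype, the KSL scheme. Below is a minimal NumPy sketch, assuming a generic right-hand side F; each substep ODE is advanced by a single explicit Euler step, whereas a practical implementation would solve the substeps accurately. Note the backward-in-time S-substep, which is characteristic of this splitting:

```python
import numpy as np

def ksl_step(U0, S0, V0, F, h):
    """One step of the first-order matrix projector-splitting (KSL)
    integrator for dY/dt = F(Y) at rank r, with Y = U S V^T.
    Sketch only: each substep uses a single explicit Euler step."""
    # K-step: evolve K = U S with V frozen, then re-orthonormalize.
    K = U0 @ S0
    K = K + h * F(K @ V0.T) @ V0
    U1, S_hat = np.linalg.qr(K)
    # S-step: evolve the core S backwards in time (note the minus
    # sign) with U1 and V0 frozen.
    S_tilde = S_hat - h * U1.T @ F(U1 @ S_hat @ V0.T) @ V0
    # L-step: evolve L = V S^T with U1 frozen, then re-orthonormalize.
    L = V0 @ S_tilde.T
    L = L + h * F(U1 @ L.T).T @ U1
    V1, S1T = np.linalg.qr(L)
    return U1, S1T.T, V1
```

The QR factorizations keep the bases orthonormal regardless of how small the singular values of S become, which is consistent with the well-conditioned substep equations the quoted passage refers to.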
“…A notable exception is TTNSs, for which a DMRG-like optimization strategy has been devised. 56 Moreover, the recently developed time-dependent TTNS variant 66,67 has paved the route towards the optimization of TTNSs with imaginary-time propagation algorithms. Although these developments could set the ground for new efficient tensor-based methods in the near future, DMRG currently provides the optimal balance between the complexity of the tensor factorization and of its optimization.…”
Section: Tensor Networks for Quantum States (mentioning)
confidence: 99%