2010
DOI: 10.1137/09076578x
Dynamical Tensor Approximation

Abstract: For the approximation of time-dependent data tensors and of solutions to tensor differential equations by tensors of low Tucker rank, we study a computational approach that can be viewed as a continuous-time updating procedure. This approach works with the increments rather than the full tensor and avoids the computation of decompositions of large matrices. In this method, the derivative is projected onto the tangent space of the manifold of tensors of Tucker rank (r_1, ..., r_N) at the current …
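The tangent-space projection described in the abstract can be sketched most simply in the matrix case (N = 2); the Tucker-tensor case applies the same idea mode by mode. The following is a minimal illustration (not code from the paper), assuming factor matrices U and V with orthonormal columns:

```python
import numpy as np

# Orthogonal projection of a matrix Z onto the tangent space of the
# manifold of rank-r matrices at Y = U @ S @ V.T.
# U (m x r) and V (n x r) are assumed to have orthonormal columns.

def tangent_projection(Z, U, V):
    """P(Z) = Z V V^T - U U^T Z V V^T + U U^T Z."""
    ZV = Z @ V
    return ZV @ V.T - U @ (U.T @ ZV) @ V.T + U @ (U.T @ Z)

rng = np.random.default_rng(0)
m, n, r = 8, 6, 3
U, _ = np.linalg.qr(rng.standard_normal((m, r)))
V, _ = np.linalg.qr(rng.standard_normal((n, r)))
Z = rng.standard_normal((m, n))

PZ = tangent_projection(Z, U, V)
# An orthogonal projection is idempotent: P(P(Z)) = P(Z).
assert np.allclose(tangent_projection(PZ, U, V), PZ)
```

Projecting the derivative of the data (or of the vector field) through this map is what lets the method work with increments instead of the full tensor.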

Cited by 119 publications (163 citation statements). References 15 publications.
“…Differential equations for the factors of a low-rank factorization similar to the singular value decomposition were derived and their approximation properties were studied. Extensions to time-dependent tensors in various tensor formats were given in [2,12,18,19]; see also [15] for a review of dynamical low-rank approximation. The approach yields differential equations on low-rank matrix and tensor manifolds, which need to be solved numerically. Recently, very efficient integrators based on splitting the projection onto the tangent space of the low-rank manifold have been proposed and studied for matrices and for tensors in the tensor-train format in [16] and [17], respectively.…”
Classification: mentioning (confidence 99%)
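The projector-splitting integrators mentioned in this statement can be sketched for the matrix case as a single first-order (KSL) step: the low-rank factors are updated one at a time, with the core factor integrated backward between the two orthogonal-factor updates. This is an illustrative reconstruction assuming a rank-r approximation Y = U S V^T of time-dependent data with increment dA = A(t1) - A(t0), not the cited papers' code:

```python
import numpy as np

def ksl_step(U, S, V, dA):
    """One first-order projector-splitting step (matrix case, sketch)."""
    # K-step: update the range while V is frozen.
    K = U @ S + dA @ V
    U1, S_hat = np.linalg.qr(K)
    # S-step: the core is integrated backward in time.
    S_tilde = S_hat - U1.T @ dA @ V
    # L-step: update the co-range with the new U.
    L = V @ S_tilde.T + dA.T @ U1
    V1, S1t = np.linalg.qr(L)
    return U1, S1t.T, V1

# If A(t) has rank at most r throughout the step, the splitting step
# reproduces A(t1) exactly (up to roundoff), for generic data.
rng = np.random.default_rng(2)
W = rng.standard_normal((8, 3))
Z0 = rng.standard_normal((6, 3))
Z1 = rng.standard_normal((6, 3))
A0, A1 = W @ Z0.T, W @ Z1.T          # both exactly rank 3

Uf, s, Vh = np.linalg.svd(A0, full_matrices=False)
U, S, V = Uf[:, :3], np.diag(s[:3]), Vh[:3].T

U1, S1, V1 = ksl_step(U, S, V, A1 - A0)
assert np.allclose(U1 @ S1 @ V1.T, A1)
```

The backward S-step looks unusual but is what makes the splitting exact on the low-rank manifold and robust to small singular values.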
“…Differential equations for the factors of a low-rank factorization similar to the singular value decomposition were derived and their approximation properties were studied. Extensions to time-dependent tensors in various tensor formats were given in [2,12,18,19]; see also [15] for a review of dynamical low-rank approximation.…”
Classification: mentioning (confidence 99%)
“…There are two straightforward ways of solving (3.116). The original approach of Koch and Lubich [2007] (later generalized to the Tucker and TT models [Koch and Lubich, 2010]) is to write down ordinary differential equations for the parameters U(t), S(t), V(t) of the SVD-like decomposition in the form…”
Section: Dynamical Low-Rank Matrix/Tensor Approximation
Classification: mentioning (confidence 99%)
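The factor ODEs referred to in this statement can be written down explicitly in the matrix case. Under the gauge conditions U^T dU/dt = 0 and V^T dV/dt = 0, the equations for Y(t) = U(t) S(t) V(t)^T read dU/dt = (I - U U^T) A'(t) V S^{-1}, dS/dt = U^T A'(t) V, and dV/dt = (I - V V^T) A'(t)^T U S^{-T}. A minimal sketch of one explicit Euler step on these equations (illustrative, not the cited papers' code; `dA` stands for the derivative A'(t)):

```python
import numpy as np

def euler_step(U, S, V, dA, h):
    """One explicit Euler step of the low-rank factor ODEs (sketch)."""
    Sinv = np.linalg.inv(S)
    dU = (dA - U @ (U.T @ dA)) @ V @ Sinv       # (I - U U^T) dA V S^-1
    dS = U.T @ dA @ V                           # U^T dA V
    dV = (dA.T - V @ (V.T @ dA.T)) @ U @ Sinv.T # (I - V V^T) dA^T U S^-T
    return U + h * dU, S + h * dS, V + h * dV

rng = np.random.default_rng(1)
m, n, r = 8, 6, 3
U, _ = np.linalg.qr(rng.standard_normal((m, r)))
V, _ = np.linalg.qr(rng.standard_normal((n, r)))
S = np.diag([3.0, 2.0, 1.0])
dA = rng.standard_normal((m, n))

U1, S1, V1 = euler_step(U, S, V, dA, h=1e-3)
# The gauge conditions keep the factors orthonormal up to O(h^2):
assert np.allclose(U1.T @ U1, np.eye(r), atol=1e-4)
assert np.allclose(V1.T @ V1, np.eye(r), atol=1e-4)
```

Note the explicit inverse of S: these ODEs become stiff when S has small singular values, which is one motivation for the projector-splitting integrators discussed above.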
“…[24,25]. In this approach, the solution of an evolution equation is represented by a sum of rank-1 tensor products which are propagated along with the solution.…”
Section: Half Step in U(t, x)
Classification: mentioning (confidence 99%)
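The "sum of rank-1 tensor products" representation in this statement can be illustrated for a two-dimensional field: u(x, y) = sum_i a_i(x) b_i(y), stored as factor matrices rather than as the full grid. The functions and rank below are hypothetical choices for illustration only:

```python
import numpy as np

# A rank-2 separable representation u(x, y) = a1(x) b1(y) + a2(x) b2(y),
# stored as two factor matrices instead of the full 50 x 40 grid.
x = np.linspace(0.0, 1.0, 50)
y = np.linspace(0.0, 1.0, 40)

A = np.stack([np.sin(np.pi * x), np.cos(np.pi * x)], axis=1)  # 50 x 2
B = np.stack([np.exp(-y), y**2], axis=1)                      # 40 x 2

U = A @ B.T  # full field assembled from the factors
assert np.linalg.matrix_rank(U) <= 2
```

In the dynamical approach, it is these factors, not the full grid values, that are evolved in time.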