2020
DOI: 10.1016/j.jcp.2019.109125
Dynamically orthogonal tensor methods for high-dimensional nonlinear PDEs

Abstract: We develop new dynamically orthogonal tensor methods to approximate multivariate functions and the solution of high-dimensional time-dependent nonlinear partial differential equations (PDEs). The key idea relies on a hierarchical decomposition of the approximation space obtained by splitting the independent variables of the problem into disjoint subsets. This process, which can be conveniently visualized in terms of binary trees, yields series expansions analogous to the classical Tensor-Train and Hierarchi…


Cited by 31 publications (43 citation statements)
References 80 publications
“…Such approximations allow us to significantly reduce the number of degrees of freedom in the representation of the solution tensor u(t), while maintaining accuracy. Low-rank tensor approximations of (2) can be constructed by using, e.g., rank-constrained temporal integration [34,28,13] on a (smooth) tensor manifold with constant rank [48]. Alternatively, one can utilize the fully discrete scheme (3) followed by a rank-reduction operation.…”
Section: Introduction
confidence: 99%
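The alternative mentioned in the excerpt — a fully discrete time step followed by a rank-reduction operation — can be sketched in a few lines. This is a minimal illustration, not the paper's method: the operator (a 1D finite-difference Laplacian driving a 2D heat-type equation), the explicit Euler step, the step size, and the rank are all illustrative assumptions.

```python
# Hedged sketch: fully discrete time stepping with rank reduction after each
# step. The equation du/dt = A u + u A^T, the operator A, dt, and the rank r
# are illustrative choices, not taken from the paper.
import numpy as np

def truncate(u_full, r):
    """Rank-reduction via truncated SVD: keep the r largest singular values."""
    P, s, Qt = np.linalg.svd(u_full, full_matrices=False)
    return (P[:, :r] * s[:r]) @ Qt[:r, :]

n, r, dt = 64, 5, 1e-5
# Second-order finite-difference Laplacian on a unit-length grid (scaled by n^2).
A = (np.diag(-2.0 * np.ones(n)) + np.diag(np.ones(n - 1), 1)
     + np.diag(np.ones(n - 1), -1)) * n**2

rng = np.random.default_rng(0)
u = truncate(rng.random((n, n)), r)        # low-rank initial condition
for _ in range(100):
    u = truncate(u + dt * (A @ u + u @ A.T), r)  # Euler step, then re-truncate

# The rank of the solution tensor stays bounded by r throughout.
```

Re-truncating after every step keeps the number of degrees of freedom at O(nr) instead of O(n^2), which is the point of the rank-reduction approach described above.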
“…We also provided sufficient conditions for consistency, stability and convergence of functional approximation schemes to compute the solution of FDEs, thus extending the well-known Lax-Richtmyer theorem from PDEs to FDEs. As we suggested in [69], these results open the possibility to utilize techniques for high-dimensional model representation such as deep neural networks [52,53,79] and numerical tensor methods [17,3,55,7,59,37] to represent nonlinear functionals and compute approximate solutions to functional differential equations. We conclude by emphasizing that the results we obtained in this paper can be extended to real- or complex-valued functionals in compact Banach spaces (see, e.g., [33,65]).…”
Section: Discussion
confidence: 81%
“…The main results are Theorem 7.1 and Theorem 8.3, which are based on the Trotter-Kato approximation theorem for abstract evolution equations in Banach spaces. The results presented in this paper open the possibility to utilize techniques for high-dimensional function representation such as deep neural networks [73,74,104] and numerical tensor methods [5,10,24-26,55,76,82] to approximate nonlinear functionals in terms of high-dimensional functions, and to compute approximate solutions of functional differential equations by solving high-dimensional PDEs.…”
Section: Discussion
confidence: 96%
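The idea of approximating a nonlinear functional by a high-dimensional function, as the excerpt describes, can be illustrated with a toy example. The functional F[u] = ∫ u(x)^2 dx below, the grid, and the quadrature rule are illustrative assumptions, not taken from the cited work.

```python
# Hedged illustration: a nonlinear functional F[u] = ∫_0^{2π} u(x)^2 dx
# approximated as a high-dimensional function of the n values of u on a
# periodic grid (rectangle-rule quadrature). All choices are illustrative.
import numpy as np

n = 256
x = np.linspace(0.0, 2.0 * np.pi, n, endpoint=False)

def F_discrete(u_vals):
    """A function of n grid values approximating the functional F[u]."""
    return (2.0 * np.pi / n) * np.sum(u_vals**2)

# For u(x) = sin(x), the exact value is ∫_0^{2π} sin^2(x) dx = π.
approx = F_discrete(np.sin(x))
```

In this discretized picture, computing the functional reduces to evaluating a function of n variables, which is where high-dimensional representations such as tensor methods or deep networks enter.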
“…The series (24) converges in the L^2_p([0, 2π]) sense, and also pointwise since η is continuous [50]. A substitution of (24) into (20) yields…”
Section: Differentials and Derivatives of Nonlinear Functionals
confidence: 99%