2020
DOI: 10.1016/j.jcp.2020.109326

Rank adaptive tensor recovery based model reduction for partial differential equations with high-dimensional random inputs

Abstract: This work proposes a systematic model reduction approach based on rank adaptive tensor recovery for partial differential equation (PDE) models with high-dimensional random parameters. Since the standard outputs of interest of these models are discrete solutions on given physical grids which are high-dimensional, we use kernel principal component analysis to construct stochastic collocation approximations in reduced dimensional spaces of the outputs. To address the issue of high-dimensional random inputs, we de…

Cited by 5 publications (3 citation statements) · References 38 publications (131 reference statements)
“…In this work, we focus on the low-rank tensor train representation to construct the random projection f. Tensor decompositions are widely used for data compression [5, 19–24]. The Tensor Train (TT) decomposition, also called Matrix Product States (MPS) [25–28], offers the following benefits: low-rank TT-formats can provide compact representations of projection matrices and efficient basic linear algebra operations such as matrix-by-vector products [29].…”
Section: Introduction
confidence: 99%
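The statement above refers to the TT decomposition's compact cores. As a minimal sketch of the standard TT-SVD construction (not the specific algorithm of the cited paper), the following NumPy code builds TT cores by sequential truncated SVDs; the function names `tt_svd` and `tt_reconstruct` are illustrative:

```python
import numpy as np

def tt_svd(tensor, tol=1e-10):
    """Decompose a d-way tensor into TT cores via sequential truncated SVDs.

    Each core k has shape (r_{k-1}, n_k, r_k); ranks are chosen by dropping
    singular values below tol relative to the largest one.
    """
    shape = tensor.shape
    d = len(shape)
    cores = []
    rank = 1
    mat = tensor.reshape(rank * shape[0], -1)
    for k in range(d - 1):
        U, S, Vt = np.linalg.svd(mat, full_matrices=False)
        r_new = max(1, int(np.sum(S > tol * S[0])))  # truncated TT rank
        cores.append(U[:, :r_new].reshape(rank, shape[k], r_new))
        # carry the remainder S*Vt forward, re-folded for the next mode
        mat = (S[:r_new, None] * Vt[:r_new]).reshape(r_new * shape[k + 1], -1)
        rank = r_new
    cores.append(mat.reshape(rank, shape[-1], 1))
    return cores

def tt_reconstruct(cores):
    """Contract TT cores back into the full tensor (for checking accuracy)."""
    full = cores[0]
    for core in cores[1:]:
        full = np.tensordot(full, core, axes=([-1], [0]))
    return full.squeeze(axis=(0, -1))
```

For a rank-1 tensor, all TT ranks collapse to 1, which is where the compression benefit mentioned in the quote comes from: storage drops from the product of mode sizes to a sum of small core sizes.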
“…Multiway data or multidimensional arrays, represented by tensors, are ubiquitous in real-world applications, such as recommendation systems [1, 2], computer vision [3], medical imaging [4] and uncertainty quantification [5].…”
Section: Introduction
confidence: 99%
“…However, there is a fundamental question: how can we determine the tensor rank and the associated model complexity? Because it is hard to determine a tensor rank exactly a priori [30], existing methods often use a tensor rank pre-specified by the user, or a greedy method that updates the tensor rank until convergence [24, 31, 32]. These methods often give inaccurate rank estimates and are computationally involved.…”
Section: Introduction
confidence: 99%
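The rank-determination question raised in this statement can be illustrated in the simplest (matrix) case: instead of fixing a rank a priori, one can pick the smallest rank whose truncated SVD meets a relative error tolerance. This is a generic sketch, not the rank-adaptive scheme of the paper under discussion; the function name `adaptive_rank` is hypothetical:

```python
import numpy as np

def adaptive_rank(matrix, eps=1e-3):
    """Return the truncated SVD (U, S, Vt) at the smallest rank whose
    Frobenius truncation error stays within eps * ||matrix||_F."""
    U, S, Vt = np.linalg.svd(matrix, full_matrices=False)
    total = np.linalg.norm(S)  # Frobenius norm equals the l2 norm of S
    err_sq = 0.0
    r = len(S)
    # drop trailing singular values while the accumulated error is tolerable
    for s in S[::-1]:
        if np.sqrt(err_sq + s**2) > eps * total:
            break
        err_sq += s**2
        r -= 1
    r = max(r, 1)
    return U[:, :r], S[:r], Vt[:r]
```

The tolerance eps plays the role of the user-chosen rank in fixed-rank methods, but it adapts the rank to the actual singular-value decay of the data rather than guessing the model complexity in advance.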