2022
DOI: 10.48550/arxiv.2205.02304
Preprint
Operator inference for non-intrusive model reduction with quadratic manifolds

Abstract: This paper proposes a novel approach for learning a data-driven quadratic manifold from high-dimensional data and then employing the quadratic manifold to derive efficient physics-based reduced-order models. The key ingredient of the approach is a polynomial mapping between high-dimensional states and a low-dimensional embedding. This mapping comprises two parts: a representation in a linear subspace (computed in this work using the proper orthogonal decomposition) and a quadratic component. The approach can be vi…
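The two-part mapping described in the abstract — a linear POD subspace plus a learned quadratic correction — can be sketched as follows. This is a minimal illustration under assumptions: the variable names (V, W, Z) and the least-squares fit of the quadratic term are ours, not the authors' operator-inference implementation, and random data stands in for simulation snapshots.

```python
import numpy as np

rng = np.random.default_rng(0)
S = rng.standard_normal((100, 50))    # snapshot matrix: 100-dim states, 50 snapshots

# 1) Linear part: POD basis from the leading left singular vectors.
U, _, _ = np.linalg.svd(S, full_matrices=False)
r = 5
V = U[:, :r]                          # n x r POD basis

# 2) Reduced coordinates of each snapshot in the linear subspace.
Z = V.T @ S                           # r x 50

# 3) Quadratic component: fit W so that S ≈ V Z + W q(Z), where q collects
#    the unique quadratic monomials z_i * z_j with i <= j.
def quad_features(Z):
    r = Z.shape[0]
    pairs = [(i, j) for i in range(r) for j in range(i, r)]
    return np.array([Z[i] * Z[j] for i, j in pairs])  # r(r+1)/2 x 50

Q = quad_features(Z)
residual = S - V @ Z                  # part of the data the linear subspace misses
W = residual @ np.linalg.pinv(Q)      # least-squares fit of the quadratic map

# Decode: quadratic-manifold reconstruction of the snapshots.
S_approx = V @ Z + W @ Q
```

By construction the quadratic term can only reduce the reconstruction error relative to the linear POD projection alone, since W is the least-squares minimizer of the leftover residual.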

Cited by 4 publications (7 citation statements)
References 42 publications
“…We remind that (11) is well posed (see [10] for the existence and the uniqueness of solutions to problem (11)) and we refer to the notations of [10].…”
Section: The Continuous Problem
confidence: 99%
“…The semi-discrete form of the variational problem (11) writes for the fine mesh (similarly for the coarse mesh):…”
Section: The Various Discretizations
confidence: 99%
“…For arbitrary inputs, which are out of the distribution (OOD), further training is required but this may be relatively light if the input space is sampled sufficiently. We note here that a neural operator is very different from a reduced order model (ROM) that is restricted to a very small subset of conditions and lacks generalization properties due to the under-parameterization in such methods [10,11]. Another important property of DeepONet is that it is based on a universal approximation theorem for operators [12,9], and more recent theoretical work [13] has shown that DeepONet can break the curse of dimensionality in the input space, unlike approaches based on ROM for parameterized PDEs [14].…”
Section: Introduction
confidence: 99%