2017
DOI: 10.1051/m2an/2016031
Hierarchical model reduction of nonlinear partial differential equations based on the adaptive empirical projection method and reduced basis techniques

Abstract: In this paper we extend the hierarchical model reduction framework based on reduced basis techniques recently introduced in [46] for the application to nonlinear partial differential equations. The major new ingredient to accomplish this goal is the introduction of the adaptive empirical projection method, which is an adaptive integration algorithm based on the (generalized) empirical interpolation method [4, 40]. Different from other partitioning concepts for the empirical interpolation method, we perform an ad…
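The adaptive empirical projection method introduced in the paper builds on the (generalized) empirical interpolation method [4, 40]. As background only, the following is a minimal NumPy sketch of the standard EIM greedy (basis functions and "magic" interpolation points selected from a snapshot set); the function g, the grid, and the tolerances are illustrative assumptions, and the adaptive partitioning of the paper is not reproduced here.

```python
import numpy as np

# Toy parametrized function; the paper's nonlinear PDE setting is not reproduced,
# g is only a stand-in so the greedy has something to interpolate.
def g(x, mu):
    return 1.0 / np.sqrt((x - mu) ** 2 + 0.01)

x = np.linspace(0.0, 1.0, 200)                     # spatial grid
mus = np.linspace(-1.0, 1.0, 101)                  # training parameters
snapshots = np.array([g(x, mu) for mu in mus])     # shape (n_mu, n_x)

tol, max_basis = 1e-6, 30
i0 = int(np.argmax(np.abs(snapshots).max(axis=1)))
p0 = int(np.argmax(np.abs(snapshots[i0])))
basis = [snapshots[i0] / snapshots[i0, p0]]        # EIM basis functions
points = [p0]                                      # "magic" interpolation points

for m in range(1, max_basis):
    Q = np.array(basis)                            # (m, n_x)
    B = Q[:, points].T                             # interpolation matrix, (m, m)
    coeffs = np.linalg.solve(B, snapshots[:, points].T)  # interpolate at the magic points
    residual = snapshots - coeffs.T @ Q
    errs = np.abs(residual).max(axis=1)            # sup-norm interpolation error per parameter
    k = int(np.argmax(errs))
    if errs[k] < tol:
        break
    p = int(np.argmax(np.abs(residual[k])))        # next magic point
    basis.append(residual[k] / residual[k, p])
    points.append(p)

print(f"EIM basis size: {len(basis)}, max training error at termination: {errs.max():.2e}")
```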

Cited by 12 publications (13 citation statements)
References 53 publications
“…From our experience in the numerical tests, setting different tolerances for the RB and EIM approximation, with tol_EI < tol_RB, proves to give the best approximation. A similar observation is also noted in Reference 50; however, no simultaneous enrichment is considered there. Still, it is not entirely clear how small the (D)EIM approximation tolerance has to be when compared to the RB tolerance.…”
Section: Adaptivity (supporting)
confidence: 84%
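The simultaneous RB/EIM enrichment discussed in this citing work is not reproduced here; the sketch below is only a plain POD/DEIM offline stage in NumPy that illustrates the recommended tolerance split tol_EI < tol_RB (here tol_EI = tol_RB / 10). The toy solution field, the nonlinearity, and the factor 10 are assumptions chosen for illustration.

```python
import numpy as np

def pod_basis(S, tol):
    """Keep the smallest number of left singular vectors such that the relative
    energy of the discarded singular values drops below tol."""
    U, s, _ = np.linalg.svd(S, full_matrices=False)
    tail = np.sqrt(np.cumsum(s[::-1] ** 2))[::-1]      # tail[r] = sqrt(sum_{i >= r} s_i^2)
    r = next(i for i in range(1, len(s) + 1) if i == len(s) or tail[i] / tail[0] < tol)
    return U[:, :r]

def deim_points(U):
    """Classic DEIM greedy selection of interpolation indices from the basis columns."""
    p = [int(np.argmax(np.abs(U[:, 0])))]
    for j in range(1, U.shape[1]):
        c = np.linalg.solve(U[p, :j], U[p, j])
        r = U[:, j] - U[:, :j] @ c
        p.append(int(np.argmax(np.abs(r))))
    return np.array(p)

# Toy parametrized field and nonlinearity (illustrative assumptions).
x = np.linspace(0.0, 1.0, 300)
mus = np.linspace(0.5, 3.0, 80)
U_snap = np.array([np.sin(mu * np.pi * x) * np.exp(-mu * x) for mu in mus]).T  # (n_x, n_mu)
F_snap = np.exp(U_snap) - 1.0            # nonlinear term evaluated on the snapshots

tol_RB = 1e-5
tol_EI = tol_RB / 10.0                   # tighter tolerance for the (D)EIM part
V = pod_basis(U_snap, tol_RB)            # reduced basis for the state
W = pod_basis(F_snap, tol_EI)            # collateral basis for the nonlinearity
pts = deim_points(W)

print(f"RB size: {V.shape[1]}, (D)EIM size: {W.shape[1]}, first points: {pts[:5]}")
```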
“…Previous works dealing with the simultaneous enrichment of RB and (D)EIM bases have noted the issue of (D)EIM plateaus. In fact, two different notions of plateauing have been observed and presented in References 5, 16, 48-50, and 51. In References 5, 50, and 51, the authors note that when the number of basis elements of the EIM approximation is fixed at some small value, an increase in the number of RB vectors does not result in an improvement in the overall error.…”
Section: Adaptivity (mentioning)
confidence: 99%
“…In practice, however, in order to obtain an accurate reduced-order model and corresponding small error estimates, it will be necessary to construct RB- and EIM-bases in such a way that the size of residuals and EIM-errors is balanced appropriately. We do not address this issue here and refer for instance to [19,60].…”
Section: Corollary 42 Under the Assumptions Of The Previous Theorem It Holds (mentioning)
confidence: 99%
“…Alternatively, to obtain a computationally more feasible offline stage, one might let the strong greedy run on a small test set with relatively high tolerance and use a hierarchical a posteriori error estimator on the large(r) training set, which was proposed in a slightly different context in [24]. Another idea might be to keep a second test training set during the greedy algorithm.…”
Section: Basis Generation (mentioning)
confidence: 99%
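The hierarchical a posteriori error estimator of [24] is not reproduced here; the sketch below only illustrates the general idea under simplifying assumptions: a strong greedy with exact errors runs on a small parameter set, and the difference between reduced solutions for two nested basis sizes serves as a cheap surrogate indicator on a larger training set. The toy affinely parametrized system and all tolerances are hypothetical.

```python
import numpy as np

# Toy affinely parametrized problem (an assumption for illustration):
# A(mu) u = f with A(mu) = A0 + mu * A1.
n = 200
e = np.ones(n)
A0 = np.diag(2 * e) - np.diag(e[:-1], 1) - np.diag(e[:-1], -1)   # diffusion-type stencil
A1 = np.diag(np.linspace(0.0, 1.0, n))                           # parametric reaction term
f = np.ones(n)

def solve_full(mu):
    return np.linalg.solve(A0 + mu * A1, f)

def solve_reduced(V, mu):
    Ar = V.T @ (A0 + mu * A1) @ V
    return V @ np.linalg.solve(Ar, V.T @ f)

# Strong greedy with exact (relative) errors, affordable because it only sees a
# small parameter set.
small_set = np.linspace(0.1, 10.0, 10)
tol = 1e-6
snap = {mu: solve_full(mu) for mu in small_set}
basis = [snap[small_set[0]]]
while True:
    V = np.linalg.qr(np.column_stack(basis))[0]
    errs = {mu: np.linalg.norm(snap[mu] - solve_reduced(V, mu)) / np.linalg.norm(snap[mu])
            for mu in small_set}
    mu_star, err = max(errs.items(), key=lambda kv: kv[1])
    if err < tol or len(basis) >= len(small_set):
        break
    basis.append(snap[mu_star])

# Cheap hierarchical indicator on a larger training set: compare reduced solutions
# for the nested bases of size N-1 and N instead of solving the full problem.
large_set = np.linspace(0.1, 10.0, 500)
V_N = np.linalg.qr(np.column_stack(basis))[0]
V_Nm1 = V_N[:, :-1]
indicator = max(np.linalg.norm(solve_reduced(V_N, mu) - solve_reduced(V_Nm1, mu))
                for mu in large_set)
print(f"RB size: {V_N.shape[1]}, max hierarchical indicator on the large set: {indicator:.2e}")
```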