2020
DOI: 10.1137/18m1214123
Multifidelity Dimension Reduction via Active Subspaces

Abstract: We propose a multifidelity dimension reduction method to identify a low-dimensional structure present in many engineering models. The structure of interest arises when functions vary primarily on a low-dimensional subspace of the high-dimensional input space, while varying little along the complementary directions. Our approach builds on the gradient-based methodology of active subspaces, and exploits models of different fidelities to reduce the cost of performing dimension reduction through the computation of…
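The gradient-based active-subspace methodology the abstract refers to can be sketched in a few lines: estimate the matrix C = E[∇f(x) ∇f(x)ᵀ] by Monte Carlo over gradient samples, then take the dominant eigenvectors of C as the directions along which f varies most. This is a minimal single-fidelity sketch of the standard technique, not the paper's multifidelity estimator; all function and variable names are illustrative.

```python
import numpy as np

def active_subspace(grad_f, sample_inputs, k):
    """Estimate a k-dimensional active subspace from gradient samples.

    grad_f: callable mapping x (shape (d,)) to the gradient of f at x
    sample_inputs: array of shape (n, d) of input samples
    k: target subspace dimension
    """
    grads = np.array([grad_f(x) for x in sample_inputs])   # (n, d)
    C = grads.T @ grads / len(sample_inputs)               # Monte Carlo estimate of E[grad grad^T]
    eigvals, eigvecs = np.linalg.eigh(C)                   # eigh returns ascending order
    order = np.argsort(eigvals)[::-1]                      # sort descending
    return eigvals[order], eigvecs[:, order[:k]]           # dominant eigenvectors span the subspace

# Toy example: f(x) = sin(w^T x) varies only along w, so the
# active subspace is span{w} and the estimate should recover it.
rng = np.random.default_rng(0)
d = 10
w = np.ones(d) / np.sqrt(d)
grad_f = lambda x: np.cos(w @ x) * w        # gradient of sin(w^T x)
X = rng.standard_normal((200, d))
vals, W1 = active_subspace(grad_f, X, k=1)
alignment = abs(W1[:, 0] @ w)               # close to 1: leading eigenvector aligns with w
```

Here C is rank one by construction, so the leading eigenvector matches w up to sign; in realistic models one inspects the eigenvalue decay of C to choose k.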

Cited by 35 publications (33 citation statements)
References 50 publications
“…We demonstrate the randomized error estimator for two test cases which are derived from the benchmark problem introduced in [36]. The source code is freely available in Matlab at the address 3 so that all numerical results presented in this section are entirely reproducible.…”
Section: Numerical Experiments (mentioning)
confidence: 99%
“…To obtain the sharpest bounds, both these improvements would require controlling the spectral norm of Σ̂_{X,Y_{X,·}} − Σ_µ rather than its Frobenius norm (which is bounded in Theorem 3.2 above). A detailed argument favoring the spectral norm in the context of active subspaces is also given in [38]. Controlling the spectral norm of the error, however, appears to be considerably more difficult.…”
Section: Remark 7 (Possible Improvements) (mentioning)
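The quoted remark's preference for spectral-norm control rests on a generic linear-algebra fact (not specific to the cited result): for any matrix A, ‖A‖₂ ≤ ‖A‖_F ≤ √rank(A)·‖A‖₂, so a bound stated in the spectral norm is at least as sharp as the same bound in the Frobenius norm. A quick numerical check of both inequalities:

```python
import numpy as np

# For any matrix A: ||A||_2 <= ||A||_F <= sqrt(rank(A)) * ||A||_2,
# so spectral-norm bounds are at least as sharp as Frobenius-norm bounds.
rng = np.random.default_rng(1)
A = rng.standard_normal((40, 40))           # full rank with probability 1
spec = np.linalg.norm(A, 2)                 # largest singular value
frob = np.linalg.norm(A, "fro")             # sqrt of sum of squared singular values
rank = np.linalg.matrix_rank(A)
```

For matrices whose singular values decay quickly (as in active-subspace covariance estimates), the gap between the two norms is small, which is why Frobenius-norm bounds are often proved first.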
“…This completes the proof of Theorem 3.1. Next, for N_{X,min,·} > 0 to be set later, recall the "good" event E₂ in (38), whereby every x ∈ X has at least N_{X,min,·} neighbors in Y_{X,·}. Conditioned on the event E₂, Σ̂_{X,Y_{X,·}} takes the simpler form of (42), using which we prove the following result in Appendix H.…”
(mentioning)