2021
DOI: 10.48550/arxiv.2107.10867
Preprint

A local approach to parameter space reduction for regression and classification tasks

Abstract: Frequently, the parameter space chosen for shape design or other applications that involve the definition of a surrogate model presents subdomains where the objective function of interest is highly regular or well behaved, so it could be approximated more accurately if restricted to those subdomains and studied separately. The drawback of this approach is the possible scarcity of data in some applications, but in those where a quantity of data, moderately abundant considering the parameter space dimension a…
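The subdomain idea sketched in the abstract — local surrogates can beat a single global one when the objective is well behaved only piecewise — can be illustrated with a toy numpy example. This is a minimal sketch of the general principle, not the paper's actual method (which uses local active subspaces); the piecewise objective, the sign-based split, and all names here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy objective: linear (hence very regular) on each half of the
# parameter space, but with a kink at 0 that a single global linear
# surrogate cannot capture.
X = rng.uniform(-1.0, 1.0, size=(200, 1))
y = np.where(X[:, 0] < 0.0, -2.0 * X[:, 0], 3.0 * X[:, 0])

def fit_linear(X, y):
    # Least-squares linear surrogate with intercept.
    A = np.hstack([X, np.ones((len(X), 1))])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coef

def predict(coef, X):
    A = np.hstack([X, np.ones((len(X), 1))])
    return A @ coef

# Global surrogate fitted on the whole parameter space.
global_coef = fit_linear(X, y)
global_mse = np.mean((predict(global_coef, X) - y) ** 2)

# Local surrogates: one per subdomain. A simple sign split stands in
# for a proper clustering/partitioning of the parameter space.
mask = X[:, 0] < 0.0
local_mse = 0.0
for m in (mask, ~mask):
    coef = fit_linear(X[m], y[m])
    # Weight each subdomain's error by its share of the samples.
    local_mse += np.mean((predict(coef, X[m]) - y[m]) ** 2) * m.mean()

print("global MSE:", global_mse)
print("local  MSE:", local_mse)
```

On this toy problem the local surrogates recover the objective almost exactly on each subdomain, while the global fit incurs a visibly larger error — the trade-off the abstract raises is that each subdomain sees fewer samples, which matters when data are scarce.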

Cited by 4 publications (6 citation statements) | References 29 publications (50 reference statements)
“…Future works will focus on improving the accuracy of constraints evaluations, for example with a multi-fidelity approximation of the scalar output and not only for the reconstruction of the entire field. 23 Another possibility is the exploitation of local information with local active subspaces 64 or nonlinear techniques, based on kernels 65 or level-sets, 66,67 to further improve the regression performance of the low-fidelity model. Other physical constraints can also be considered such as the position of the center of mass.…”
Section: Discussion
confidence: 99%
“…Further investigations could involve the use of different fidelities based on extensions of AS, such as kernel active subspaces [40], or local active subspaces [41]. This could also greatly improve datadriven non-intrusive reduced order methods [44,45,46] through modal coefficients reconstruction and prediction for parametric problems [52].…”
Section: Conclusion and Future Perspectives
confidence: 99%
“…Reduction in parameter space through AS has been proven successful in a diverse range of applications such as: shape optimization [29,15,12,10], car aerodynamics studies [33], hydrologic models [19], naval and nautical engineering [51,31], coupled with intrusive reduced order methods in cardiovascular studies [48], in CFD problems in a data-driven setting [11,50]. A kernel-based extension of AS for both scalar and vectorial functions can be found in [40], while for a new local approach to parameter space reduction see [41].…”
Section: Introduction
confidence: 99%
“…Successful applications of parameter space reduction with active subspaces can be found in many engineering fields: naval and nautical problems, 18 shape optimization, [19][20][21][22] car aerodynamics studies, 23 inverse problems, 24,25 cardiovascular studies coupled with intrusive model order reduction, 26 for the study of high-dimensional parametric PDEs, 27 and in CFD problems in a data-driven setting, 28,29 among others. New extensions of AS have also been developed in the recent years such as AS for multivariate vector-valued functions, 30 a kernel approach for AS for scalar and vectorial functions, 31 a localization extension for both regression and classification tasks, 32 and sequential learning of active subspaces. 33 The multi-fidelity setting has been used to find an active subspace given different fidelity models.…”
Section: Introduction
confidence: 99%