2015
DOI: 10.1134/s1064226915120037

Surrogate modeling of multifidelity data for large samples

Cited by 20 publications (16 citation statements)
References 9 publications
“…It performs for each training set a numerical optimization of the technique as well as its parameters [27,28] by minimizing the cross-validation error, see [25]. Among the algorithms scanned by pSeven are the following: ridge regression [29], stepwise regression [30], elastic net [31], Gaussian processes [32], sparse Gaussian processes [33,34], High Dimensional Approximation (HDA) [25,35], and High Dimensional Approximation combined with Gaussian processes (HDAGP); this last technique is related to artificial neural networks and, more specifically, to the two-layer perceptron with a non-linear activation function [35]. Two desirable features of pSeven are: (i) all data manipulation is done via a graphical user interface (GUI), and (ii) it can export the constructed surrogate model as a stand-alone function in a number of scientific computing languages and formats, including Matlab, C source for MEX, C source for a stand-alone program, C header for a library, C source for a library, functional mock-up interface (FMU) for Co-simulation 1.0, and an executable.…”
Section: Regression Model of U_S(R, τ, τ) for a Realistic Head Model
Mentioning confidence: 99%
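The selection loop described above can be illustrated with a minimal sketch: candidate surrogate techniques are fitted and the one with the smallest cross-validation error is kept. The scikit-learn estimators below merely stand in for the techniques scanned by pSeven (whose internal implementation and API are not reproduced here), and the toy data are purely illustrative.

```python
# Minimal sketch of technique selection by cross-validation error minimization.
# NOT pSeven's API: scikit-learn estimators stand in for the scanned techniques.
import numpy as np
from sklearn.linear_model import Ridge, ElasticNet
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(200, 3))               # toy training inputs
y = np.sin(3 * X[:, 0]) + 0.1 * rng.normal(size=200)    # toy responses

candidates = {
    "ridge": Ridge(alpha=1.0),
    "elastic_net": ElasticNet(alpha=0.01),
    "gaussian_process": GaussianProcessRegressor(
        kernel=RBF() + WhiteKernel(), normalize_y=True),
}

# Cross-validation RMSE for each candidate; keep the smallest.
cv_error = {
    name: -cross_val_score(model, X, y, cv=5,
                           scoring="neg_root_mean_squared_error").mean()
    for name, model in candidates.items()
}
best = min(cv_error, key=cv_error.get)
print(cv_error, "->", best)
```

The same principle extends to tuning each technique's own hyperparameters, as the quotation notes, by nesting a parameter search inside the cross-validation loop.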
“…Maximum likelihood estimation of a Gaussian process regression model sometimes provides degenerate results, a phenomenon closely connected to overfitting [65,68,48,51]. To regularize the problem and avoid inversion of large ill-conditioned matrices, one can impose a prior distribution on a Gaussian process regression model and then use Bayesian MAP (Maximum A Posteriori) estimates [20,23,11]. In particular, in this paper we adopted the approach described in [20]: we impose prior distributions on all parameters of the covariance function and additional hyperprior distributions on the parameters of the prior distributions.…”
Section: Gaussian Process Regression for Single-Fidelity Data
Mentioning confidence: 99%
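A compact sketch of the MAP idea mentioned in the quotation is given below: the Gaussian process log marginal likelihood is combined with a log-prior over the kernel parameters, and the sum is maximized. The squared-exponential kernel and the Gaussian priors on the log-parameters are illustrative assumptions, not the exact choices of the cited work.

```python
# Sketch of MAP estimation of GP hyperparameters: negative log marginal
# likelihood plus a negative log prior on the log-parameters is minimized.
# Kernel and priors are illustrative assumptions, not those of the cited paper.
import numpy as np
from scipy.linalg import cho_factor, cho_solve
from scipy.optimize import minimize

def rbf_kernel(X1, X2, length_scale, signal_var):
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return signal_var * np.exp(-0.5 * d2 / length_scale ** 2)

def neg_log_posterior(log_theta, X, y):
    length_scale, signal_var, noise_var = np.exp(log_theta)
    n = len(y)
    K = rbf_kernel(X, X, length_scale, signal_var) + (noise_var + 1e-8) * np.eye(n)
    chol, low = cho_factor(K, lower=True)
    alpha = cho_solve((chol, low), y)
    # Negative log marginal likelihood of the GP.
    nll = 0.5 * y @ alpha + np.log(np.diag(chol)).sum() + 0.5 * n * np.log(2 * np.pi)
    # Gaussian prior on the log-parameters (zero mean, std 2): the regularizing term.
    neg_log_prior = 0.5 * (log_theta ** 2 / 2.0 ** 2).sum()
    return nll + neg_log_prior

rng = np.random.default_rng(1)
X = rng.uniform(-2.0, 2.0, size=(40, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=40)

res = minimize(neg_log_posterior, x0=np.zeros(3), args=(X, y), method="L-BFGS-B")
print("MAP (length_scale, signal_var, noise_var):", np.exp(res.x))
```

Dropping the prior term recovers plain maximum likelihood, which is where the degenerate, overfitted solutions mentioned above can appear.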
“…A nice property of Gaussian process regression is its ability to treat variable-fidelity data [30,43,52,37,26,23]: one can construct a surrogate model of a high-fidelity function using data from both high- and low-fidelity sources (e.g., high-fidelity evaluations can be obtained using a computational code with a fine mesh, and low-fidelity evaluations can be obtained using the same computational code with a coarser mesh). Recent results provide theoretical analysis of the obtained models [67,69] and of the parameter estimates [24].…”
Section: Introduction
Mentioning confidence: 99%
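The variable-fidelity construction can be illustrated with a simple two-step scheme in the spirit of autoregressive cokriging, y_high(x) ≈ ρ·y_low(x) + δ(x): a GP trained on plentiful low-fidelity data is corrected by a GP fitted to the discrepancy at the few high-fidelity points. The toy functions below stand in for fine- and coarse-mesh runs of the same code; this is a generic sketch, not the exact estimator of the cited papers.

```python
# Two-level variable-fidelity surrogate (autoregressive-cokriging-style sketch):
# a GP on cheap low-fidelity data plus a GP on the high-fidelity discrepancy.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

def f_high(x):
    # stand-in for an expensive fine-mesh simulation
    return np.sin(8.0 * x) * x

def f_low(x):
    # stand-in for a cheap coarse-mesh simulation of the same quantity
    return 0.8 * f_high(x) + 0.3 * (x - 0.5)

rng = np.random.default_rng(2)
X_lo = rng.uniform(0.0, 1.0, size=(60, 1)); y_lo = f_low(X_lo[:, 0])
X_hi = rng.uniform(0.0, 1.0, size=(8, 1));  y_hi = f_high(X_hi[:, 0])

kernel = RBF(length_scale=0.2) + WhiteKernel(noise_level=1e-6)
gp_lo = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X_lo, y_lo)

# Scale factor rho by least squares, then a GP on the remaining discrepancy.
mu_lo_at_hi = gp_lo.predict(X_hi)
rho = (mu_lo_at_hi @ y_hi) / (mu_lo_at_hi @ mu_lo_at_hi)
gp_delta = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(
    X_hi, y_hi - rho * mu_lo_at_hi)

def predict_high(X_new):
    # high-fidelity prediction = scaled low-fidelity surrogate + discrepancy
    return rho * gp_lo.predict(X_new) + gp_delta.predict(X_new)

print(predict_high(np.linspace(0.0, 1.0, 5).reshape(-1, 1)))
```

The design choice here is the usual multifidelity trade-off: many cheap coarse-mesh evaluations fix the overall shape of the response, while a handful of expensive fine-mesh evaluations calibrate the scale factor and the discrepancy term.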
“…Yet, GPs have found success in various applications. They are in use in a wide range of fields and use cases, ranging from surrogate modeling, experiment design, and mining and geospatial data to battery health [3], [71], [87], [11], [33], [32], [8], [10], [5], [6], [9]. There has been little academic research on techniques for demand forecasting in the FMCG sector, and not enough effort to expose practitioners to them.…”
Section: Introduction
Mentioning confidence: 99%