2016
DOI: 10.1134/s1064226916060061

Regression on the basis of nonstationary Gaussian processes with Bayesian regularization

Cited by 29 publications (22 citation statements)
References 12 publications
“…The most popular choice for building surrogate models in the field of engineering design is based on taking Gaussian processes as a parametric family of priors, along with assuming white noise in the observations, ε_i ∼ N(0, σ²). By definition [11, 22], a Gaussian process is a collection of random variables, any finite number of which have a joint Gaussian distribution. Consequently, it is completely specified by its mean function m(x) = E[f(x)] and the covariance function…”
Section: Theorem
Mentioning confidence: 99%
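As context for the quoted definition, the following is a minimal sketch of Gaussian process regression under exactly that white-noise model, with a zero mean function m(x) = 0 and a squared-exponential covariance. The kernel choice, the length-scale `ell`, and the noise level `sigma` are illustrative assumptions, not details taken from the cited papers.

```python
# Minimal Gaussian process regression sketch (illustrative only).
# Assumes zero mean m(x) = 0 and squared-exponential covariance
# k(x, x') = exp(-(x - x')^2 / (2 * ell^2)); sigma and ell are
# arbitrary example values, not parameters from the cited work.
import numpy as np

def sq_exp_kernel(a, b, ell=1.0):
    """Squared-exponential covariance matrix between point sets a and b."""
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / ell) ** 2)

rng = np.random.default_rng(0)
x_train = np.linspace(0.0, 5.0, 20)
sigma = 0.1  # white-noise std: eps_i ~ N(0, sigma^2)
y_train = np.sin(x_train) + sigma * rng.standard_normal(x_train.size)
x_test = np.linspace(0.0, 5.0, 100)

# Posterior mean and covariance of f at x_test given noisy observations.
K = sq_exp_kernel(x_train, x_train) + sigma**2 * np.eye(x_train.size)
K_s = sq_exp_kernel(x_test, x_train)
post_mean = K_s @ np.linalg.solve(K, y_train)
post_cov = sq_exp_kernel(x_test, x_test) - K_s @ np.linalg.solve(K, K_s.T)
```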
“…Maximum likelihood estimation of a Gaussian process regression model sometimes provides degenerate results, a phenomenon closely connected to overfitting [65, 68, 48, 51]. To regularize the problem and avoid inversion of large ill-conditioned matrices, one can impose a prior distribution on the Gaussian process regression model and then use Bayesian MAP (Maximum A Posteriori) estimates [20, 23, 11]. In particular, in this paper we adopted the approach described in [20]: we impose prior distributions on all parameters of the covariance function and additional hyperprior distributions on the parameters of those prior distributions.…”
Section: Gaussian Process Regression For Single Fidelity Data
Mentioning confidence: 99%
“…To regularize the problem and avoid inversion of large ill-conditioned matrices, one can impose a prior distribution on the Gaussian process regression model and then use Bayesian MAP (Maximum A Posteriori) estimates [20, 23, 11]. In particular, in this paper we adopted the approach described in [20]: we impose prior distributions on all parameters of the covariance function and additional hyperprior distributions on the parameters of those prior distributions. Experiments confirm that such an approach allows one to avoid the ill-conditioned and degenerate cases that can occur even more often when processing variable-fidelity data.…”
Section: Gaussian Process Regression For Single Fidelity Data
Mentioning confidence: 99%
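To make the quoted regularization idea concrete, here is a hedged sketch of a MAP objective for the covariance parameters: the negative log marginal likelihood plus a negative log-prior penalty, with diagonal jitter to keep the kernel matrix well conditioned. The log-normal priors and all numeric values are assumptions for illustration; they are not the specific priors or hyperpriors used in [20].

```python
# Sketch of Bayesian MAP estimation of GP covariance parameters.
# The log-normal priors below stand in for the priors/hyperpriors of the
# cited approach, whose exact form is not given in the quoted text.
import numpy as np
from scipy.optimize import minimize

def neg_log_posterior(theta, x, y):
    log_ell, log_sigma = theta
    ell, sigma = np.exp(log_ell), np.exp(log_sigma)
    d = x[:, None] - x[None, :]
    # Small jitter on the diagonal keeps K well conditioned.
    K = np.exp(-0.5 * (d / ell) ** 2) + (sigma**2 + 1e-8) * np.eye(x.size)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    # Negative log marginal likelihood of the data under the GP.
    nll = 0.5 * (y @ alpha) + np.log(np.diag(L)).sum() + 0.5 * x.size * np.log(2 * np.pi)
    # Standard-normal priors on log(ell), log(sigma) act as the regularizer (assumed).
    neg_log_prior = 0.5 * (log_ell**2 + log_sigma**2)
    return nll + neg_log_prior

rng = np.random.default_rng(1)
x = np.linspace(0.0, 5.0, 30)
y = np.sin(x) + 0.1 * rng.standard_normal(x.size)
res = minimize(neg_log_posterior, x0=np.zeros(2), args=(x, y), method="L-BFGS-B")
ell_map, sigma_map = np.exp(res.x)
```

Maximizing the posterior instead of the bare likelihood is what keeps the length-scale and noise estimates away from the degenerate, overfitted regimes the quoted passages describe.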
“…Yet, GPs have found success in various applications. They are used across a wide range of fields and use cases, from surrogate modeling, experiment design, and mining and geo-spatial data to battery health [3], [71], [87], [11], [33], [32], [8], [10], [5], [6], [9]. There has been little academic research on techniques for demand forecasting in the FMCG sector, and not enough effort to expose practitioners to them.…”
Section: Introduction
Mentioning confidence: 99%