2014
DOI: 10.1007/s00158-013-1029-z
Metamodel-assisted optimization based on multiple kernel regression for mixed variables

Cited by 48 publications (22 citation statements). References: 18 publications.
“…This overview presents important stepping stones [76] and interesting applications in the field, thus showcasing the development. For a more extensive table we refer to the tabular overview in the supplemental material of this article.…”
Section: Strategies for Dealing with Discrete Structures (mentioning)
confidence: 99%
“…Most data-driven optimization relies on surrogate models to assist the optimizer to guide the search [11]–[13]. Many machine learning methods can be employed to construct surrogates, including artificial neural networks (ANNs) [14], [15], polynomial regression (PR) [16], support vector machines (SVMs) [17], radial basis function (RBF) networks [18]–[20], and Gaussian Processes (GPs). GPs are also known as Kriging or design and analysis of computer experiment models [21]–[23].…”
Section: Introduction (mentioning)
confidence: 99%
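The excerpt above enumerates model families commonly used as surrogates. As a purely illustrative sketch (not drawn from the cited paper), the snippet below fits a Gaussian process (Kriging-type) surrogate to a small set of expensive evaluations with scikit-learn; the toy objective, sample budget, and kernel choice are all assumptions.

```python
# Illustrative sketch: fitting a Gaussian process (Kriging-type) surrogate
# to a small set of expensive evaluations. The objective function and
# sampling budget are placeholders, not taken from the cited work.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

def expensive_objective(x):
    # Stand-in for a costly simulation (e.g., a finite-element run).
    return np.sum(x**2 - 10.0 * np.cos(2.0 * np.pi * x), axis=1)

rng = np.random.default_rng(0)
X_train = rng.uniform(-5.0, 5.0, size=(30, 2))   # small design of experiments
y_train = expensive_objective(X_train)           # exact (expensive) evaluations

# A Matern kernel is a common default for Kriging-style surrogates.
gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)
gp.fit(X_train, y_train)

# Cheap predictions (with uncertainty) stand in for further exact evaluations.
X_query = rng.uniform(-5.0, 5.0, size=(5, 2))
mean, std = gp.predict(X_query, return_std=True)
print(mean, std)
```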
“…In surrogate-assisted evolutionary algorithms, surrogate models are employed to replace in part the time-consuming exact function evaluations for saving computational cost, because the computational effort required to build and use surrogates is usually much lower than that for expensive fitness evaluations [17], [18]. The most commonly used surrogate models include polynomial regression (PR) [19], also known as response surface methodology [19], support vector machines (SVMs) [20], [21], [22], artificial neural networks (ANNs) [12], [23], [24], radial basis function (RBF) networks [25], [26], [27], [28], [29], and Gaussian Processes (GPs), also referred to as Kriging or design and analysis of computer experiment models [26], [30], [31], [32], [33], [34]. The surrogate-assisted metaheuristic algorithms reported in the literature can be largely classified into the following categories:…”
mentioning
confidence: 99%
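As context for the mechanism described in this excerpt (and not a reconstruction of any cited algorithm), the following sketch shows the basic bookkeeping of a surrogate-assisted loop: an archive of exactly evaluated designs is maintained, a radial-basis-function surrogate is refit each generation, and only a small fraction of candidates is sent to the expensive evaluator. The objective, population size, and budgets are illustrative assumptions.

```python
# Minimal sketch of a surrogate-assisted evolutionary loop (illustrative only;
# population size, budgets, and the toy objective are assumptions, not the
# cited authors' settings).
import numpy as np
from scipy.interpolate import RBFInterpolator

def expensive_fitness(x):
    # Placeholder for a costly simulation.
    return float(np.sum(x**2))

rng = np.random.default_rng(1)
dim, pop_size, n_generations, exact_per_gen = 2, 20, 10, 3

# Archive of exactly evaluated designs used to (re)train the surrogate.
archive_X = rng.uniform(-5, 5, size=(10, dim))
archive_y = np.array([expensive_fitness(x) for x in archive_X])

population = rng.uniform(-5, 5, size=(pop_size, dim))
for gen in range(n_generations):
    surrogate = RBFInterpolator(archive_X, archive_y)   # refit each generation
    predicted = surrogate(population)                   # cheap approximations

    # Only the few most promising candidates receive the expensive evaluation.
    best_idx = np.argsort(predicted)[:exact_per_gen]
    exact_y = np.array([expensive_fitness(population[i]) for i in best_idx])
    archive_X = np.vstack([archive_X, population[best_idx]])
    archive_y = np.concatenate([archive_y, exact_y])

    # Simple variation step (stand-in for crossover/mutation or PSO updates).
    population = population[best_idx[0]] + rng.normal(0, 1.0, size=(pop_size, dim))

print("best exact value found:", archive_y.min())
```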
“…From the third iteration onward, the following procedure will be undertaken to evaluate the fitness of all particles (lines 5–22). The fitness of all particles in the current swarm will first be approximated by the RBF network and saved as f_RBF(x_i), i = 1, 2, …”
mentioning
confidence: 99%
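The quoted procedure approximates every particle's fitness with the trained RBF network before deciding which particles deserve an exact evaluation. Below is a minimal sketch of that prescreening idea; the swarm, the surrogate's training data, and the rule for re-evaluation are illustrative assumptions rather than the cited algorithm's actual settings.

```python
# Sketch of the prescreening step described in the excerpt: approximate all
# particles with an RBF surrogate (f_RBF), then spend exact evaluations only
# on the particle the surrogate ranks best. Purely illustrative.
import numpy as np
from scipy.interpolate import RBFInterpolator

def exact_fitness(x):
    # Placeholder for the expensive objective.
    return float(np.sum((x - 1.0)**2))

rng = np.random.default_rng(2)
dim, swarm_size = 2, 15

# Previously evaluated designs form the surrogate's training data.
X_seen = rng.uniform(-3, 3, size=(12, dim))
y_seen = np.array([exact_fitness(x) for x in X_seen])
rbf_model = RBFInterpolator(X_seen, y_seen)

# Current swarm positions.
swarm = rng.uniform(-3, 3, size=(swarm_size, dim))

# f_RBF(x_i): cheap surrogate estimates for every particle.
f_rbf = rbf_model(swarm)

# Re-evaluate only the surrogate's best-ranked particle exactly.
i_best = int(np.argmin(f_rbf))
f_exact_best = exact_fitness(swarm[i_best])
print(f"surrogate estimates: {np.round(f_rbf, 3)}")
print(f"particle {i_best} re-evaluated exactly: {f_exact_best:.3f}")
```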