2022
DOI: 10.1007/978-3-031-09677-8_26
Offline Data-Driven Evolutionary Optimization Algorithm Using K-Fold Cross-Validation

Mengzhen Wang, Yawen Shan, Fei Xu
Cited by 1 publication (2 citation statements) | References 16 publications
“…An essential measurement for evaluating DDEAs is approximation errors of the surrogate model, which directly determines the optimization accuracy to some degree [4]. Since an excellent surrogate model can greatly reduce the approximation error, the selection of a suitable surrogate model becomes an important part of constructing DDEAs [8]. In addition, how to make full use of the existing data to train the model is also an essential issue.…”
Section: Introduction
confidence: 99%
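The statement above stresses making full use of the limited offline data when training the surrogate, which is what the K-fold cross-validation in the paper's title addresses: every sample serves both as training and as validation data across the folds. The sketch below is a generic pure-NumPy illustration of K-fold error estimation for a surrogate; the linear surrogate, synthetic data, and all function names are illustrative assumptions, not the paper's actual algorithm.

```python
import numpy as np

def kfold_indices(n, k, seed=0):
    """Shuffle the n sample indices and split them into k roughly equal folds."""
    rng = np.random.default_rng(seed)
    return np.array_split(rng.permutation(n), k)

def kfold_cv_error(X, y, fit, predict, k=5):
    """Average held-out mean squared error of a surrogate over k folds."""
    folds = kfold_indices(len(X), k)
    errors = []
    for i in range(k):
        test = folds[i]
        train = np.concatenate([folds[j] for j in range(k) if j != i])
        model = fit(X[train], y[train])          # train on k-1 folds
        pred = predict(model, X[test])           # validate on the held-out fold
        errors.append(np.mean((pred - y[test]) ** 2))
    return float(np.mean(errors))

# Illustrative surrogate: linear least squares with a bias term.
def fit_linear(X, y):
    A = np.hstack([X, np.ones((len(X), 1))])     # append bias column
    w, *_ = np.linalg.lstsq(A, y, rcond=None)
    return w

def predict_linear(w, X):
    return np.hstack([X, np.ones((len(X), 1))]) @ w

# Synthetic "offline" data: exactly linear, so the CV error should be ~0.
rng = np.random.default_rng(1)
X = rng.uniform(-1, 1, size=(50, 3))
y = X @ np.array([2.0, -1.0, 0.5]) + 0.3
err = kfold_cv_error(X, y, fit_linear, predict_linear, k=5)
```

Comparing `err` across several candidate surrogates is one plausible way to pick the model with the lowest expected approximation error before running the evolutionary search, in the spirit of the selection problem the statement describes.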
“…Most machine learning models and neural networks can be used as surrogates, including radial basis functions [8], polynomial response surface methods [9], Kriging methods [10,11], support vector machines [12], artificial neural networks [13], and so forth. Of these, the radial basis function (RBF) has advantages comprising fast convergence speed and strong robustness, and the approximation accuracy of neural networks (NN) is better than that of most other machine learning models.…”
Section: Introduction
confidence: 99%
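Of the surrogate families listed above, the RBF model singled out in the statement is simple enough to sketch directly: fit a weighted sum of Gaussian basis functions centered on the training points by solving one linear system. The following minimal NumPy sketch assumes a Gaussian kernel, a hand-picked width `gamma`, a small ridge term for numerical stability, and synthetic 1-D data; none of these choices are taken from the cited papers.

```python
import numpy as np

def rbf_fit(X, y, gamma=2.0):
    """Fit a Gaussian RBF interpolant: solve (Phi + ridge*I) w = y,
    where Phi[i, j] = exp(-gamma * ||x_i - x_j||^2)."""
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)   # pairwise squared distances
    Phi = np.exp(-gamma * d2)
    w = np.linalg.solve(Phi + 1e-8 * np.eye(len(X)), y)   # small ridge for stability
    return X, w, gamma

def rbf_predict(model, Xq):
    """Evaluate the fitted interpolant at query points Xq."""
    Xc, w, gamma = model
    d2 = ((Xq[:, None, :] - Xc[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2) @ w

# Synthetic offline data: 10 samples of a smooth 1-D function.
X = np.linspace(-1, 1, 10).reshape(-1, 1)
y = np.sin(3 * X[:, 0])
model = rbf_fit(X, y)
pred = rbf_predict(model, X)   # near-interpolation at the training points
```

Because the model interpolates the offline samples almost exactly while staying cheap to evaluate, it can stand in for the expensive objective during the evolutionary search, which is the role the statement assigns to surrogates in DDEAs.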