2020 IEEE Congress on Evolutionary Computation (CEC)
DOI: 10.1109/cec48606.2020.9185781
Neural Networks for Surrogate-assisted Evolutionary Optimization of Chemical Processes

Cited by 3 publications
(3 citation statements)
References 23 publications
“…Generally speaking, fitness approximation mainly includes three procedures: data processing, surrogate model building, and model update and management. To approximate the fitness function and build surrogates accurately, various model-building methods can be used, including traditional interpolation methods, such as polynomial regression models [34,35], and machine learning techniques, such as the Kriging model [36-38], artificial neural networks [39,40], radial basis function neural networks [41-44], and random forests [45-47]. To better show the application scope of different surrogates, a comparison of some commonly used surrogates is presented in Table 1.…”
Section: Fitness Approximation (mentioning)
confidence: 99%
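The three procedures quoted above (data processing, surrogate building, and model update and management) can be sketched in a minimal surrogate-assisted evolutionary loop. This is an illustrative toy, not the paper's method: it assumes a cheap sphere function standing in for an expensive chemical-process simulation, a Gaussian radial-basis-function interpolant as the surrogate, and a simple best-candidate infill rule; all names and parameters are hypothetical.

```python
import math
import random

random.seed(0)

def expensive_fitness(x):
    # Stand-in for a costly black-box simulation (assumption: a sphere
    # function replaces the real chemical-process objective).
    return sum(xi * xi for xi in x)

def rbf_predict(archive, x, eps=1.0):
    # Gaussian RBF prediction: a weighted average of archived fitness
    # values, weighted by similarity to the query point x.
    num = den = 0.0
    for xi, fi in archive:
        d2 = sum((a - b) ** 2 for a, b in zip(x, xi))
        w = math.exp(-eps * d2)
        num += w * fi
        den += w
    return num / den if den > 0.0 else float("inf")

def surrogate_assisted_ea(dim=2, generations=30, pop=20, true_evals_per_gen=2):
    # 1) Data processing: build an initial archive of truly evaluated points.
    archive = []
    for _ in range(pop):
        x = [random.uniform(-5.0, 5.0) for _ in range(dim)]
        archive.append((x, expensive_fitness(x)))
    best = min(archive, key=lambda p: p[1])
    for _ in range(generations):
        # Generate offspring by Gaussian mutation around the current best.
        offspring = [[xi + random.gauss(0.0, 0.5) for xi in best[0]]
                     for _ in range(pop)]
        # 2) Surrogate model building: rank offspring by predicted fitness.
        offspring.sort(key=lambda x: rbf_predict(archive, x))
        # 3) Model update and management: spend scarce true evaluations
        #    only on the surrogate's top candidates, then grow the archive
        #    so the surrogate improves over time.
        for x in offspring[:true_evals_per_gen]:
            f = expensive_fitness(x)
            archive.append((x, f))
            if f < best[1]:
                best = (x, f)
    return best

best_x, best_f = surrogate_assisted_ea()
print(best_f)
```

The key design point mirrored from the quoted procedure is that the surrogate is only used to *screen* candidates; the archive of true evaluations is what keeps the model honest, which is the "model management" step.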
“…KTA2 [37] | Expensive, many-objective | Influential point-insensitive model with an adaptive infill criterion | 2021 | IEEE Transactions on Cybernetics
GCS-MOE [38] | Expensive, many-objective | Multi-task surrogate to approximate subproblems | 2019 | IEEE Transactions on Cybernetics
MTCNP [39] | Expensive, multi-task | Surrogate-assisted multi-task learning | 2020…”
Section: IEEE Transactions on Evolutionary Computation (mentioning)
confidence: 99%
“…Then in Sect. 4 we explain the machine learning (ML) methods that we used to speed up the optimization process. Afterwards, we investigate how these changes improve the performance of the optimization and show that the turnaround time between an engineer and the framework is significantly reduced.…”
Section: Introduction (mentioning)
confidence: 99%