The 2003 Congress on Evolutionary Computation, 2003. CEC '03.
DOI: 10.1109/cec.2003.1299639

Comparing neural networks and Kriging for fitness approximation in evolutionary optimization

Abstract: Neural networks and the Kriging method are compared for constructing fitness approximation models in evolutionary optimization algorithms. The two models are applied in an identical framework to the optimization of a number of well-known test functions. In addition, two different ways of training the approximators are evaluated: in one setting the models are built offline using data from previous optimization runs, and in the other setting the models are built online from the data available from the c…

Cited by 41 publications (10 citation statements)
References 10 publications
“…The incorporation of the fitness model leads to a higher probability of premature convergence in local minima. This problem of model-assisted approaches on multimodal problems is also observed by other authors [12,44].…”
Section: Comparative Studies (supporting, confidence: 57%)
“…Gaussian processes [13,42] and Kriging [14,29] are statistical modeling techniques, which are also used for fitness function approximation. A comparison of neural networks and Kriging for fitness approximation in evolutionary optimization can be found in [44].…”
Section: Related Work (mentioning, confidence: 99%)
“…GPs have been equally popular in surrogate-assisted single [23], [24] and multi-objective evolutionary optimization [25]–[28]. However, most GP-assisted EAs have been tested only on low-dimensional problems (up to 10 decision variables) [15], mainly due to the fact that the computational cost of constructing the GP is O(N^3), where N is the number of training data [29].…”
Section: Introduction (mentioning, confidence: 99%)
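The O(N^3) cost quoted above comes from factorizing the N × N kernel matrix when the GP is fitted. A minimal sketch of that fitting step, in plain NumPy with a squared-exponential kernel (all names and parameter values here are illustrative assumptions, not taken from the cited papers):

```python
import numpy as np

def rbf_kernel(A, B, length_scale=1.0):
    """Squared-exponential kernel between row-vector sets A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / length_scale ** 2)

def gp_fit_predict(X, y, X_new, noise=1e-6):
    """Fit a zero-mean GP to (X, y) and return the posterior mean at X_new.

    The Cholesky factorization of the N x N kernel matrix is the O(N^3)
    step that limits GP surrogates to small training sets.
    """
    K = rbf_kernel(X, X) + noise * np.eye(len(X))   # jitter for stability
    L = np.linalg.cholesky(K)                       # O(N^3)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    return rbf_kernel(X_new, X) @ alpha

# Toy fitness function: 1-D sphere, f(x) = x^2
X = np.linspace(-2, 2, 20)[:, None]
y = (X ** 2).ravel()
mean = gp_fit_predict(X, y, np.array([[0.5]]))      # prediction near 0.25
```

Doubling N roughly multiplies the factorization cost by eight, which is why the quoted papers report GP-assisted EAs mostly on problems with few decision variables.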
“…Many machine learning models can be used as surrogates, such as linear, nonlinear or polynomial regression models [31], Kriging or Gaussian processes [32], [33], [34], [35], [36], [37], [38], support vector machines (SVMs) [39], radial basis function (RBF) networks [40], [41], [42], and many other neural networks [43], [44], [45], [46], [47]. Several ideas have been proposed for choosing individuals to be re-evaluated using the original objective functions, which is one key issue in surrogate management.…”
Section: B. Surrogate Models and Surrogate Management (mentioning, confidence: 99%)
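The surrogate-management issue mentioned in the last quotation (choosing which individuals to re-evaluate with the expensive true objective) can be sketched as a pre-selection loop: rank all offspring with the cheap surrogate, then spend true evaluations only on the most promising few. This is an illustrative toy with a nearest-neighbour surrogate and a sphere objective, assuming hypothetical names throughout; it is not the algorithm of any specific cited paper, where a Kriging or neural-network model would replace the surrogate.

```python
import random

def true_fitness(x):
    """Expensive objective (sphere function as a cheap stand-in)."""
    return sum(xi ** 2 for xi in x)

def surrogate(x, archive):
    """Toy surrogate: fitness of the nearest archived, truly evaluated point.
    (A Kriging or neural-network model would normally go here.)"""
    nearest = min(archive,
                  key=lambda p: sum((a - b) ** 2 for a, b in zip(p[0], x)))
    return nearest[1]

def evolve(dim=2, pop_size=20, n_true_evals=4, generations=30, seed=0):
    rng = random.Random(seed)
    pop = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(pop_size)]
    archive = [(x, true_fitness(x)) for x in pop]   # initial true evaluations
    for _ in range(generations):
        # Gaussian mutation of randomly chosen parents.
        offspring = [[xi + rng.gauss(0, 0.5) for xi in rng.choice(pop)]
                     for _ in range(pop_size)]
        # Rank all offspring with the cheap surrogate ...
        offspring.sort(key=lambda x: surrogate(x, archive))
        # ... but re-evaluate only the most promising few with the true function.
        for x in offspring[:n_true_evals]:
            archive.append((x, true_fitness(x)))
        # Elitist survivor selection on archived true fitness.
        archive.sort(key=lambda p: p[1])
        pop = [x for x, _ in archive[:pop_size]]
    archive.sort(key=lambda p: p[1])
    return archive[0]   # best truly evaluated individual

best_x, best_f = evolve()
```

Because the archive only grows and selection is elitist on true fitness, the best individual can never get worse with more generations; the risk the first quoted statement points at is that a poor surrogate ranking concentrates the true evaluations around a local minimum on multimodal problems.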