Proceedings of the 2002 Congress on Evolutionary Computation. CEC'02 (Cat. No.02TH8600)
DOI: 10.1109/cec.2002.1006261
Optimization of noisy fitness functions by means of genetic algorithms using history of search with test of estimation

Cited by 53 publications (25 citation statements)
References: 6 publications
“…In this study, noise is implemented as an additive normally distributed perturbation with zero mean. It is assumed that noise has a disruptive influence on the value of each individual in the objective space [3], [6], [19], [20], [33]; the functions of Table 1 are modified in the form of Eqn. (1) in order to include the influence of noise.…”
Section: B. Noisy Objective Functions (mentioning; confidence: 99%)
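The additive zero-mean Gaussian noise model described in the statement above can be sketched as follows. This is a minimal illustration only: the helper name `noisy_objective`, the sphere test function, and the value of `sigma` are assumptions for the example, not settings taken from the cited study.

```python
import numpy as np

def noisy_objective(f, x, sigma=0.1, rng=None):
    """Evaluate objective f at x with an additive zero-mean Gaussian
    perturbation of standard deviation sigma (illustrative noise model)."""
    rng = np.random.default_rng() if rng is None else rng
    return f(x) + rng.normal(loc=0.0, scale=sigma)

# Example: a noisy sphere function (sphere and sigma are arbitrary choices).
sphere = lambda x: float(np.sum(np.square(x)))
x = np.array([0.5, -0.3])
print(noisy_objective(sphere, x, sigma=0.2))
```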
“…According to Beyer [4], noise reduces an EA's convergence rate and causes convergence to a sub-optimal solution. Researchers have sought to reduce the detrimental effects of noise by means of appropriate population sizing [17], [30], fitness averaging and fitness estimation [6], [33], and specific selection mechanisms [32], [2]. There also exist more complex methods.…”
Section: Introduction (mentioning; confidence: 99%)
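A minimal sketch of the fitness-averaging countermeasure mentioned in the statement above: re-evaluate a candidate several times and average the samples, which reduces the variance of the estimate at the cost of extra evaluations. The function name `averaged_fitness`, the sample count, and the noisy sphere used for the demo are illustrative assumptions, not values from the cited works.

```python
import numpy as np

def averaged_fitness(noisy_f, x, n_samples=5):
    """Estimate the underlying fitness at x by re-sampling the noisy
    objective n_samples times and averaging; the estimator's variance
    shrinks roughly as sigma^2 / n_samples."""
    return float(np.mean([noisy_f(x) for _ in range(n_samples)]))

# Illustrative noisy sphere function (zero-mean Gaussian noise, sigma = 0.2).
rng = np.random.default_rng(0)
noisy_sphere = lambda x: float(np.sum(np.square(x))) + rng.normal(0.0, 0.2)

x = np.array([0.5, -0.3])
print(averaged_fitness(noisy_sphere, x, n_samples=20))
```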
“…It has also been concluded that thresholding requires a modified adaptation rule for the mutation rate. Kita and Sano [12] introduced a memory-based fitness evaluation GA (MFEGA). The main idea of MFEGA is to store the sampled fitness values in memory as a search history and then estimate the fitness values for points of interest using the history array.…”
Section: Introduction (mentioning; confidence: 99%)
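A rough sketch of the history-based estimation idea attributed to MFEGA above: keep every sampled point and its noisy fitness, and estimate the fitness of a query point from those stored samples. The inverse-squared-distance weighting used here is a simplified stand-in for the statistical model of Sano and Kita, not the paper's exact estimator, and the class and method names are hypothetical.

```python
import numpy as np

class FitnessHistory:
    """Keep sampled (point, noisy fitness) pairs as a search history and
    estimate the fitness of a query point from that history."""

    def __init__(self):
        self.points = []   # sampled search points
        self.values = []   # corresponding noisy fitness samples

    def record(self, x, fx):
        self.points.append(np.asarray(x, dtype=float))
        self.values.append(float(fx))

    def estimate(self, x, eps=1e-12):
        # Weight stored samples by inverse squared distance so that nearby
        # history points dominate the estimate (simplified stand-in for
        # MFEGA's statistical model with test of estimation).
        x = np.asarray(x, dtype=float)
        d2 = np.array([np.sum((p - x) ** 2) for p in self.points])
        w = 1.0 / (d2 + eps)
        return float(np.dot(w, self.values) / np.sum(w))

# Minimal usage: record a few noisy evaluations, then query a new point.
rng = np.random.default_rng(1)
hist = FitnessHistory()
for _ in range(30):
    p = rng.uniform(-1.0, 1.0, size=2)
    hist.record(p, float(np.sum(p ** 2)) + rng.normal(0.0, 0.2))
print(hist.estimate(np.array([0.1, 0.1])))
```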
“…Takahashi et al. [58] use fitness estimation from a statistical model of the history of solutions to deal with noisy fitness functions; the example given is on a weight vector with discrete values. This builds on the earlier work with continuous functions in [42]. Finally, in [34] the author builds a surrogate from a Gibbs model which is derived from the distribution learnt by an EDA.…”
(mentioning; confidence: 99%)
“…A model may be used where no explicit fitness function exists, such as in evolutionary art and music [22]. Further, a fitness model may be employed to simplify the search by reducing noise [42, 58, 10] or by smoothing a multimodal landscape [63].…”
(mentioning; confidence: 99%)