2002
DOI: 10.1080/00207160210939

An Experimental Study of Benchmarking Functions for Genetic Algorithms

Abstract: This paper presents a review and experimental results on the major benchmarking functions used for performance control of Genetic Algorithms (GAs). Parameters considered include the effect of population size, crossover probability and pseudo-random number generators (PRNGs). The general computational behavior of two basic GA models, the Generational Replacement Model (GRM) and the Steady State Replacement Model (SSRM), is evaluated.
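The difference between the two replacement models is how many individuals are exchanged per iteration: GRM rebuilds the entire population from offspring every generation, while SSRM inserts one offspring at a time, typically displacing the worst member. The sketch below illustrates that contrast only; it is not the authors' code, and the tournament selection, one-point crossover, Gaussian mutation and sphere-style fitness are placeholder assumptions.

```python
import random

# Illustrative sketch only: the paper's actual encoding, operators and
# parameter settings are not reproduced here.

def fitness(ind):
    # Placeholder objective: negated sphere function (higher is better).
    return -sum(x * x for x in ind)

def make_individual(dim=10):
    return [random.uniform(-5.0, 5.0) for _ in range(dim)]

def crossover(a, b):
    # One-point crossover.
    point = random.randrange(1, len(a))
    return a[:point] + b[point:]

def mutate(ind, rate=0.05):
    # Gaussian perturbation applied gene-wise with probability `rate`.
    return [x + random.gauss(0, 0.1) if random.random() < rate else x for x in ind]

def select(pop):
    # Binary tournament selection.
    a, b = random.sample(pop, 2)
    return a if fitness(a) > fitness(b) else b

def generational_step(pop, p_cross=0.9):
    """GRM: the whole population is replaced by its offspring."""
    offspring = []
    while len(offspring) < len(pop):
        p1, p2 = select(pop), select(pop)
        child = crossover(p1, p2) if random.random() < p_cross else p1[:]
        offspring.append(mutate(child))
    return offspring

def steady_state_step(pop, p_cross=0.9):
    """SSRM: only the worst individual is replaced in each step."""
    p1, p2 = select(pop), select(pop)
    child = crossover(p1, p2) if random.random() < p_cross else p1[:]
    child = mutate(child)
    worst = min(range(len(pop)), key=lambda i: fitness(pop[i]))
    if fitness(child) > fitness(pop[worst]):
        pop[worst] = child
    return pop
```

Either step function can be iterated from a random initial population, e.g. `pop = [make_individual() for _ in range(50)]`, to compare how quickly the best fitness improves under each replacement scheme.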

Cited by 147 publications (33 citation statements)
References 9 publications (9 reference statements)
“…As in the Ackley function, the optima of the Griewank function are regularly distributed. Penalized functions are difficult due to the combinations of different periods of the sine function (Digalakis and Margaritis 2002; Boyer et al. 2005).…”
Section: Experiments 1: Large-scale Unconstrained Optimization (mentioning)
confidence: 98%
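The functions named in this excerpt have standard textbook forms; for reference, a minimal sketch of the Griewank and Ackley functions (the conventional definitions, not code from the cited papers):

```python
import math

def griewank(x):
    # Global minimum 0 at x = 0; its local optima are regularly distributed.
    s = sum(xi * xi for xi in x) / 4000.0
    p = math.prod(math.cos(xi / math.sqrt(i + 1)) for i, xi in enumerate(x))
    return 1.0 + s - p

def ackley(x, a=20.0, b=0.2, c=2.0 * math.pi):
    # Global minimum 0 at x = 0; a nearly flat outer region surrounds a central funnel.
    n = len(x)
    mean_sq = sum(xi * xi for xi in x) / n
    mean_cos = sum(math.cos(c * xi) for xi in x) / n
    return -a * math.exp(-b * math.sqrt(mean_sq)) - math.exp(mean_cos) + a + math.e
```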
“…Flat surfaces are obstacles for optimization algorithms which do not have variable step sizes, because they do not give any information as to which direction is favourable (Digalakis and Margaritis 2002). The surface of the Schwefel function is composed of a great number of peaks and valleys.…”
Section: Experiments 1: Large-scale Unconstrained Optimization (mentioning)
confidence: 99%
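The Schwefel function mentioned here also has a widely used standard form (again a reference sketch, not the cited implementation); its many peaks and valleys come from the x·sin(√|x|) term, with the global minimum near x_i ≈ 420.9687 on [-500, 500]^n:

```python
import math

def schwefel(x):
    # Standard form on [-500, 500]^n; the 418.9829 * n offset places the
    # global minimum (near x_i ≈ 420.9687) at approximately zero.
    n = len(x)
    return 418.9829 * n - sum(xi * math.sin(math.sqrt(abs(xi))) for xi in x)
```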
“…It is well known that heuristic algorithms based on computational or swarm intelligence determine the coefficients of optimization problems more successfully than analytical solution techniques. Therefore, the GA, which is a heuristic algorithm, is utilized to predict the coefficients of the PID controller and the SMC in this study [24,25].…”
Section: GA (mentioning)
confidence: 99%
“…It is a kind of stochastic optimization algorithm used to find the global or local optimum of a function [24]. This function may include more than one variable.…”
Section: Introduction (mentioning)
confidence: 99%
“…In real-world applications, the fast computation of accurate results is the ultimate aim. In recent times, a new optimization technique called Teaching-Learning-Based Optimization (TLBO) [1] has been gaining popularity [2-8] due to its ability to achieve better results with faster convergence than techniques such as Genetic Algorithms (GA) [9,10], Particle Swarm Optimization (PSO) [11-17] and Differential Evolution (DE) [18-20], as well as some of its variants such as DE with Time-Varying Scale Factor (DETVSF) and DE with Random Scale Factor (DERSF) [21]. The main reason TLBO is faster than the other contemporary evolutionary techniques is that it has no parameters to tune.…”
Section: Introduction (mentioning)
confidence: 99%