2016
DOI: 10.1145/2996355
A Systematic Literature Review of Adaptive Parameter Control Methods for Evolutionary Algorithms

Abstract: Evolutionary algorithms (EAs) are robust stochastic optimisers that perform well over a wide range of problems. Their robustness, however, may be affected by several adjustable parameters, such as mutation rate, crossover rate, and population size. Algorithm parameters are usually problem-specific, and often have to be tuned not only to the problem but even to the problem instance at hand to achieve ideal performance. In addition, research has shown that different parameter values may be optimal at different stages…
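
To make the abstract's premise concrete, here is a minimal sketch of one classical adaptive parameter control scheme, Rechenberg's 1/5 success rule, which steers the mutation step size of a (1+1) evolution strategy from feedback gathered during the run rather than fixing it up front. The objective function, the adaptation factor 0.85, and the observation window are illustrative assumptions, not values taken from the review.

```python
import random

def one_plus_one_es(f, x0, sigma=1.0, budget=10_000, window=50, c=0.85):
    """(1+1)-ES with Rechenberg's 1/5 success rule (illustrative sketch).

    The mutation step size sigma is the controlled parameter: it is
    increased when more than 1/5 of recent offspring improve on the
    parent, and decreased otherwise. Constants are common defaults,
    not prescriptions from the surveyed literature.
    """
    x, fx = x0[:], f(x0)
    successes = 0
    for t in range(1, budget + 1):
        # Gaussian mutation with the current step size.
        y = [xi + sigma * random.gauss(0.0, 1.0) for xi in x]
        fy = f(y)
        if fy < fx:                      # minimisation
            x, fx = y, fy
            successes += 1
        if t % window == 0:              # adapt sigma from the observed success rate
            rate = successes / window
            sigma = sigma / c if rate > 0.2 else sigma * c
            successes = 0
    return x, fx

# Usage on a toy sphere function (assumed test problem).
best, value = one_plus_one_es(lambda v: sum(vi * vi for vi in v), [5.0] * 10)
print(value)
```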

Cited by 117 publications (59 citation statements)
References 134 publications
“…The global search behavior of EAs, however, comes at the cost of a less focused search as the optimization converges, which can result in performance losses in the final parts of the optimization process. The question of how to most effectively combine the best of both worlds is the driving force behind research on parameter control [KHE15, AM16, EHM99], adaptive operator selection [MLS10, FCSS10], and hyper-heuristics [BGH+13], which are the most prominent umbrella terms for adjusting the structure of the search behavior to the current needs of an iterative optimization process.…”
Section: Introduction (mentioning)
confidence: 99%
“…Readers interested in empirical works on parameter control are referred to [KHE15] for an exhaustive survey. Additional pointers can be found in the systematic literature review [AM16], the book chapter [EMSS07] (and other book chapters in the same collection), and the seminal paper [EHM99].…”
(mentioning)
confidence: 99%
“…After training, the NN weights were saved and used for the testing (online) phase.¹ For testing, each DE-DDQN variant was independently run 25 times on each test problem, and each run was stopped when either the absolute error difference from the optimum was smaller than 10⁻⁸ or 10⁴ function evaluations were exhausted. The mean and standard deviation of the final error values achieved over the 25 runs are reported in Table 3.…”
Section: Training and Testing (mentioning)
confidence: 99%
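
The quoted protocol (25 independent runs per problem, a 10⁻⁸ error tolerance, and a 10⁴ function-evaluation budget) is straightforward to sketch. The random-search optimiser below is a placeholder assumed only for illustration; DE-DDQN itself is not reproduced here.

```python
import random
import statistics

def run_once(evaluate, optimum, max_evals=10_000, tol=1e-8):
    """One independent run with the dual stopping criterion quoted above:
    stop when the absolute error to the known optimum falls below tol,
    or when the evaluation budget is exhausted."""
    best_error = float("inf")
    for _ in range(max_evals):
        x = [random.uniform(-5.0, 5.0) for _ in range(10)]  # placeholder sampler
        best_error = min(best_error, abs(evaluate(x) - optimum))
        if best_error < tol:
            break
    return best_error

def test_protocol(evaluate, optimum, n_runs=25):
    """25 independent runs per test problem; final errors summarised
    by mean and standard deviation, as in the quoted Table 3."""
    errors = [run_once(evaluate, optimum) for _ in range(n_runs)]
    return statistics.mean(errors), statistics.stdev(errors)

# Example: sphere function with a known optimum value of 0 (assumed problem).
mean_err, std_err = test_protocol(lambda v: sum(vi * vi for vi in v), 0.0)
print(mean_err, std_err)
```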
“…There are multiple AOS methods proposed in the literature [1, 9, 12], and several of them are based on reinforcement learning (RL) techniques such as probability matching [8, 23], multi-armed bandits [9], Q(λ) learning [20], and SARSA [4, 5, 22], among others [10]. These RL methods use one or a few features to capture the state of the algorithm at each generation, select an operator to be applied, and calculate a reward from this application.…”
Section: Introduction (mentioning)
confidence: 99%
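
As one concrete instance of the select-apply-reward loop described in that statement, the sketch below implements probability matching, one of the RL-based AOS techniques it cites. The floor probability p_min, the learning rate, and the reward definition are illustrative assumptions, not parameters fixed by the cited works.

```python
import random

class ProbabilityMatchingAOS:
    """Probability-matching adaptive operator selection (illustrative sketch).

    Keeps a quality estimate per operator, updated from observed rewards,
    and selects operators with probability proportional to quality while
    guaranteeing every operator a floor probability p_min.
    """
    def __init__(self, n_ops, p_min=0.05, alpha=0.3):
        self.n_ops = n_ops
        self.p_min = p_min              # exploration floor
        self.alpha = alpha              # learning rate for quality updates
        self.quality = [1.0] * n_ops

    def select(self):
        total = sum(self.quality)
        probs = [self.p_min + (1 - self.n_ops * self.p_min) * q / total
                 for q in self.quality]
        return random.choices(range(self.n_ops), weights=probs)[0]

    def update(self, op, reward):
        # Exponential recency-weighted average of rewards; the small floor
        # keeps qualities positive so selection probabilities stay defined.
        self.quality[op] = max(0.01,
                               self.quality[op]
                               + self.alpha * (reward - self.quality[op]))

# Usage inside a generational loop. The reward definition is an assumption:
# here, the fitness improvement produced by applying the chosen operator.
aos = ProbabilityMatchingAOS(n_ops=4)
op = aos.select()
improvement = 0.7                       # e.g. parent_fitness - offspring_fitness
aos.update(op, improvement)
```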