2006
DOI: 10.1016/j.tcs.2006.04.004

How the (1+1) ES using isotropic mutations minimizes positive definite quadratic forms

Abstract: The (1+1) evolution strategy (ES), a simple, mutation-based evolutionary algorithm for continuous optimization problems, is analyzed. In particular, we consider the most common type of mutations, namely Gaussian mutations, and the 1/5-rule for mutation adaptation, and we are interested in how the runtime/number of function evaluations needed to obtain a predefined reduction of the approximation error depends on the dimension of the search space. The most discussed function in the area of ES is the so-called SPHERE function…
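The abstract describes a concrete and very small algorithm. The following Python sketch is a minimal illustration of a (1+1) ES with isotropic Gaussian mutations and 1/5-rule step-size adaptation on a positive definite quadratic form; the function name, the adaptation interval, and the update factors 1.22/0.82 are illustrative assumptions, not the paper's exact parameterization or analysis setting.

```python
# Minimal sketch of the (1+1) ES with isotropic Gaussian mutations and the
# 1/5-rule, minimizing f(x) = x^T A x with A positive definite.
# Constants and the adaptation interval are illustrative assumptions.
import numpy as np

def one_plus_one_es(A, x0, sigma0=1.0, budget=10_000, adapt_every=None, seed=None):
    rng = np.random.default_rng(seed)
    n = len(x0)
    adapt_every = adapt_every or n              # adapt roughly once per n evaluations
    f = lambda x: x @ A @ x                     # positive definite quadratic form
    x, sigma = np.asarray(x0, dtype=float), sigma0
    fx, successes = f(x), 0
    for t in range(1, budget + 1):
        y = x + sigma * rng.standard_normal(n)  # isotropic Gaussian mutation
        fy = f(y)
        if fy <= fx:                            # elitist (1+1) selection
            x, fx, successes = y, fy, successes + 1
        if t % adapt_every == 0:                # 1/5-rule: enlarge sigma if more than
            rate = successes / adapt_every      # 1/5 of recent mutations succeeded,
            sigma *= 1.22 if rate > 0.2 else 0.82  # otherwise shrink it
            successes = 0
    return x, fx

if __name__ == "__main__":
    n = 10
    A = np.diag(np.linspace(1.0, 10.0, n))      # quadratic form with condition number 10
    x, fx = one_plus_one_es(A, x0=np.ones(n), seed=0)
    print(f"final f(x) = {fx:.3e}")
```

In this picture, the paper's question is how many iterations of the loop above are needed to reduce the approximation error f(x) by a predefined factor, as a function of the dimension n.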

Cited by 66 publications (57 citation statements).
References 13 publications (31 reference statements).
“…The widely used optimizers are inspired by nature phenomena, which include genetic algorithm (GA) [25], evolution programming (EP) [26,27], evolution strategy (ES) [28,29], differential evolution (DE) [6,30], ant colony optimization (ACO) [31], particle swarm optimization (PSO) [32][33][34][35][36][37], bacterial foraging optimization (BFO) [38], simulated annealing (SA) [39], tabu search (TS) [40], harmony search (HS) [35,36,40], etc. These optimizers facilitated research into the optimization of the subproblems.…”
Section: Q3 (mentioning)
confidence: 99%
“…In this case, the stochastic convergence of EAs can be studied by analyzing the convergence of a random variable sequence (r.v.s.). The theoretical analyses of real-coded EAs are mainly studied in this way, as in [39,3,15,4,5,26,27]. Additionally, this method has been applied by He and Yao [28,29] to analyses of some discrete-coded EAs, and by Laummans et al [33] to successfully analyze the convergence properties of a discrete-coded MOEA.…”
Section: Definition Of Convergence (mentioning)
confidence: 99%
“…Beyer 2001; Auger 2005; Jägersküpper 2006, 2008. Let the optimization problem be defined by an objective function f : R^n → Y to be minimized, where n denotes the dimensionality of the search space (space of candidate solutions, decision space) and Y the space of cost values.…”
Section: Gaussian Mutations In Evolution Strategies (mentioning)
confidence: 99%
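For reference, the setting sketched in the excerpt above, specialized to the quadratic forms treated in the cited paper, can be written as follows; this is standard notation and an assumed reading, not a quote from the citing work.

```latex
% Sketch of the standard setting (assumed notation): minimize a positive
% definite quadratic form over R^n using isotropic Gaussian mutations of
% strength sigma.
\[
  \min_{x \in \mathbb{R}^n} f(x), \qquad f(x) = x^{\top} A x, \quad A \text{ symmetric positive definite},
\]
\[
  y = x + \sigma z, \qquad z \sim \mathcal{N}(0, I_n), \qquad \sigma > 0 .
\]
```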