Proceedings of the 8th Annual Conference on Genetic and Evolutionary Computation 2006
DOI: 10.1145/1143997.1144082

A computational efficient covariance matrix update and a (1+1)-CMA for evolution strategies

Abstract: First, the covariance matrix adaptation (CMA) with rank-one update is introduced into the (1+1)-evolution strategy. An improved implementation of the 1/5-th success rule is proposed for step size adaptation, which replaces cumulative path length control. Second, an incremental Cholesky update for the covariance matrix is developed, replacing the computationally demanding and numerically involved decomposition of the covariance matrix. The Cholesky update can replace the decomposition only for the update without ev…
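
For orientation, the following is a minimal Python sketch of the algorithm outlined in the abstract: a (1+1)-ES with a smoothed success-rule for the step size and a rank-one update applied directly to a Cholesky factor A of the covariance matrix (C = A A^T), so that no decomposition of C is ever computed. The constants (p_target, c_p, d, c_cov), the simplified update conditions, and the sphere objective in the usage line are assumptions for illustration, not copied from the paper.

import numpy as np

def one_plus_one_cma(f, x0, sigma0=1.0, max_evals=10000):
    # Sketch of a (1+1)-CMA-ES with success-rule step-size control and an
    # incremental Cholesky-factor update (assumed default constants).
    n = len(x0)
    p_target = 2.0 / 11.0         # target success probability
    c_p = 1.0 / 12.0              # success-probability averaging constant
    d = 1.0 + n / 2.0             # step-size damping
    c_cov = 2.0 / (n ** 2 + 6.0)  # covariance learning rate

    x = np.asarray(x0, dtype=float)
    fx = f(x)
    sigma = sigma0
    A = np.eye(n)                 # Cholesky factor, C = A @ A.T
    p_succ = p_target             # smoothed success probability

    for _ in range(max_evals):
        z = np.random.randn(n)
        y = x + sigma * (A @ z)
        fy = f(y)
        success = fy <= fx

        # smoothed 1/5-th-style success rule for the step size
        p_succ = (1.0 - c_p) * p_succ + c_p * float(success)
        sigma *= np.exp((p_succ - p_target) / ((1.0 - p_target) * d))

        if success:
            x, fx = y, fy
            # rank-one covariance update C <- (1-c_cov) C + c_cov (Az)(Az)^T,
            # performed directly on the factor A in O(n^2)
            z2 = float(z @ z)
            a = np.sqrt(1.0 - c_cov)
            b = (a / z2) * (np.sqrt(1.0 + c_cov / (1.0 - c_cov) * z2) - 1.0)
            A = a * A + b * np.outer(A @ z, z)
    return x, fx

# usage example on a sphere function (illustrative)
xbest, fbest = one_plus_one_cma(lambda v: float(v @ v), np.ones(10))
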

Cited by 147 publications (116 citation statements)
References 16 publications (21 reference statements)

Citation statements, ordered by relevance:
“…The restart variant of CMA-ES with iteratively increasing population size (IPOP-CMA-ES) [7] can be considered a parameter-free CMA-ES. The (1+1)-variant of CMA-ES establishes a direct link to GaA by combining the classical (1+1)-ES with a rank-one update of the covariance matrix [8]. In the next subsection, we show that Gaussian Adaptation has been designed in a similar spirit, yet grounding its theoretical justification on a different foundation.…”
Section: Evolution Strategies With Covariance Matrix Adaptation (mentioning)
confidence: 97%
“…For N_C → ∞, the covariance stays completely isotropic and GaA becomes equivalent to the (1+1)-ES with a P_th-success rule. Keeping N_C finite results in an algorithm that is almost equivalent to the (1+1)-CMA-ES [8]. Slight differences, however, remain in deciding when to update the covariance and how to adapt the step size.…”
Section: Gaussian Adaptation (mentioning)
confidence: 99%
“…The last 3SOME variant we propose in this paper replaces the middle distance exploration with the (1+1)-CMA-ES algorithm presented in [6]. The latter algorithm combines a classic (1+1)-ES scheme with an improved Covariance Matrix Adaptation mechanism [3], where an incremental update of the covariance matrix Cholesky factors is performed instead of computing the Cholesky decomposition.…”
Section: 3SOME With (1+1) Covariance Matrix Adaptation Evolution Strategy (mentioning)
confidence: 99%
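
The incremental update of the Cholesky factors mentioned in the citation above can be checked numerically in a few lines: updating the factor A directly should reproduce the explicitly updated covariance (1 - c_cov) C + c_cov v v^T with v = A z, without any O(n^3) decomposition. The dimension and the c_cov value below are illustrative assumptions, and the updated factor is in general not triangular, only a valid square root of C.

import numpy as np

rng = np.random.default_rng(0)
n, c_cov = 5, 0.1                                          # illustrative values
A = np.tril(rng.standard_normal((n, n))) + n * np.eye(n)   # some factor with C = A A^T
C = A @ A.T
z = rng.standard_normal(n)
v = A @ z

# O(n^2) incremental update of the factor
z2 = float(z @ z)
a = np.sqrt(1.0 - c_cov)
b = (a / z2) * (np.sqrt(1.0 + c_cov / (1.0 - c_cov) * z2) - 1.0)
A_new = a * A + b * np.outer(v, z)

# reference: explicit covariance update (a fresh decomposition of C_new would cost O(n^3))
C_new = (1.0 - c_cov) * C + c_cov * np.outer(v, v)
print(np.allclose(A_new @ A_new.T, C_new))                 # True
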
“…Other relevant algorithms based on the CMA have been proposed in literature, e.g. [4], [5], and [6].…”
mentioning
confidence: 99%
“…For N_C → ∞, the covariance remains isotropic and GaA becomes equivalent to the (1+1)-ES with a P_th-success rule. Keeping N_C finite results in an algorithm that is almost equivalent to the (1+1)-CMA-ES [12]. Four key differences, however, remain.…”
Section: Strategy Parameters Of GaA (mentioning)
confidence: 99%