2010
DOI: 10.1007/s00453-010-9403-3
Log-Linear Convergence and Divergence of the Scale-Invariant (1+1)-ES in Noisy Environments

Abstract: Noise is present in many real-world continuous optimization problems. Stochastic search algorithms such as Evolution Strategies (ESs) have been proposed as effective search methods in such contexts. In this paper, we provide a mathematical analysis of the convergence of a (1 + 1)-ES on unimodal spherical objective functions in the presence of noise. We prove for a multiplicative noise model that for a positive expected value of the noisy objective function, convergence or divergence happens depending on the in…
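The setting in the abstract can be illustrated with a short sketch: a scale-invariant (1+1)-ES minimizing the sphere function under a multiplicative Gaussian noise model. This is an illustrative toy implementation, not the paper's analysis; all parameter values (dimension, step-size ratio, noise scale, iteration count) are assumptions chosen for the sketch:

```python
import math
import random

def noisy_sphere(x, rng, noise_scale=0.3):
    """Sphere fitness ||x||^2 under a multiplicative noise model:
    observed value = f(x) * (1 + noise_scale * N(0, 1)).
    The noise model and scale are illustrative assumptions."""
    f = sum(xi * xi for xi in x)
    return f * (1.0 + noise_scale * rng.gauss(0.0, 1.0))

def scale_invariant_one_plus_one_es(dim=10, sigma_ratio=0.2,
                                    noise_scale=0.3, iterations=3000, seed=1):
    """Scale-invariant (1+1)-ES sketch: the step size is kept proportional
    to the distance ||x|| of the parent from the optimum at 0."""
    rng = random.Random(seed)
    x = [1.0] * dim
    fx = noisy_sphere(x, rng, noise_scale)
    for _ in range(iterations):
        norm = math.sqrt(sum(xi * xi for xi in x))
        sigma = sigma_ratio * norm / dim  # scale-invariant step size
        y = [xi + sigma * rng.gauss(0.0, 1.0) for xi in x]
        fy = noisy_sphere(y, rng, noise_scale)
        if fy <= fx:  # elitist (1+1) selection on the noisy observed values
            x, fx = y, fy
    return math.sqrt(sum(xi * xi for xi in x))
```

Running the sketch with different `noise_scale` values gives a feel for the regime change the paper proves: for mild noise the distance to the optimum shrinks roughly log-linearly, while stronger noise can stall or reverse progress.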

Cited by 31 publications (31 citation statements); references 28 publications (64 reference statements).
“…We also restrict our work to noisy settings in which the noise does not decrease to 0 around the optimum. This constraint makes our work different from [4]. In [5,6] we can find noise models related to ours, but the results presented here are not covered by their analysis.…”
Section: Local Noisy Optimization
confidence: 60%
“…1. Remarks: (i) Informally speaking, our theorem shows that if a scale-invariant algorithm converges in the noise-free case, then it also converges in the noisy case with the exponential resampling rule, at least if the parameters are large enough (a similar effect of constants was pointed out in [4] in a different setting).…”
Section: Theorem 1 Consider the Fitness Function
confidence: 72%
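The exponential resampling rule mentioned in this citation statement averages an increasing number of noisy evaluations per iteration, so the noise variance shrinks geometrically and noise-free behaviour can carry over. A minimal sketch under stated assumptions (the growth base, the toy objective, and the function names are illustrative, not taken from the cited works):

```python
import random

def resampled_fitness(noisy_f, x, iteration, rng, base=2.0):
    """Average an exponentially growing number of noisy evaluations.

    Averaging n i.i.d. evaluations divides the noise variance by n,
    so n = base**iteration drives the residual noise to 0 geometrically."""
    n = int(base ** iteration)
    return sum(noisy_f(x, rng) for _ in range(n)) / n

def noisy_value(x, rng, noise=0.5):
    # toy 1-D objective: x^2 plus additive Gaussian noise (assumed model)
    return x * x + noise * rng.gauss(0.0, 1.0)
```

For example, at iteration 10 with base 2 the rule takes 1024 samples, reducing the noise standard deviation by a factor of 32 relative to a single evaluation.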
“…It has been shown that they are naturally robust to actuator noise (see [14,6]), while other studies consider noisy fitness values, with noise models such as additive or multiplicative noise, as in [13]. In the work presented here, we focus on noisy functions whose fitness values are perturbed by additive noise with constant variance over the whole domain.…”
Section: State of the Art
confidence: 99%