Proceedings of the Genetic and Evolutionary Computation Conference 2017
DOI: 10.1145/3071178.3071317
Improved runtime bounds for the univariate marginal distribution algorithm via anti-concentration

Abstract: Unlike traditional evolutionary algorithms which produce offspring via genetic operators, Estimation of Distribution Algorithms (EDAs) sample solutions from probabilistic models which are learned from selected individuals. It is hoped that EDAs may improve optimisation performance on epistatic fitness landscapes by learning variable interactions. However, hardly any rigorous results are available to support claims about the performance of EDAs, even for fitness functions without epistasis. The expected runtime…
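As a rough illustration of how the algorithms discussed in the abstract operate, here is a minimal Python sketch of the univariate marginal distribution algorithm (UMDA) for maximising a pseudo-Boolean function over {0,1}^n. The population sizes, generation limit and function names are illustrative choices, not the exact parameterisation analysed in the paper.

```python
import random

def umda(fitness, n, lam=100, mu=50, max_gens=1000):
    """Minimal UMDA sketch: maximise a pseudo-Boolean function over {0,1}^n.

    Each generation samples lam individuals bit-wise independently from the
    current frequency vector, selects the mu fittest, and resets every
    frequency to the empirical marginal of the selected set, clamped to
    [1/n, 1 - 1/n] so that no bit position fixates permanently.
    """
    p = [0.5] * n                      # initial marginal frequencies
    best, best_fit = None, float("-inf")
    for _ in range(max_gens):
        pop = [[1 if random.random() < p[i] else 0 for i in range(n)]
               for _ in range(lam)]    # sample from the product distribution
        pop.sort(key=fitness, reverse=True)
        if fitness(pop[0]) > best_fit:
            best, best_fit = pop[0], fitness(pop[0])
        selected = pop[:mu]
        for i in range(n):
            freq = sum(x[i] for x in selected) / mu
            p[i] = min(max(freq, 1.0 / n), 1.0 - 1.0 / n)
    return best, best_fit
```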

Cited by 31 publications (25 citation statements)
References 20 publications
“…While rigorous runtime analyses provide deep insights into the performance of randomised search heuristics, they are highly challenging even for simple algorithms on toy functions. Most current runtime results merely concern univariate EDAs on functions like OneMax [32,51,36,53,40], LeadingOnes [15,22,37,53,38], BinVal [52,37] and Jump [26,11,12], hoping that this provides valuable insights into the development of new techniques for analysing multivariate variants of EDAs and the behaviour of such algorithms on easy parts of more complex problem spaces [13]. There are two main reasons accounting for this.…”
Citation type: mentioning (confidence: 99%)
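The functions named in this citation statement are standard pseudo-Boolean benchmarks. As a point of reference, the sketch below gives their common textbook definitions (for a bit string x of length n, with a gap parameter k for Jump); these definitions are not taken verbatim from the cited works.

```python
def one_max(x):
    """OneMax: number of one-bits; maximised by the all-ones string."""
    return sum(x)

def leading_ones(x):
    """LeadingOnes: length of the longest all-ones prefix."""
    count = 0
    for bit in x:
        if bit != 1:
            break
        count += 1
    return count

def bin_val(x):
    """BinVal: interpret the bit string as a binary number (x[0] most significant)."""
    return sum(bit << (len(x) - 1 - i) for i, bit in enumerate(x))

def jump(x, k):
    """Jump_k: OneMax with a fitness gap of width k just before the optimum."""
    n, ones = len(x), sum(x)
    if ones <= n - k or ones == n:
        return k + ones
    return n - ones
```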
“…The OneMax function class is used to analyze how well an EDA performs as a hill climber. The usual expected runtime of an EDA on this function is Θ(n log n) [36,38,63,66].…”
Section: Common Fitness Functions
Citation type: mentioning (confidence: 99%)
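Purely as an illustration of the hill-climbing behaviour referred to here, the hypothetical snippet below runs the umda() and one_max() sketches from the earlier code examples on a small instance; the parameter values are arbitrary, and the run is not a reproduction of the Θ(n log n) analysis.

```python
# Illustrative only: relies on the umda() and one_max() sketches defined above.
n = 50
best, best_fit = umda(one_max, n, lam=100, mu=50, max_gens=200)
print(f"best OneMax value found: {best_fit} / {n}")
```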
“…Recently, two independent improvements of the bound were presented. The first one due to Lehre and Nguyen [38] builds on a refinement of the level-based analysis, carefully using properties of the Poisson-binomial distribution, and is summarized by the following theorem. We emphasize that UMDA always refers to the algorithm with borders 1/n and 1 − 1/n on the frequencies, i.e., Algorithm 2 extended by a step that narrows all frequencies down to the interval [1/n, 1 − 1/n].…”
Section: Upper Bounds for OneMax
Citation type: mentioning (confidence: 99%)
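To make the borders and the role of the Poisson-binomial distribution mentioned in this statement concrete, here is a small sketch (using the same frequency-vector representation as the UMDA example above): every frequency is restricted to [1/n, 1 − 1/n], and an offspring sampled bit-wise independently from the frequency vector has a number of one-bits that follows a Poisson-binomial distribution with parameters p[0], …, p[n-1].

```python
import random

def clamp_frequencies(p, n):
    """Restrict every marginal frequency to the interval [1/n, 1 - 1/n]."""
    return [min(max(p_i, 1.0 / n), 1.0 - 1.0 / n) for p_i in p]

def sample_offspring(p):
    """Sample one bit string bit-wise independently from frequency vector p.

    The number of one-bits in the result is Poisson-binomially distributed
    with success probabilities p[0], ..., p[len(p)-1].
    """
    return [1 if random.random() < p_i else 0 for p_i in p]
```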