Proceedings of the Genetic and Evolutionary Computation Conference 2018
DOI: 10.1145/3205455.3205553

Significance-based estimation-of-distribution algorithms

Abstract: Estimation-of-distribution algorithms (EDAs) are randomized search heuristics that maintain a probabilistic model of the solution space. This model is updated from iteration to iteration, based on the quality of the solutions sampled from the model. As previous works show, this short-term perspective can lead to erratic updates of the model, in particular to bit frequencies approaching a random boundary value. Such frequencies take a long time to be moved back to the middle range, leading to significant perf…
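To make the model-update loop concrete, the following is a minimal sketch of a univariate EDA in the spirit of the UMDA, run on OneMax. The function names, parameter choices, and the border values 1/n and 1 - 1/n are illustrative assumptions, not the algorithm analysed in the paper.

```python
import random

def one_max(x):
    """Count the number of 1-bits; maximised by the all-ones string."""
    return sum(x)

def umda(n, lam=100, mu=25, max_evals=10**6, rng=random):
    """Minimal univariate EDA (UMDA-style) sketch.

    Maintains one frequency p[i] per bit position, sets it to the
    empirical mean of the mu best samples, and clamps it to the
    borders [1/n, 1 - 1/n].  Returns the number of function
    evaluations until the optimum is sampled (the 'optimisation time').
    """
    p = [0.5] * n                     # initial model: uniform distribution
    evals = 0
    while evals < max_evals:
        pop = [[int(rng.random() < p[i]) for i in range(n)] for _ in range(lam)]
        evals += lam
        pop.sort(key=one_max, reverse=True)
        if one_max(pop[0]) == n:      # optimum found
            return evals
        best = pop[:mu]               # truncation selection
        for i in range(n):            # frequency update from the mu best
            freq = sum(x[i] for x in best) / mu
            p[i] = min(max(freq, 1 / n), 1 - 1 / n)  # clamp to borders
    return evals
```

With small mu, sampling noise in the empirical mean can push the frequency of an individual bit to a border even when the objective gives no signal for that bit, which is the erratic short-term behaviour the abstract refers to.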

Cited by 26 publications (32 citation statements); references 40 publications (60 reference statements).
“…Recently, EDAs have drawn growing attention from the theory community of evolutionary computation [10,17,26,44,46,25,45,27,12,31]. The aim of theoretical analyses of EDAs is, in general, to gain insight into the behaviour of the algorithms when optimising an objective function, especially in terms of the optimisation time, that is, the number of function evaluations required by the algorithm until an optimal solution has been found for the first time.…”
Section: Introduction
confidence: 99%
“…While rigorous runtime analyses provide deep insights into the performance of randomised search heuristics, they are highly challenging even for simple algorithms on toy functions. Most current runtime results concern univariate EDAs on functions like OneMax [32,51,36,53,40], LeadingOnes [15,22,37,53,38], BinVal [52,37] and Jump [26,11,12], in the hope that this provides valuable insights towards techniques for analysing multivariate variants of EDAs and the behaviour of such algorithms on easy parts of more complex problem spaces [13]. There are two main reasons for this.…”
confidence: 99%
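For reference, here are the four benchmark functions named in that statement, in a minimal Python sketch. The conventions (bit strings as 0/1 lists, Jump with a gap parameter k) follow the standard pseudo-Boolean definitions from the runtime-analysis literature; this is a reconstruction, not code from the cited works.

```python
def one_max(x):
    """OneMax: number of 1-bits; optimum is the all-ones string."""
    return sum(x)

def leading_ones(x):
    """LeadingOnes: length of the longest prefix of 1-bits."""
    count = 0
    for bit in x:
        if bit == 0:
            break
        count += 1
    return count

def bin_val(x):
    """BinVal: the bit string read as a binary number (x[0] is the
    most significant bit), so each bit dominates all later ones."""
    n = len(x)
    return sum(x[i] * 2 ** (n - 1 - i) for i in range(n))

def jump(x, k):
    """Jump_k: like OneMax shifted by k, but with a fitness valley
    of width k just below the optimum that must be 'jumped' over."""
    n, ones = len(x), sum(x)
    if ones <= n - k or ones == n:
        return k + ones
    return n - ones
```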
“…It has been observed that the stability of an algorithm can lead to difficulties when solving problems in which the fitness gives only a weak signal about the right value of a bit position. In [DK18a], it was proven that the scGA, a version of the cGA artificially made stable, has a runtime of exp(Ω(min{n, K})) on the OneMax benchmark function when the hypothetical population size is K. For the convex search algorithm (CSA), an at least super-polynomial runtime was shown for the optimization of OneMax [DK18b].…”
Section: OneMax
confidence: 99%
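To anchor the role of the hypothetical population size K mentioned above, here is a minimal sketch of the standard (unstable) cGA on OneMax. The step size 1/K and the borders 1/n and 1 - 1/n follow the usual textbook formulation; the scGA's stabilising modification from [DK18a] is deliberately not reproduced here.

```python
import random

def cga(n, K, max_iters=10**6, rng=random):
    """Compact genetic algorithm (cGA) sketch on OneMax.

    Each iteration samples two offspring, lets the fitter one win,
    and shifts each frequency by 1/K towards the winner's bit value.
    K acts as the 'hypothetical population size': larger K means
    smaller, more conservative model updates.
    """
    p = [0.5] * n
    for _ in range(max_iters):
        x = [int(rng.random() < p[i]) for i in range(n)]
        y = [int(rng.random() < p[i]) for i in range(n)]
        if sum(y) > sum(x):           # OneMax comparison
            x, y = y, x               # x is now the winner
        if sum(x) == n:
            return p                  # optimum sampled
        for i in range(n):
            if x[i] != y[i]:          # update only where samples disagree
                step = 1 / K if x[i] == 1 else -1 / K
                p[i] = min(max(p[i] + step, 1 / n), 1 - 1 / n)
    return p
```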
“…Analogous results hold for the optimization of the BinaryValue function (Theorem 24 and Lemma 25). Although stability is generally a desirable property (see [FKK16,DK18a] for examples of how stable EDAs can outperform common EDAs, which are all unstable), it can make it hard to find the optimal values of decision variables that have only a small influence on the objective function. We take the OneMax function as an example and prove that the expected runtime is at least exponential in the dimension when we initialize the population by setting each bit to 1 with probability 0.6 (Theorem 27).…”
Section: Introduction
confidence: 99%