2018
DOI: 10.1109/tevc.2017.2724201
Escaping Local Optima Using Crossover With Emergent Diversity

Abstract: Population diversity is essential for avoiding premature convergence in Genetic Algorithms (GAs) and for the effective use of crossover. Yet the

Cited by 143 publications (94 citation statements) · References 42 publications
“…Now, with overwhelming probability the number of global optima is bounded by the number of search points with Hamming distance less than n^δ / log^3 n from x*. By (8), this number is exp(o(n^δ / log n)).…”

Section: Black-Box Complexity Lower Bounds for Functions With Many Optima
confidence: 99%
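The counting argument above bounds the number of global optima by the size of a Hamming ball around x*. As a sanity check, the number of bit strings within Hamming distance less than r of a fixed point is simply a sum of binomial coefficients; a minimal sketch (the function name is ours, not from the cited paper):

```python
from math import comb

def hamming_ball_size(n, r):
    # Number of length-n bit strings at Hamming distance < r
    # from a fixed reference point x*: sum_{i=0}^{r-1} C(n, i).
    return sum(comb(n, i) for i in range(r))
```

For r = n^δ / log^3 n this ball is of size exp(o(n^δ / log n)), which is what the quoted bound uses.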
“…Example applications include functions with exp(o(n^δ / log n)) local optima, including those where all local optima have at most n^δ / log^3 n ones or at most n^δ / log^3 n zeros. The latter function class includes the well-known Jump_k functions [8,26], where a gap of Hamming distance k has to be "jumped" to reach a global optimum, with parameter k ≤ n^δ / log^3 n: here all search points with k zeros are local optima, in addition to the global optimum 1^n. A similar function class Cliff_d was used in [5,37,50], where the same holds for d in lieu of k; the difference between these two functions is that in the region "between" local and global optima Jump_k has a gradient pointing back towards the local optima whereas Cliff_d points towards the global optimum 1^n.…”

Section: Lower Bounds on the Time To Reach Local Optima
confidence: 99%
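The contrast between Jump_k and Cliff_d described above can be sketched with the standard textbook definitions of these benchmark functions (exact constants vary slightly between papers; the shape of the gap region is what matters):

```python
def jump(x, k):
    # Jump_k: points with at most n-k ones (and the optimum 1^n)
    # get fitness k + |x|_1; in the gap of width k the fitness is
    # n - |x|_1, so the gradient points BACK toward the local optima.
    n, ones = len(x), sum(x)
    if ones <= n - k or ones == n:
        return k + ones
    return n - ones

def cliff(x, d):
    # Cliff_d: same local optima at n-d ones, but after the "cliff"
    # the fitness is |x|_1 - d + 1/2, which still increases with the
    # number of ones, so the gradient points TOWARD the optimum 1^n.
    n, ones = len(x), sum(x)
    if ones <= n - d:
        return ones
    return ones - d + 0.5
```

On Jump_k a hill-climber in the gap is pushed back to the plateau of local optima, whereas on Cliff_d it keeps climbing toward 1^n once the cliff has been crossed, which is why the two classes separate the operators discussed in these citations.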
“…Recently it has been shown that both the hypermutations with mutation potential and the ageing operator of Opt-IA can lead to considerable speed-ups over well-studied EAs using SBM on standard benchmark functions from the evolutionary computation community, such as Jump, Cliff, or Trap [11]. While the performance of hypermutation operators at escaping the local optima of these functions is comparable to that of EAs with the high mutation rates that have been gaining popularity since 2009 [12,13,14,15,16], ageing allows the optimisation of hard instances of Cliff in O(n log n), where n is the problem size. Such a runtime is required by all unbiased unary (mutation-based) randomised search heuristics to optimise any function with a unique optimum [17].…”

Section: Introduction
confidence: 99%