Proceedings of the Genetic and Evolutionary Computation Conference 2017
DOI: 10.1145/3071178.3071288
On the runtime analysis of generalised selection hyper-heuristics for pseudo-Boolean optimisation

Abstract: Selection hyper-heuristics are randomised search methodologies which choose and execute heuristics from a set of low-level heuristics. Recent time complexity analyses for the LeadingOnes benchmark function have shown that the standard simple random, permutation, random gradient, greedy and reinforcement learning selection mechanisms show no effects of learning. The idea behind the learning mechanisms is to continue to exploit the currently selected heuristic as long as it is successful. However, the probability that a pr…
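The abstract describes selection hyper-heuristics as loops that repeatedly pick one low-level heuristic and apply it. A minimal sketch of the simplest such mechanism (simple random) on LeadingOnes is given below; the function and operator names are illustrative, and the acceptance rule (keep offspring that are at least as fit) is an assumption, not taken from the paper.

```python
import random

def leading_ones(x):
    """LeadingOnes fitness: number of consecutive 1-bits at the start of x."""
    count = 0
    for bit in x:
        if bit == 0:
            break
        count += 1
    return count

def flip_k_bits(x, k):
    """Low-level heuristic: flip k distinct, uniformly chosen bit positions."""
    y = list(x)
    for i in random.sample(range(len(y)), k):
        y[i] ^= 1
    return y

def simple_random_hh(n, low_level=(1, 2)):
    """Simple-random selection hyper-heuristic on LeadingOnes.

    Each iteration draws one low-level heuristic uniformly at random,
    applies it, and accepts the offspring if it is at least as fit.
    Returns the number of fitness evaluations until the all-ones
    optimum is reached.
    """
    x = [random.randint(0, 1) for _ in range(n)]
    fx = leading_ones(x)
    evals = 0
    while fx < n:
        k = random.choice(low_level)   # uniform choice: no learning at all
        y = flip_k_bits(x, k)
        fy = leading_ones(y)
        evals += 1
        if fy >= fx:                   # elitist acceptance (assumed)
            x, fx = y, fy
    return evals
```

Because the operator choice never depends on past success, this mechanism exhibits exactly the "no effect of learning" behaviour the abstract refers to.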

Cited by 31 publications (37 citation statements) · References 20 publications (36 reference statements)
“…Consequently, this new hyper-heuristic outperforms the previously investigated ones when c is large enough. For c tending to infinity, its performance approaches the best-possible performance that can be obtained from the two mutation operators, which is, as also shown in [LOW17], (ln(2)+1)/4 · n² + o(n²). The following variant of this result appeared in the preprint [LOW18].…”
Section: Beyond Mixing: Advanced Selection Mechanisms
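The statement above concerns a hyper-heuristic with a learning parameter c. A sketch of such a learning-period mechanism (in the spirit of the generalised random gradient idea discussed in this literature) is given below: an operator is kept for up to τ = c·n steps and retained whenever it produces a strict improvement within that window. All names, the window rule, and the acceptance criterion are illustrative assumptions, not the authors' exact definition.

```python
import random

def leading_ones(x):
    """LeadingOnes fitness: number of consecutive leading 1-bits."""
    n = 0
    for b in x:
        if b == 0:
            break
        n += 1
    return n

def flip_k_bits(x, k, rng):
    """Flip k distinct, uniformly chosen bit positions of x."""
    y = list(x)
    for i in rng.sample(range(len(y)), k):
        y[i] ^= 1
    return y

def generalised_random_gradient(n, c=2.0, operators=(1, 2), seed=0):
    """Learning-period selection sketch on LeadingOnes.

    An operator is drawn uniformly at random and then exploited: if it
    yields a strict improvement within tau = c*n steps, the window
    restarts with the same operator; otherwise a fresh operator is
    drawn. Returns fitness evaluations until the all-ones optimum.
    """
    rng = random.Random(seed)
    tau = int(c * n)
    x = [rng.randint(0, 1) for _ in range(n)]
    fx = leading_ones(x)
    evals = 0
    while fx < n:
        k = rng.choice(operators)      # exploration: draw a new operator
        improved = True
        while improved and fx < n:
            improved = False
            for _ in range(tau):       # exploitation window of tau steps
                y = flip_k_bits(x, k, rng)
                fy = leading_ones(y)
                evals += 1
                if fy > fx:            # success: keep exploiting this operator
                    x, fx = y, fy
                    improved = True
                    break
    return evals
```

Larger c makes the mechanism more patient with the current operator, which is the intuition behind its performance approaching the best mix of the two operators as c grows.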
“…Given that the probabilities to find a true improvement are very low in this discrete optimization problem, one would expect that the four selection mechanisms all use the two operators in a very balanced manner and thus lead to very similar running times. This is indeed the first set of results in the remarkable work of Lissovoi, Oliveto, and Warwicker [LOW17]. Building on the precise analysis method of [BDN10] instead of the fitness level method, they show that the expected running time for all four selection mechanisms is (1/2)ln(3) · n² + o(n²) ≈ 0.549n².…”
Section: Beyond Mixing: Advanced Selection Mechanisms
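The ≈ 0.549n² constant for balanced operator use can be checked empirically with a toy Monte Carlo run. The sketch below simulates the simple-random mechanism choosing uniformly between a 1-bit and a 2-bit flip on LeadingOnes and compares the empirical constant to ln(3)/2; the elitist acceptance rule is an assumption, and at moderate n the lower-order o(n²) terms mean the match is only approximate.

```python
import math
import random

def leading_ones(x):
    """LeadingOnes fitness: number of consecutive leading 1-bits."""
    n = 0
    for b in x:
        if b == 0:
            break
        n += 1
    return n

def run_simple_random(n, rng):
    """One run choosing uniformly between a 1-bit and a 2-bit flip,
    accepting offspring that are no worse; returns evaluations used."""
    x = [rng.randint(0, 1) for _ in range(n)]
    fx = leading_ones(x)
    evals = 0
    while fx < n:
        k = rng.choice((1, 2))
        y = list(x)
        for i in rng.sample(range(n), k):
            y[i] ^= 1
        fy = leading_ones(y)
        evals += 1
        if fy >= fx:
            x, fx = y, fy
    return evals

rng = random.Random(42)
n, runs = 60, 20
mean = sum(run_simple_random(n, rng) for _ in range(runs)) / runs
print(f"empirical constant: {mean / n**2:.3f}  vs  ln(3)/2 = {math.log(3)/2:.3f}")
```

With larger n and more runs the empirical ratio concentrates around ln(3)/2 ≈ 0.549, matching the leading term proved in [LOW17].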
“…Before describing this algorithm in detail, we note that already allowing 1- and 2-bit flips (i.e., mut with step sizes 1 and 2) decreases the optimal (1 + o(1)) n²/2 expected optimization time of static unary unbiased algorithms to about 0.4233n² [LOW17] (the fact that mut is defined slightly differently in [LOW17] has a negligible impact on this result). This running time can be further reduced by allowing larger step sizes.…”
Section: LeadingOnes