Proceedings of the 2020 Genetic and Evolutionary Computation Conference (GECCO 2020)
DOI: 10.1145/3377930.3390189

Towards dynamic algorithm selection for numerical black-box optimization

Abstract: One of the most challenging problems in evolutionary computation is to select from its family of diverse solvers one that performs well on a given problem. This algorithm selection problem is complicated by the fact that different phases of the optimization process require different search behavior. While this can partly be controlled by the algorithm itself, there exist large differences between the algorithms' performance. It can therefore be beneficial to swap the configuration or even the entire algorithm during the run…

Cited by 17 publications (20 citation statements)
References 36 publications

“…We have investigated in this work possibilities to leverage existing benchmark data to derive switch-once dynamic algorithm selection policies. While for some cases the "theoretical" approach suggested in [4] could indeed predict combinations that outperformed the best static solver, the results are less positive for others. One obstacle that hinders an accurate performance prediction is the presence of local optima: when the first algorithm is very good at converging to a local optimum, it is likely to be chosen as A1.…”
Section: Future Work
confidence: 91%
“…Following the approach suggested in [4] we compute a "theoretical" ERT value for all combinations (A1, A2, φ_switch), where A1 is the first algorithm, A2 the second, and φ_switch the target value at which we switch from A1 to A2. To this end, we simply compute ERT(A1, f, φ_switch) + ERT(A2, f, φ_final) − ERT(A2, f, φ_switch), where all these ERT values are based on the performance records provided in [7].…”
Section: Informed 1-switch Dynamic Algorithm Selection
confidence: 99%
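To make the quoted computation concrete, the following sketch (Python, not taken from the cited papers) enumerates the "theoretical" ERT of all 1-switch combinations and keeps the best one. It assumes the benchmark data from [7] has already been aggregated into a dictionary mapping (algorithm, target) pairs to ERT values; the names theoretical_ert and best_switch_combination, the solver names, and the numbers are purely illustrative.

import math
from itertools import product

def theoretical_ert(ert, a1, a2, phi_switch, phi_final):
    # Additive model from the quoted passage: run a1 until phi_switch is reached,
    # then a2 from phi_switch down to the final target phi_final.
    return ert[(a1, phi_switch)] + ert[(a2, phi_final)] - ert[(a2, phi_switch)]

def best_switch_combination(ert, algorithms, switch_targets, phi_final):
    # Enumerate all (a1, a2, phi_switch) combinations and keep the one with
    # the smallest theoretical ERT.
    best, best_value = None, math.inf
    for a1, a2, phi_switch in product(algorithms, algorithms, switch_targets):
        value = theoretical_ert(ert, a1, a2, phi_switch, phi_final)
        if value < best_value:
            best, best_value = (a1, a2, phi_switch), value
    return best, best_value

# Toy ERT values (function evaluations) for two hypothetical solvers on one
# function at three precision targets; real values would come from BBOB data.
ert = {
    ("CMA-ES", 1e-1): 2000,  ("CMA-ES", 1e-4): 9000,  ("CMA-ES", 1e-8): 40000,
    ("DE",     1e-1): 6000,  ("DE",     1e-4): 12000, ("DE",     1e-8): 25000,
}

combo, value = best_switch_combination(ert, ["CMA-ES", "DE"], [1e-1, 1e-4], 1e-8)
print(combo, value)  # ('CMA-ES', 'DE', 0.1) 21000

In this toy setting the model favors running the solver that reaches the coarse target quickly and then handing over to the one that is cheaper at the fine target, which is exactly the kind of combination the quoted passage evaluates against the best static solver.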
“…To solve them, the most appropriate optimization algorithm should be selected and its hyper-parameters should be set. These are the well-known problems of algorithm selection (AS) [1], [2], [3] and algorithm configuration (AC) [4], respectively. The prerequisite for achieving this is automated algorithm performance prediction.…”
Section: Introduction
confidence: 99%
“…The dynAS approach is expected to unlock the potential benefit of switching among different algorithms online. Related work has been performed on numerical black-box optimization [25]. Based on the rich BBOB data set [14], [25] investigates the potential improvement that can be achieved by switching between solvers.…”
Section: Introduction
confidence: 99%