2021
DOI: 10.48550/arxiv.2105.08541
Preprint

DACBench: A Benchmark Library for Dynamic Algorithm Configuration

Abstract: Dynamic Algorithm Configuration (DAC) aims to dynamically control a target algorithm's hyperparameters in order to improve its performance. Several theoretical and empirical results have demonstrated the benefits of dynamically controlling hyperparameters in domains like evolutionary computation, AI planning, or deep learning. Replicating these results, as well as studying new methods for DAC, however, is difficult, since existing benchmarks are often specialized and incompatible with each other. To fac…
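Concretely, DAC casts hyperparameter control as a sequential decision problem: at each step, a controller observes the target algorithm's state and sets hyperparameter values for the next step. The sketch below shows what this loop looks like through the gym-style interface DACBench provides; `LubyBenchmark` and `get_environment()` follow the library's documented usage pattern, but treat the exact names and the older 4-tuple `step()` signature as assumptions that may differ across versions. The random policy is a placeholder for a learned controller.

```python
# Minimal sketch of a DAC control loop via a gym-style interface.
# Assumes DACBench's documented pattern (benchmark object -> environment);
# LubyBenchmark / get_environment() and the old-gym step signature are
# assumptions here and may vary by DACBench version.
from dacbench.benchmarks import LubyBenchmark

bench = LubyBenchmark()
env = bench.get_environment()

state = env.reset()
done = False
total_reward = 0.0
while not done:
    # A real DAC controller (e.g. an RL agent) would map `state` to a
    # hyperparameter setting; here we sample uniformly as a placeholder.
    action = env.action_space.sample()
    state, reward, done, info = env.step(action)
    total_reward += reward

print(f"Episode return: {total_reward}")
```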

Cited by 5 publications (3 citation statements) · References 20 publications (37 reference statements)
“…Looking at Context sets, we see that PCG is heavily used in ZSG environments, featuring in 21 (38%) environments. Many environments combine PCG components with controllable variation (Chevalier-Boisvert et al., 2019; Côté et al., 2019; Juliani et al., 2019; Li et al., 2021b; Team et al., 2021; Xue et al., 2021; Fortunato et al., 2019; Eimer et al., 2021; Bapst et al., 2019). Most environments have several different kinds of factors of variation within their context set.…”
Section: Trends in Environments
confidence: 99%
“…when adapting strategy parameters of CMA-ES [30, 33]. Because of this, most approaches for DAC are based on reinforcement learning [3]. However, the problem of DAC, and dynAS more specifically, can also be viewed as a hyperparameter optimization problem.…”
Section: DynAS
confidence: 99%
“…of varying complexity (number of variables and problem instances) for tasks of runtime or quality configuration. For DAC, the DACBench has been proposed (Eimer et al., 2021), although this does not support DAC settings envisioned, e.g., by hyper-reactive search. As an alternative to such libraries, AC methods can also be benchmarked by using surrogate models that are trained on test instances in advance, resulting in cheaper evaluations when testing (Eggensperger et al., 2018).…”
Section: Novel Benchmarks
confidence: 99%
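For context on the surrogate-based alternative mentioned in the statement above (Eggensperger et al., 2018): the idea is to fit a regression model once on logged (configuration, instance) → performance data, and then let configurators query the cheap model instead of running the target algorithm. Below is a minimal sketch using scikit-learn; the synthetic data and the feature layout are illustrative assumptions, not the cited setup.

```python
# Sketch of surrogate-based AC benchmarking: fit a regressor on logged
# (configuration parameters + instance features) -> runtime data, then use
# cheap model predictions in place of real algorithm runs. The data here is
# synthetic; the cited work trains surrogates on real recorded run data.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

# Logged evaluations: 2 configuration parameters + 3 instance features.
X = rng.uniform(size=(500, 5))
y = 10.0 * X[:, 0] + np.sin(5 * X[:, 3]) + rng.normal(scale=0.1, size=500)

surrogate = RandomForestRegressor(n_estimators=100, random_state=0).fit(X, y)

# A configurator can now "evaluate" a candidate configuration on an
# instance via a model prediction instead of an expensive real run.
candidate = rng.uniform(size=(1, 5))
predicted_runtime = surrogate.predict(candidate)[0]
print(f"Predicted runtime: {predicted_runtime:.3f}")
```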