Proceedings of the 2020 Genetic and Evolutionary Computation Conference
DOI: 10.1145/3377930.3389831
Integrated vs. sequential approaches for selecting and tuning CMA-ES variants

Abstract: When faced with a specific optimization problem, deciding which algorithm to apply is always a difficult task. Not only is there a vast variety of algorithms to select from, but these algorithms are often controlled by many hyperparameters, which need to be suitably tuned in order to achieve peak performance. Usually, the problem of selecting and configuring the optimization algorithm is addressed sequentially, by first selecting a suitable algorithm and then tuning it for the application at hand. Integrated a…
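To make the sequential vs. integrated distinction concrete, the sketch below contrasts the two under a common evaluation budget. It is a minimal illustration only: the variant names, hyperparameter ranges, and the evaluate(algorithm, config) callable (assumed to return a loss to be minimized) are hypothetical placeholders, not the paper's actual experimental setup.

```python
import random

# Hypothetical search space: each CMA-ES variant exposes its own hyperparameters.
# Variant names and ranges are illustrative only, not taken from the paper.
SPACE = {
    "cma-default":  {"population_size": (4, 64),  "c_sigma": (0.01, 1.0)},
    "cma-mirrored": {"population_size": (4, 64),  "c_sigma": (0.01, 1.0)},
    "cma-ipop":     {"population_size": (8, 128), "restart_factor": (1.5, 3.0)},
}

def default(ranges):
    """Midpoint of each range, standing in for a variant's default configuration."""
    return {k: (lo + hi) / 2 for k, (lo, hi) in ranges.items()}

def sample(ranges):
    """Draw one random configuration from a dict of (low, high) ranges."""
    return {k: random.uniform(lo, hi) for k, (lo, hi) in ranges.items()}

def sequential(evaluate, budget):
    """First select the best variant at its default configuration, then tune only that variant."""
    best_algo = min(SPACE, key=lambda a: evaluate(a, default(SPACE[a])))
    configs = [sample(SPACE[best_algo]) for _ in range(budget - len(SPACE))]
    best_cfg = min(configs, key=lambda c: evaluate(best_algo, c))
    return best_algo, best_cfg

def integrated(evaluate, budget):
    """Search the joint (variant, configuration) space in one pass (CASH-style)."""
    best = None
    for _ in range(budget):
        algo = random.choice(list(SPACE))
        cfg = sample(SPACE[algo])
        score = evaluate(algo, cfg)
        if best is None or score < best[0]:
            best = (score, algo, cfg)
    return best[1], best[2]
```

Under the same budget, the sequential variant commits to a single algorithm after a cheap selection round, while the integrated variant keeps the algorithm choice open for the entire search.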

Cited by 14 publications (12 citation statements)
References 38 publications
“…Our next steps are as follows. First, we will annotate more data sets, starting with those provided by IOHprofiler [1,3] and those of Nevergrad [2]. Further extensions include a web-based GUI, similar to what is shown in Figure 1.…”
Section: Discussion
confidence: 99%
“…Hyperparameter tuning. A second factor of improvement can come from adding hyperparameter tuning into the dynamic process; i.e., when moving from the algorithm selection setting to a dynamic variant of Combined Algorithm Selection and Hyperparameter optimization (CASH [34,39]). A dynamic CASH approach would allow the algorithms to specialize even more, so they can focus on performing as well as possible on their specific part of the optimization process.…”
Section: Discussion and Future Work
confidence: 99%
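The dynamic CASH setting described in this excerpt can be pictured as a per-phase selection loop. The sketch below is an assumption-laden outline: problem.initial_state(), select_and_tune, run_phase, and state.best_fitness are hypothetical stand-ins, not an existing API.

```python
# Sketch of a dynamic CASH loop: the run is split into phases, and a fresh
# (variant, configuration) pair is selected for each phase, so algorithms can
# specialize on their own part of the optimization process.
def dynamic_cash(problem, select_and_tune, run_phase, n_phases, phase_budget):
    state = problem.initial_state()  # hypothetical: e.g. population and best-so-far
    trace = []
    for phase in range(n_phases):
        # Hypothetical helper: pick an algorithm and configuration for this phase.
        algo, cfg = select_and_tune(problem, state, phase_budget)
        # Hypothetical helper: advance the optimization using that choice.
        state = run_phase(algo, cfg, problem, state, phase_budget)
        trace.append((phase, algo, cfg, state.best_fitness))
    return state, trace
```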
“…To properly address the problem of determining the contribution of a single module setting to an existing portfolio of modules, we make use of hyperparameter optimization, which has previously been shown to achieve results comparable to the complete enumeration method, while being much more easily extendable to other hyperparameters [38]. We propose the following roadmap to formalize this procedure, which is designed to be generic, so that it can function with any modular algorithm, hyperparameter tuner, and performance metric:…”
Section: Incremental Assessment of Module Performance
confidence: 99%
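One way to read the roadmap's core step is as a paired tuning experiment: tune the modular algorithm with and without the candidate module setting, then compare the tuned scores. In the sketch below, tune is a placeholder for any hyperparameter tuner (such as irace), assumed to return a (best_config, best_score) pair with lower scores being better; the search_space method is likewise hypothetical.

```python
# Minimal sketch of assessing one module's contribution to a modular algorithm.
# `tune`, `modular_algo`, and `search_space` are hypothetical placeholders.
def module_contribution(modular_algo, module, tune, budget):
    space_without = modular_algo.search_space(exclude=[module])
    space_with = modular_algo.search_space()
    _, score_without = tune(modular_algo, space_without, budget)
    _, score_with = tune(modular_algo, space_with, budget)
    # Positive delta: making the module available improved tuned performance.
    return score_without - score_with
```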
“…Influence and stochasticity of the hyperparameter tuning: While we showed that assessing the impact of an algorithmic component by using a hyperparameter tuning approach provides useful insights, there are several factors which can complicate this approach. Since hyperparameter tuning is a very challenging problem, with many different approaches to solving it, the kind of tuner used will have a large impact on the resulting assessment [38]. In this paper, we used irace, which tends to focus on converging to a single configuration, instead of covering a large set of different solutions.…”
Section: Challenges
confidence: 99%
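A simple way to gauge how much the tuner's own stochasticity colors such an assessment is to repeat the tuning under different random seeds and report the spread. This sketch assumes a seed-aware variant of the hypothetical tune helper from the previous sketch.

```python
import statistics

# Repeat the (stochastic) tuning run under several seeds and report the
# mean and standard deviation of the best scores found. `tune` is the same
# hypothetical (best_config, best_score) helper as above, here accepting a seed.
def repeated_assessment(modular_algo, space, tune, budget, n_repeats=10):
    scores = [tune(modular_algo, space, budget, seed=s)[1] for s in range(n_repeats)]
    return statistics.mean(scores), statistics.stdev(scores)
```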