Proceedings of the Genetic and Evolutionary Computation Conference 2021
DOI: 10.1145/3449639.3459406
The impact of hyper-parameter tuning for landscape-aware performance regression and algorithm selection

Abstract: Automated algorithm selection and configuration methods that build on exploratory landscape analysis (ELA) are becoming very popular in Evolutionary Computation. However, despite a significantly growing number of applications, the underlying machine learning models are often chosen in an ad-hoc manner. We show in this work that three classical regression methods are able to achieve meaningful results for ELA-based algorithm selection. For those three models — random forests, decision trees, and bagging decision …

Cited by 21 publications (7 citation statements). References 48 publications.
“…The performance of our selectors may be improved by tuning the Random Forests or by using alternative regression techniques [11]. Nevertheless, even if the performances of the selectors increase, the possible gain is relatively low.…”
Section: Discussion
confidence: 99%
“…Regression Models for Algorithm Performance Prediction. For the learning process, we considered random forest (RF) regression [2], as it provides promising results for algorithm performance prediction [12] and is one of the most commonly used algorithms for algorithm performance prediction studies in evolutionary computation. The RF algorithm was used as implemented by the scikit-learn package [20] in Python.…”
Section: Experimental Design
confidence: 99%
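The citing study above describes fitting scikit-learn's random-forest regressor to predict algorithm performance from landscape features. A minimal sketch of that setup follows; the synthetic feature matrix stands in for real ELA features and the exact model settings used in the study are not given, so all values here are illustrative assumptions.

```python
# Sketch: random-forest regression for algorithm performance prediction.
# X stands in for ELA feature vectors; y for measured performance values.
# Synthetic data and hyper-parameters are illustrative, not the study's.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))                   # stand-in ELA features
y = X[:, 0] ** 2 + 0.1 * rng.normal(size=200)   # stand-in performance

model = RandomForestRegressor(n_estimators=100, random_state=0)
model.fit(X[:150], y[:150])        # train on the first 150 instances
preds = model.predict(X[150:])     # predict performance for the rest
print(preds.shape)
```

In an actual landscape-aware selection pipeline, each row of `X` would be the ELA feature vector of one problem instance and `y` the observed performance of a candidate solver on it.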
“…Here, we are learning Random Forest (RF) regression models where we also considered tuning part of the hyperparameters. Previous studies have already shown that RF can provide promising results dealing with automated performance prediction [17]. For hyperparameter tuning, we employed the grid search methodology, which performs an exhaustive search over a manually selected finite subset of candidate solutions of the hyperparameter space of the algorithm.…”
Section: Regression Model and Its Hyperparameters
confidence: 99%
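The grid-search methodology quoted above can be sketched with scikit-learn's `GridSearchCV`, which exhaustively evaluates every combination in a hand-picked parameter grid via cross-validation. The grid and data below are illustrative assumptions; the cited study's actual search space is not reproduced here.

```python
# Sketch: exhaustive grid search over a small, manually chosen subset of
# the random-forest hyper-parameter space (grid values are assumptions).
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import GridSearchCV

rng = np.random.default_rng(1)
X = rng.normal(size=(120, 4))                    # stand-in feature matrix
y = X.sum(axis=1) + 0.1 * rng.normal(size=120)   # stand-in targets

param_grid = {
    "n_estimators": [50, 100],   # number of trees in the forest
    "max_depth": [None, 5],      # maximum depth of each tree
}
search = GridSearchCV(
    RandomForestRegressor(random_state=0),
    param_grid,
    cv=3,                               # 3-fold cross-validation
    scoring="neg_mean_squared_error",   # lower MSE = better
)
search.fit(X, y)                        # tries all 4 grid combinations
print(search.best_params_)
```

Grid search scales multiplicatively with the number of values per hyper-parameter, which is why studies restrict it to a "manually selected finite subset" of the space, as the excerpt notes.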