2016
DOI: 10.1007/978-3-319-31471-6_9
Parameter Setting for Multicore CMA-ES with Large Populations

Abstract: The goal of this paper is to investigate the overall performance of CMA-ES when dealing with a large number of cores, considering the direct mapping between cores and individuals, and to empirically find the best parameter strategies for a parallel machine. By considering the problem of parameter setting, we empirically determine a new strategy for CMA-ES, and we investigate whether Self-CMA-ES (a self-adaptive variant of CMA-ES) could be a viable alternative to CMA-ES when using parallel computers with a coa…

Cited by 4 publications (3 citation statements)
References 22 publications (60 reference statements)
“…Having said this, metaheuristic algorithms still often need tuning of their hyperparameters to efficiently solve a particular problem given a set of time and computational budget constraints. It is widely understood that the performance of metaheuristic optimisation algorithms, given these constraints, is strongly correlated with the values given to their hyperparameters (Belkhir et al 2016). In an attempt to optimise the performance of these algorithms, the hyperparameters are found by tuning.…”
Section: Hyperparameter Tuning and Metaheuristic Optimiser Selection
confidence: 99%
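The budget-constrained tuning loop this citing passage describes can be illustrated with a small sketch (all names here are hypothetical, not from the cited works): random search over the hyperparameters of a toy simulated-annealing minimizer, where the tuning budget caps the number of configurations tried.

```python
import math
import random

def simulated_annealing(f, x0, t0, cooling, steps, rng):
    # Toy metaheuristic whose performance depends on its hyperparameters
    # t0 (initial temperature) and cooling (geometric cooling rate).
    x, fx = x0, f(x0)
    t = t0
    for _ in range(steps):
        cand = x + rng.gauss(0, 1)
        fc = f(cand)
        # Accept improvements always, worsenings with Metropolis probability.
        if fc < fx or rng.random() < math.exp(-(fc - fx) / max(t, 1e-12)):
            x, fx = cand, fc
        t *= cooling
    return fx

def tune(f, budget=30, seed=1):
    # Random-search tuner under a fixed budget of full metaheuristic runs.
    rng = random.Random(seed)
    best_cfg, best_score = None, float("inf")
    for _ in range(budget):
        cfg = {"t0": 10 ** rng.uniform(-1, 2),   # log-uniform temperature
               "cooling": rng.uniform(0.8, 0.999)}
        score = simulated_annealing(f, 5.0, cfg["t0"], cfg["cooling"], 200, rng)
        if score < best_score:
            best_cfg, best_score = cfg, score
    return best_cfg, best_score

quartic = lambda x: (x - 1.0) ** 4
cfg, score = tune(quartic)
```

The point of the sketch is the shape of the problem, not the tuner: the outer loop spends a fixed evaluation budget, and the resulting performance varies strongly with the sampled hyperparameters, which is the correlation the citing authors attribute to Belkhir et al (2016).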
“…Such choices may be perfectly fine for the paper, as it may not have been in its scope to do so. However, it has often been shown that tuning can be tremendously beneficial for some metaheuristics (Belkhir et al 2016, Pedersen 2010a, Tanabe and Fukunaga 2015), particularly when computational cost limitations exist, as they do in most real-world applications.…”
Section: Hyperparameter Tuning and Metaheuristic Optimiser Selection
confidence: 99%