Proceedings of the 15th Annual Conference Companion on Genetic and Evolutionary Computation 2013
DOI: 10.1145/2464576.2482692
Benchmarking a hybrid multi level single linkage algorithm on the BBOB noiseless testbed

Abstract: Multi Level Single Linkage (MLSL) is a well-known stochastic global optimization method. In this paper, a new hybrid variant (HMLSL) of the MLSL algorithm is presented. The most important improvements are related to the sampling phase: the sample is generated from a Sobol quasi-random sequence, and a few percent of the population is further improved by using crossover and mutation operators, as in a traditional differential evolution (DE) method. The aim of this study is to evaluate the performance of the new H…
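The abstract describes the two sampling-phase changes concretely enough to sketch. Below is a minimal illustration, assuming SciPy's quasi-Monte Carlo module: draw the initial sample from a Sobol sequence, then refine a small fraction of it with DE-style mutation and crossover. All names and parameter values (pop_size, refine_frac, F, CR) are illustrative, not taken from the paper.

```python
import numpy as np
from scipy.stats import qmc

def hmlsl_sample(f, lower, upper, pop_size=128, refine_frac=0.05,
                 F=0.5, CR=0.9, seed=0):
    """Sketch of the HMLSL sampling phase: Sobol sample + DE refinement."""
    rng = np.random.default_rng(seed)
    lower, upper = np.asarray(lower, float), np.asarray(upper, float)
    dim = lower.size

    # Quasi-random initial sample from a (scrambled) Sobol sequence,
    # scaled to the search box.
    sobol = qmc.Sobol(d=dim, scramble=True, seed=seed)
    pop = qmc.scale(sobol.random(pop_size), lower, upper)
    fit = np.array([f(x) for x in pop])

    # Improve a few percent of the population with DE/rand/1/bin steps.
    n_refine = max(1, int(refine_frac * pop_size))
    for i in rng.choice(pop_size, size=n_refine, replace=False):
        r1, r2, r3 = rng.choice(pop_size, size=3, replace=False)
        mutant = pop[r1] + F * (pop[r2] - pop[r3])    # DE mutation
        mask = rng.random(dim) < CR                   # binomial crossover
        mask[rng.integers(dim)] = True                # keep at least one gene
        trial = np.clip(np.where(mask, mutant, pop[i]), lower, upper)
        f_trial = f(trial)
        if f_trial < fit[i]:                          # greedy selection
            pop[i], fit[i] = trial, f_trial
    return pop, fit
```

In MLSL proper, this sample would then seed local searches from the most promising, well-separated points; the sketch covers only the modified sampling step the abstract describes.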

Cited by 8 publications (4 citation statements)
References 10 publications (7 reference statements)
“…The raw regression data has been collected using a Python implementation, while the personalized approach and all the evaluations have been performed using R. Our fixed-budget regression is inspired by [21], but applied here to the more diverse set of algorithms suggested in [24]. Concretely, we aim at predicting the performance of the following 12 algorithms: BrentSTEPqi [36], BrentSTEPrr [36], CMA-ES-CSA [1], HCMA [26], HMLSL [33], IPOP400D [2], MCS [19], MLSL [33], OQNLP [34], fmincon [34], fminunc [34], and BIPOP-CMA-ES [13]. Note here that the latter does not appear in the portfolio analyzed in [24], but it was added since for one algorithm from the original study the raw performance data was missing.…”
Section: Methods
confidence: 99%
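The statement above outlines a fixed-budget regression setup: one model per algorithm, each predicting the solution quality reached after a fixed evaluation budget, so the best-predicted algorithm can be selected per problem. The following is a hedged sketch under that reading, using scikit-learn; the feature matrix X (e.g., exploratory landscape features) and the per-algorithm targets are assumed inputs, and the random-forest model choice is illustrative rather than what the cited works used.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

ALGOS = ["BrentSTEPqi", "BrentSTEPrr", "CMA-ES-CSA", "HCMA", "HMLSL",
         "IPOP400D", "MCS", "MLSL", "OQNLP", "fmincon", "fminunc",
         "BIPOP-CMA-ES"]

def fit_fixed_budget_models(X, y_by_algo, seed=0):
    """Train one regressor per algorithm.

    X         : (n_problems, n_features) array of problem features.
    y_by_algo : dict mapping algorithm name -> (n_problems,) array of the
                (log-)error reached at the fixed budget (lower is better).
    """
    return {name: RandomForestRegressor(n_estimators=200, random_state=seed)
                  .fit(X, y_by_algo[name])
            for name in ALGOS}

def pick_algorithm(models, x_new):
    # Select the algorithm with the best (lowest) predicted fixed-budget error.
    preds = {name: float(m.predict(np.atleast_2d(x_new))[0])
             for name, m in models.items()}
    return min(preds, key=preds.get)
```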
“…While some algorithms occur in both Figures 6 and 7, many are included only once, indicating that they are relatively good choices for one part of the search, but not the remainder. The clearest example of this is HMLSL [30], which performs very well as A1, but has relatively high I2-values. This is caused by the fact that this algorithm typically converges quickly to a value close to the optimum, but has issues in the final exploitation phase, thus only being beneficial to use at the start of the search.…”
Section: Selected Algorithm Combinations
confidence: 99%
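The observation that HMLSL converges quickly but stalls in the final exploitation phase is what makes it attractive as the first algorithm A1 in a dynamic (switch-once) selection scheme. Here is a minimal sketch of that two-phase idea; run_hmlsl and run_exploiter are hypothetical wrappers standing in for the actual solver interfaces, and the 50/50 budget split is an arbitrary placeholder.

```python
def dynamic_two_phase(f, x0, budget, run_hmlsl, run_exploiter,
                      switch_frac=0.5):
    """Run A1 (fast early progress), then A2 (final exploitation)."""
    split = int(switch_frac * budget)
    # Phase 1: HMLSL-style global search gets close to the optimum quickly.
    x_best, _ = run_hmlsl(f, x0, max_evals=split)
    # Phase 2: warm-start a strong local exploiter from A1's best point.
    return run_exploiter(f, x_best, max_evals=budget - split)
```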
“…To show that dynamic algorithm selection is also applicable to smaller portfolios, we limit ourselves to 5 algorithms. These are representative of some widely used algorithm families: Nelder-Doerr [8], DE-Auto [40], Bipop-aCMA-Step [22], HMLSL [30] and PSO-BFGS [41]. With this reduced algorithm portfolio, we can study the improvements over their respective VBS static in more detail, and find interesting algorithm combinations to explore further.…”
Section: Small Portfolio: Case Study
confidence: 99%
“…The algorithm portfolio we chose for this work was suggested in [33] for its diversity. It consists of the following 12 algorithms: BrentSTEPqi [48], BrentSTEPrr [48], CMA-ES-CSA [1], HCMA [36], HMLSL [45], IPOP400D [2], MCS [23], MLSL [45], OQNLP [46], fmincon [46], fminunc [46], and BIPOP-CMA-ES [16]. Note that, due to the unavailability of the raw performance data for one of the algorithms in the original study, the BIPOP-CMA-ES was added instead of the missing one.…”
Section: Performance Regression, 2.1 Experimental Setup
confidence: 99%