Proceedings of the 15th Annual Conference Companion on Genetic and Evolutionary Computation 2013
DOI: 10.1145/2464576.2482693
Comparison of multistart global optimization algorithms on the BBOB noiseless testbed

Abstract: Multi Level Single Linkage (MLSL) is a multistart, stochastic global optimization method that relies on random sampling and local search. In this paper, we benchmark three variants of the MLSL algorithm, using two gradient-based local search methods and one derivative-free method, on the noiseless function testbed. The three methods are also compared with a commercial multistart solver, OQNLP (OptQuest/NLP). Our experiments show that the results can be influenced substantially by the local search procedure applied. …
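The multistart structure described in the abstract can be illustrated with a minimal sketch: sample uniformly, start local searches only from points that have no better sample point within a shrinking critical distance, and keep the best local minimum found. The critical-distance formula, sample size, and stopping rule below are simplified assumptions, and scipy's L-BFGS-B stands in for the local search; this is not the benchmarked implementation.

```python
# Minimal, illustrative sketch of a Multi Level Single Linkage (MLSL) loop.
# The critical distance, sample size, and stopping rule are simplified
# placeholders; scipy's L-BFGS-B is a stand-in local search procedure.
import numpy as np
from scipy.optimize import minimize
from scipy.spatial.distance import cdist

def mlsl(f, bounds, n_sample=50, gamma=2.0, n_iter=20, seed=0):
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds, dtype=float).T
    dim = len(lo)
    pts = np.empty((0, dim))
    vals = np.empty(0)
    started = set()          # indices already used to start a local search
    minima = []
    for k in range(1, n_iter + 1):
        new = rng.uniform(lo, hi, size=(n_sample, dim))
        pts = np.vstack([pts, new])
        vals = np.concatenate([vals, [f(x) for x in new]])
        # simplified critical distance: shrinks as the total sample grows
        n_total = k * n_sample
        r_k = gamma * (np.log(n_total) / n_total) ** (1.0 / dim)
        dists = cdist(pts, pts)
        for i in range(len(pts)):
            if i in started:
                continue
            # skip points that have a strictly better sample point nearby
            if np.any((vals < vals[i]) & (dists[i] < r_k)):
                continue
            res = minimize(f, pts[i], method="L-BFGS-B",
                           bounds=list(zip(lo, hi)))
            minima.append((res.fun, res.x))
            started.add(i)
    return min(minima, key=lambda m: m[0])

# Example: two-dimensional sphere function on [-5, 5]^2
best_val, best_x = mlsl(lambda x: float(np.sum(x ** 2)), [(-5, 5), (-5, 5)])
```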

Cited by 6 publications (8 citation statements)
References 8 publications
“…The origin of most solvers belonging to this category is the multi level single linkage method (MLSL, Pál, 2013; Rinnooy Kan and Timmer, 1987). It is a stochastic, multistart, global optimizer that relies on random sampling and local searches.…”
Section: Multi-level Approaches
confidence: 99%
“…It is a stochastic, multistart, global optimizer that relies on random sampling and local searches. Aside from MLSL itself, some of its variants also belong to our portfolio: an interior-point version for constrained nonlinear problems (fmincon, Pál, 2013), a quasi-Newton version, which approximates the Hessian using BFGS (Broyden, 1970) (fminunc, Pál, 2013), and a hybrid variant whose most important improvements are related to its sampling phase (HMLSL, Pál, 2013). The final optimizer belonging to this group is the multilevel coordinate search (MCS, Huyer and Neumaier, 2009), which splits the search space into smaller boxes, each containing a known observation, and then starts local searches from promising boxes.…”
Section: Multi-level Approaches
confidence: 99%
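As a loose illustration of the point above, that the variants differ mainly in which local search is plugged into the same multistart skeleton, the sketch below maps the choices onto scipy optimizers. The mapping (e.g., Nelder-Mead as the derivative-free stand-in) is an assumption for illustration only, not the MATLAB fmincon/fminunc solvers used in the cited work.

```python
# Sketch: the multistart skeleton stays fixed and only the local search changes.
# The scipy methods below are rough stand-ins chosen for illustration; they are
# not the MATLAB fmincon/fminunc solvers used in the cited benchmark.
from scipy.optimize import minimize

LOCAL_SEARCHES = {
    "interior_point_like": lambda f, x0, bnds: minimize(f, x0, method="trust-constr", bounds=bnds),
    "quasi_newton_bfgs":   lambda f, x0, bnds: minimize(f, x0, method="BFGS"),
    "derivative_free":     lambda f, x0, bnds: minimize(f, x0, method="Nelder-Mead"),
}

def multistart(f, starts, bounds, variant="quasi_newton_bfgs"):
    """Run the chosen local search from every start point, keep the best result."""
    local = LOCAL_SEARCHES[variant]
    results = [local(f, x0, bounds) for x0 in starts]
    return min(results, key=lambda r: r.fun)
```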
“…The raw regression data has been collected using a Python implementation, while the personalized approach and all the evaluations have been performed using R. Our fixed-budget regression is inspired by [21], but applied here to the more diverse set of algorithms suggested in [24]. Concretely, we aim at predicting the performance of the following 12 algorithms: BrentSTEPqi [36], BrentSTEPrr [36], CMA-ES-CSA [1], HCMA [26], HMLSL [33], IPOP400D [2], MCS [19], MLSL [33], OQNLP [34], fmincon [34], fminunc [34], and BIPOP-CMA-ES [13]. Note here that the latter does not appear in the portfolio analyzed in [24], but it was added since for one algorithm from the original study the raw performance data was missing.…”
Section: Methods
confidence: 99%
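The fixed-budget regression described in this citation can be sketched as a standard supervised-learning setup: per-problem features predict the objective value each portfolio algorithm reaches within a fixed evaluation budget. The function name, feature layout, and random-forest model below are assumptions for illustration, not the cited papers' actual pipeline; only the algorithm names are taken from the quotation.

```python
# Illustrative sketch of fixed-budget performance regression: one regressor
# per portfolio algorithm, trained on problem features. The random-forest
# choice and data layout are assumptions, not the cited papers' pipeline.
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score

ALGORITHMS = ["BrentSTEPqi", "BrentSTEPrr", "CMA-ES-CSA", "HCMA", "HMLSL",
              "IPOP400D", "MCS", "MLSL", "OQNLP", "fmincon", "fminunc",
              "BIPOP-CMA-ES"]

def fit_regressors(features, targets):
    """features: (n_instances, n_features) array of problem features.
    targets: dict mapping algorithm name -> (n_instances,) best value
    reached within the fixed budget. Returns one fitted model per algorithm."""
    models = {}
    for name in ALGORITHMS:
        model = RandomForestRegressor(n_estimators=200, random_state=0)
        score = cross_val_score(model, features, targets[name], cv=5,
                                scoring="neg_mean_absolute_error").mean()
        models[name] = model.fit(features, targets[name])
        print(f"{name}: CV MAE = {-score:.3g}")
    return models
```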
“…The algorithm portfolio we chose for this work was suggested in [33] for its diversity. It consists of the following 12 algorithms: BrentSTEPqi [48], BrentSTEPrr [48], CMA-ES-CSA [1], HCMA [36], HMLSL [45], IPOP400D [2], MCS [23], MLSL [45], OQNLP [46], fmincon [46], fminunc [46], and BIPOP-CMA-ES [16]. Note that, due to the unavailability of the raw performance data for one of the algorithms in the original study, BIPOP-CMA-ES was added in place of the missing one.…”
Section: Performance Regression, 2.1 Experimental Setup
confidence: 99%