2012
DOI: 10.1007/s10898-012-9951-y
Derivative-free optimization: a review of algorithms and comparison of software implementations

Abstract: This paper addresses the solution of bound-constrained optimization problems using algorithms that require only the availability of objective function values but no derivative information. We refer to these algorithms as derivative-free algorithms. Fueled by a growing number of applications in science and engineering, the development of derivative-free optimization algorithms has long been studied, and it has found renewed interest in recent times. Along with many derivative-free algorithms, many software implem…

Cited by 1,038 publications (589 citation statements) · References 104 publications
“…Evolution Strategies are heuristic methods designed for the solution of global optimization problems (with continuous variables) that have performed well in terms of the quality of the final point computed (see [2,3,22,39]). However, like any other method for global optimization, ESs suffer from the curse of dimensionality: their performance is satisfactory on low-dimensional problems but deteriorates as the dimensionality of the search space increases [29].…”
Section: Search Space Reduction
confidence: 99%
“…CMA-ES [23] (where CMA stands for Covariance Matrix Adaptation) is regarded as one of the best algorithms in the class of (µ, λ)-ES in terms of numerical performance [2,3,22,39]. More precisely, CMA-ES belongs to the ES family denoted by (µ/µ_W, λ)-ES, where the subscript 'W' indicates the use of 'recombination' via weights.…”
Section: Evolution Strategies and CMA-ES
confidence: 99%
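The weighted-recombination scheme in the quotation above can be sketched in a few lines. This is a minimal, illustrative (µ/µ_W, λ)-ES with log-rank recombination weights and a crude geometric step-size decay; the function name `weighted_recombination_es` and all parameter defaults are our own choices, and a real CMA-ES additionally adapts a full covariance matrix and the step size along evolution paths.

```python
import math
import random

def weighted_recombination_es(f, x0, sigma=0.5, mu=5, lam=20, iters=200, seed=0):
    """Minimal (mu/mu_W, lambda)-ES sketch: sample lam offspring around the
    current mean, keep the mu best, and recombine them with log-rank weights.
    Illustrative only -- CMA-ES additionally adapts a full covariance matrix."""
    rng = random.Random(seed)
    n = len(x0)
    # Positive, decreasing log-rank weights, normalized to sum to one.
    raw = [math.log(mu + 0.5) - math.log(i + 1) for i in range(mu)]
    total = sum(raw)
    w = [r / total for r in raw]
    mean = list(x0)
    for _ in range(iters):
        # Sample lam offspring from an isotropic Gaussian around the mean,
        # then rank them by objective value (smaller is better).
        offspring = sorted(
            ([m + sigma * rng.gauss(0.0, 1.0) for m in mean] for _ in range(lam)),
            key=f,
        )
        best = offspring[:mu]
        # Weighted recombination: the new mean is a convex combination of the
        # mu best offspring (the 'W' in the (mu/mu_W, lambda) notation).
        mean = [sum(w[i] * best[i][j] for i in range(mu)) for j in range(n)]
        sigma *= 0.98  # crude geometric decay; CMA-ES adapts sigma via paths
    return mean, f(mean)

def sphere(x):
    return sum(v * v for v in x)

mean, fmean = weighted_recombination_es(sphere, [2.0, -1.5])
```

On the 2-D sphere function the mean contracts toward the origin as the step size decays, illustrating selection plus recombination without any derivative information.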
“…More recently, significant advances in mathematical analysis, computer power, automatic differentiation, and global optimization theory and algorithms have enabled the optimization of complex and large nonlinear problems with theoretical guarantees. However, there is still high interest in CDFO methods because they are suitable for problems that deterministic global optimization methods are unable to handle due to lack of information, noise, non-smoothness, and discontinuities (Kolda et al, 2003; Conn et al, 2009b; Rios and Sahinidis, 2013; Martelli and Amaldi, 2014). In the first textbook on Derivative-Free Optimization, Conn et al (2009b) recognize that optimization without derivatives is one of the most challenging open problems in science and engineering, with a vast number of potential practical applications.…”
Section: Introduction
confidence: 99%
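As an illustration of the kind of method these surveys cover, the following is a minimal compass (coordinate) search, a member of the pattern-search family surveyed by Kolda et al. (2003): it polls the 2n axis-aligned points around the incumbent, moves to any improving point, and halves the step when no poll point improves. The function and parameter names here are our own; this is a sketch of the idea, not any particular implementation.

```python
def compass_search(f, x0, step=1.0, tol=1e-6, max_iter=10000):
    """Compass (coordinate) search: poll the 2n axis points around the
    incumbent; move to any improving point, otherwise halve the step.
    Uses only function values -- no derivative information."""
    x = list(x0)
    fx = f(x)
    for _ in range(max_iter):
        if step <= tol:
            break  # poll radius refined below tolerance: stop
        improved = False
        for i in range(len(x)):
            for d in (step, -step):
                y = list(x)
                y[i] += d
                fy = f(y)
                if fy < fx:        # accept the first improving poll point
                    x, fx = y, fy
                    improved = True
                    break
            if improved:
                break
        if not improved:
            step *= 0.5            # shrink the poll radius and try again
    return x, fx

def quadratic(x):
    return (x[0] - 1.0) ** 2 + (x[1] + 2.0) ** 2

xmin, fmin = compass_search(quadratic, [0.0, 0.0])
```

Because the method never differentiates f, it tolerates noise and mild non-smoothness, which is precisely why such CDFO methods remain of interest despite progress in derivative-based global optimization.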
“…Furthermore, the global minimal value is usually unknown. The derivative may not exist or may be unavailable (for instance, in so-called "black box" problems, usually all one has is the ability to compute the value f(x) at a given state x ∈ A, and this computation often requires much effort), and hence many methods belong to the class of derivative-free algorithms [27]. Because such a method uses only limited information on f, its convergence may have very undesirable properties, owing to the following issue: the closer to the optimum, the harder it is to generate a "better" (in the sense of the cost function) state.…”
Section: Introduction
confidence: 99%
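The black-box setting described in the last quotation — only values f(x) are observable, and each evaluation may be expensive — can be made concrete with an oracle wrapper that counts evaluations against a budget; pure random search then serves as the simplest derivative-free baseline. The names `BlackBox` and `random_search` are illustrative, not taken from any cited work.

```python
import random

def sphere(x):
    return sum(v * v for v in x)

class BlackBox:
    """Oracle for the black-box setting: callers see only the value f(x),
    never derivatives, and every (possibly expensive) evaluation is counted."""
    def __init__(self, f, budget):
        self.f, self.budget, self.calls = f, budget, 0
    def __call__(self, x):
        if self.calls >= self.budget:
            raise RuntimeError("evaluation budget exhausted")
        self.calls += 1
        return self.f(x)

def random_search(oracle, lo, hi, dim, n_evals, seed=0):
    """Simplest derivative-free baseline: sample points uniformly in the box
    [lo, hi]^dim and keep the best value seen."""
    rng = random.Random(seed)
    best_x, best_f = None, float("inf")
    for _ in range(n_evals):
        x = [rng.uniform(lo, hi) for _ in range(dim)]
        fx = oracle(x)
        if fx < best_f:
            best_x, best_f = x, fx
    return best_x, best_f

oracle = BlackBox(sphere, budget=50)
best_x, best_f = random_search(oracle, -1.0, 1.0, dim=2, n_evals=50)
```

The wrapper makes the quotation's point tangible: the optimizer's only interface to the problem is repeated, budgeted calls returning f(x), which is exactly the information model under which derivative-free algorithms are designed.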