Mathematics Without Boundaries 2014
DOI: 10.1007/978-1-4939-1124-0_2

A Survey on Direct Search Methods for Blackbox Optimization and Their Applications

Cited by 62 publications (51 citation statements)
References 109 publications
“…The NOMAD implementation [15] of the MADS algorithm [16] for nonsmooth constrained optimization was used to search for the optimal study design. MADS is designed for blackbox optimization problems [17] in which the cost function and constraints are evaluated by a time-consuming simulation code. The optimization problem had four bound-constrained variables, three of them being continuous, and the fourth one (n_1) being discrete.…”
Section: Optimal Design Search Algorithm
confidence: 99%
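The excerpt above describes the typical direct-search setting: the objective and constraints are available only through a costly simulation, so the optimizer proceeds by sampling trial points rather than by using derivatives. The sketch below is a minimal bound-constrained coordinate search in Python meant only to illustrate that idea; it is not the MADS algorithm or the NOMAD interface, and the blackbox function, bounds, and starting point are hypothetical placeholders.

```python
import numpy as np

def expensive_simulation(x):
    # Hypothetical stand-in for a time-consuming simulation code.
    return (x[0] - 1.0) ** 2 + 3.0 * (x[1] + 0.5) ** 2 + abs(x[2])

def coordinate_search(blackbox, x0, lower, upper, step=0.5, tol=1e-3, max_eval=200):
    # Basic direct search: poll along +/- coordinate directions and shrink
    # the step when no polled point improves the incumbent. No derivatives.
    x = np.clip(np.asarray(x0, dtype=float), lower, upper)
    f_best = blackbox(x)
    n_eval = 1
    while step > tol and n_eval < max_eval:
        improved = False
        for i in range(len(x)):
            for sign in (1.0, -1.0):
                trial = x.copy()
                trial[i] = np.clip(trial[i] + sign * step, lower[i], upper[i])
                f_trial = blackbox(trial)
                n_eval += 1
                if f_trial < f_best:
                    x, f_best, improved = trial, f_trial, True
        if not improved:
            step *= 0.5  # refine the mesh after a failed poll
    return x, f_best

lower = np.array([-5.0, -5.0, -5.0])
upper = np.array([5.0, 5.0, 5.0])
x_best, f_best = coordinate_search(expensive_simulation, [0.0, 0.0, 2.0], lower, upper)
print(x_best, f_best)
```

In MADS the poll directions form an asymptotically dense set rather than the fixed coordinate directions used here, and NOMAD also handles discrete variables such as n_1; the sketch only conveys the evaluate, poll, and refine loop common to direct search methods.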
“…The observables may appear in the cost function or in some of the constraints. Unlike in derivative-free optimization (see [1,12]), in EML the H function is not treated as a black-box. On the contrary, there is an emphasis on exploiting the structure of the Machine Learning model to boost the search process.…”
Section: Introduction
confidence: 99%
“…The new propagator can be instantiated multiple times to model networks with more than two layers, or it can be combined with Neuron Constraints to encode almost any type of ANN, including recurrent networks. The bounds for the new propagator are based on a non-linear Lagrangian relaxation, with the multipliers being optimized via a subgradient method. Lagrangian relaxation has proved to be a powerful technique for performing reduced-cost filtering in CP, mainly in the context of linear relaxations of integer programs (e.g., [32][33][34]).…”
Section: Introduction
confidence: 99%
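The excerpt above derives bounds from a Lagrangian relaxation whose multipliers are tuned by a subgradient method. The sketch below shows only the generic projected-subgradient update for the multiplier of a relaxed inequality constraint, applied to a tiny convex toy problem; the problem data and step-size rule are hypothetical and are not taken from the cited paper or its propagator.

```python
# Projected subgradient ascent on a Lagrangian dual, shown on a toy convex
# problem (hypothetical, not from the cited paper):
#   minimize (x - 3)^2   subject to  x <= 1,  i.e.  g(x) = x - 1 <= 0.
# Relaxing g with a multiplier lam >= 0 gives L(x, lam) = (x - 3)^2 + lam*(x - 1).

def inner_minimizer(lam):
    # argmin_x L(x, lam); closed form because L is quadratic in x.
    return 3.0 - lam / 2.0

lam = 0.0
for k in range(1, 201):
    x_star = inner_minimizer(lam)
    dual_value = (x_star - 3.0) ** 2 + lam * (x_star - 1.0)  # lower bound on f*
    subgrad = x_star - 1.0         # g(x_star) is a subgradient of the dual at lam
    lam = max(0.0, lam + (1.0 / k) * subgrad)  # diminishing step, projected onto lam >= 0

print(f"multiplier ~ {lam:.2f}, dual bound ~ {dual_value:.2f}")
# As lam approaches its optimal value 4, the bound tightens toward the
# constrained optimum f(1) = 4.
```

The dual value computed at each iterate is a valid lower bound on the constrained optimum; in the reduced-cost filtering context the excerpt refers to, analogous dual information is what drives the pruning.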
“…Blackbox optimization is used when the objective functions or the constraints of a problem can only be calculated through a computer code, as is the case in this problem. Blackbox optimization methods have been applied successfully to many engineering problems (Audet, 2014). In the field of hydrology, blackbox optimization has been used to find the optimal locations for GMONs (Alarie et al., 2013), which are devices used to measure snow water equivalent in remote areas of watersheds.…”
Section: Introduction
confidence: 99%