2014
DOI: 10.1137/120895056

Reducing the Number of Function Evaluations in Mesh Adaptive Direct Search Algorithms

Abstract: The Mesh Adaptive Direct Search (MADS) class of algorithms is designed for nonsmooth optimization, where the objective function and constraints are typically computed by launching a time-consuming computer simulation. Each iteration of a MADS algorithm attempts to improve the current best-known solution by launching the simulation at a finite number of trial points. Common implementations of MADS generate 2n trial points at each iteration, where n is the number of variables in the optimization problem. The obj…
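As context for the 2n-point polling scheme mentioned in the abstract, the sketch below shows a single poll step of a simple coordinate direct search that evaluates the objective at the 2n trial points x ± Δe_i. It is a minimal illustration, not the algorithm proposed in the paper; the function names and the cheap test objective are hypothetical.

```python
import numpy as np

def poll_step(f, x, f_x, delta):
    """One poll step of a simple coordinate direct search.

    Evaluates f at the 2n trial points x + delta*e_i and x - delta*e_i,
    where n = len(x). The incumbent value f_x is assumed to be known from
    the previous iteration, so the poll itself costs 2n new evaluations;
    reducing that count is the topic of the paper above.
    """
    n = len(x)
    best_x, best_f, n_evals = x.copy(), f_x, 0
    for i in range(n):
        for sign in (+1.0, -1.0):
            trial = x.copy()
            trial[i] += sign * delta
            f_trial = f(trial)          # one (expensive) simulation call
            n_evals += 1
            if f_trial < best_f:
                best_x, best_f = trial, f_trial
    return best_x, best_f, n_evals

# Hypothetical cheap test objective; in the MADS setting f would be a
# time-consuming, possibly nonsmooth simulation.
f = lambda x: (x[0] - 1.0) ** 2 + 3.0 * abs(x[1])
x0 = np.array([0.0, 0.5])
x1, f1, used = poll_step(f, x0, f(x0), delta=0.25)
print(x1, f1, used)                      # used == 2 * n == 4
```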

Cited by 47 publications (34 citation statements)
References 31 publications
“…A recent extension of the BOBYQA algorithm for mixed variable programming has been published which uses quadratic approximations and a local integer search, with guaranteed identification of locally optimal points (Newby and Ali, 2014). Quadratic models are also used to represent inequality constraints in conjunction with mesh adaptive direct search (Audet et al., 2014), in order to expedite local convergence with fewer samples. A representative collection of the interpolating and non-interpolating surrogate models used by model-based methods is reported in Table 9.…”
Section: Model-based Methods (mentioning)
confidence: 99%
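To make the quadratic-model idea in the excerpt above concrete, here is a minimal sketch of fitting a quadratic surrogate q(x) = c + gᵀx + ½xᵀHx to sampled values of a single inequality constraint by least squares. It is a generic model fit under assumed data, not the specific model management of the cited MADS variant; the function name and the example constraint are hypothetical.

```python
import numpy as np

def fit_quadratic_surrogate(X, y):
    """Least-squares fit of q(x) = c + g.x + 0.5 x'Hx to samples (X, y).

    X : (m, n) array of sample points, y : (m,) array of observed values
    (e.g. values of one inequality constraint). Returns (c, g, H).
    """
    m, n = X.shape
    cols = [np.ones(m)]                      # constant term
    cols += [X[:, i] for i in range(n)]      # linear terms
    for i in range(n):                       # quadratic terms x_i * x_j, i <= j
        for j in range(i, n):
            cols.append(X[:, i] * X[:, j])
    A = np.column_stack(cols)
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)

    c = coef[0]
    g = coef[1:1 + n]
    H = np.zeros((n, n))
    k = 1 + n
    for i in range(n):
        for j in range(i, n):
            if i == j:
                H[i, i] = 2.0 * coef[k]      # column x_i^2 carries 0.5 * H[i,i]
            else:
                H[i, j] = H[j, i] = coef[k]  # column x_i * x_j carries H[i,j]
            k += 1
    return c, g, H

# Example: model one constraint from 8 random samples in R^2 (hypothetical data).
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(8, 2))
y = X[:, 0] ** 2 + 0.5 * X[:, 1] - 0.3           # true constraint, for illustration only
c, g, H = fit_quadratic_surrogate(X, y)
x_new = np.array([0.2, -0.1])
q_val = c + g @ x_new + 0.5 * x_new @ H @ x_new  # surrogate prediction at a trial point
```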
“…where Ψ represents the total number of constraints of the power allocation problem, which in our work is Ψ = 2F + M_f(1 + N_F); 0 < ξ ≪ 1 is the stopping criterion, t is the initial accuracy, and µ is used to update the accuracy of the interior-point method [30]. Moreover, the complexity of the subcarrier allocation problem is related to its number of variables and constraints, since the NOMAD solver is used; the total number of variables equals M_f N_F, and the number of constraints of the subcarrier allocation problem is given in [29].…”
Section: Complexity of the Solution Algorithm (mentioning)
confidence: 99%
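As a quick arithmetic illustration of the counts quoted above (Ψ = 2F + M_f(1 + N_F) constraints and M_f N_F variables), here is a sketch with hypothetical problem sizes; the meanings of F, M_f, and N_F are those defined in the citing work.

```python
# Hypothetical problem sizes, chosen only to illustrate the formulas quoted above.
F, M_f, N_F = 4, 3, 8

num_power_constraints = 2 * F + M_f * (1 + N_F)   # Ψ = 2F + M_f(1 + N_F) = 35
num_subcarrier_vars = M_f * N_F                   # variables seen by the NOMAD solver = 24

print(num_power_constraints, num_subcarrier_vars)
```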
“…and show how to construct such a positive basis, motivated by an optimization problem occurring in molecular geometry. An application that uses positive bases and the cosine measure to improve the performance of a direct search algorithm is shown in [4].…”
Section: Introduction (mentioning)
confidence: 99%
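The cosine measure mentioned in this excerpt, cm(D) = min over unit vectors v of max over d in D of vᵀd/‖d‖, can be estimated numerically. The sketch below (hypothetical helper name) samples unit vectors to estimate cm(D) for the positive basis D = {±e_1, …, ±e_n} and can be checked against the known value 1/√n for that particular basis; sampling only yields an upper bound on the true minimum, so this is a rough estimate, not an exact computation.

```python
import numpy as np

def cosine_measure_estimate(D, n_samples=100_000, seed=0):
    """Monte Carlo estimate of the cosine measure of a set of directions D.

    cm(D) = min over unit vectors v of max over d in D of (v.d) / (|v| |d|).
    Sampling v over a finite set gives an upper bound on the true minimum.
    """
    rng = np.random.default_rng(seed)
    D = np.asarray(D, dtype=float)
    D_unit = D / np.linalg.norm(D, axis=1, keepdims=True)
    n = D.shape[1]
    V = rng.normal(size=(n_samples, n))
    V /= np.linalg.norm(V, axis=1, keepdims=True)
    best_per_v = (V @ D_unit.T).max(axis=1)   # best cosine achieved by D for each sampled v
    return best_per_v.min()

n = 3
D = np.vstack([np.eye(n), -np.eye(n)])        # positive basis {±e_1, ..., ±e_n}
print(cosine_measure_estimate(D))             # roughly 1/sqrt(3) ≈ 0.577
```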