Published 2015 · DOI: 10.1007/s10589-015-9753-5
Mesh adaptive direct search with second directional derivative-based Hessian update

Cited by 6 publications (23 citation statements)
References 23 publications
“…It was commonplace for Nomad to use thousands of seconds of CPU time, with 35 instances reaching the imposed limit of 60 minutes. This is not unexpected, as the slower convergence speed of the MADS algorithm has been observed in previous literature [13,57,63]. The number of function calls is more comparable; the average was 57,857 for DFO-VU and 41,895 for Nomad.…”
Section: Benchmark of DFO Solvers (supporting)
confidence: 83%
“…is the number of digits of accuracy achieved by the solver. We also analyze the ability of each solver to capture the (known) exact V-dimension by looking at the cardinality of A(x_found) as in (13), for x_found the final point found by each solver, and computing v_found = |A(x_found)| − 1.…”
Section: Benchmark of Bundle Solvers (mentioning)
confidence: 99%
“…Second-order global convergence is usually less explored in direct search, even though several algorithms attempt to use second-order aspects in their framework [7,14,31]. In fact, proving second-order results for such zeroth-order methods requires strengthening the aforementioned descent requirements, thus raising the question of their practical relevance.…”
Section: Introduction (mentioning)
confidence: 99%
“…Bűrmen, Olenšek and Tuma (2015) propose a variant of MADS with a specialized model-based search step, where the search step minimizes a strongly convex quadratic model of the objective and the constraints are approximated by linear regression models of the constraint functions. Both the search and poll steps are accepted only if they are feasible; this corresponds to the method effectively treating the constraints with an extreme-barrier approach.…”
Section: Methods for Constrained Optimization (mentioning)
confidence: 99%