2021
DOI: 10.48550/arxiv.2102.09762
Preprint

On the Numerical Performance of Derivative-Free Optimization Methods Based on Finite-Difference Approximations

Abstract: The goal of this paper is to investigate an approach for derivative-free optimization that has not received sufficient attention in the literature and is yet one of the simplest to implement and parallelize. It consists of computing gradients of a smoothed approximation of the objective function (and constraints), and employing them within established codes. These gradient approximations are calculated by finite differences, with a differencing interval determined by the noise level in the functions and a boun…
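The approach sketched in the abstract (finite-difference gradient approximations with a differencing interval tied to the noise level, then handed to an established gradient-based code) can be illustrated roughly as follows. This is a minimal sketch, not the paper's implementation: the noise estimate `eps_f` and curvature bound `mu2` are assumed user-supplied, and the interval `h = 2*sqrt(eps_f / mu2)` is the textbook forward-difference choice that balances truncation error against noise; the paper's exact rule may differ.

```python
import numpy as np

def fd_gradient(f, x, eps_f, mu2=1.0):
    """Forward-difference gradient of a noisy function f at the point x.

    eps_f : estimate of the noise level in f (assumed known here).
    mu2   : assumed bound on the second derivative along each coordinate
            (hypothetical parameter for this sketch).
    The interval h = 2*sqrt(eps_f / mu2) balances the O(h) truncation
    error of a forward difference against its O(eps_f / h) noise error.
    """
    h = 2.0 * np.sqrt(eps_f / mu2)
    f0 = f(x)
    g = np.empty_like(x, dtype=float)
    for i in range(x.size):
        e = np.zeros_like(x, dtype=float)
        e[i] = h
        g[i] = (f(x + e) - f0) / h
    return g
```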

Cited by 5 publications (5 citation statements)
References 18 publications
“…The study of nonlinear optimization problems with errors or noise in the function and gradient has attracted attention in recent years, motivated by the use of finite difference approximations to derivatives [19,26,25] and by applications in machine learning; see [15] for a review of some recent work.…”
Section: Literature Review
confidence: 99%
“…That said, one of the strengths of direct-search methods is that they are more readily applicable when an objective function is nonsmooth [2,14] or even discontinuous [31]. Finite-difference approaches approximate derivatives using finite difference schemes, which are then embedded within a gradient-based optimization approach, such as a steepest descent or quasi-Newton method; see, e.g., [30]. Empirical evidence has shown that finite-difference methods can be competitive with model-based methods, at least when one presumes no noise in the function values.…”
Section: ≥0
confidence: 99%
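As a rough illustration of the embedding described in the statement above (a sketch under assumptions, not code from any of the cited works), a finite-difference gradient of this kind can simply be passed as the Jacobian to an off-the-shelf quasi-Newton routine such as SciPy's BFGS; the noisy quadratic test function and the noise level `eps_f` below are hypothetical.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

def noisy_quadratic(x, sigma=1e-6):
    # Hypothetical noisy test objective: a quadratic plus small random noise.
    return 0.5 * float(np.dot(x, x)) + sigma * rng.standard_normal()

def fd_grad(f, x, eps_f=1e-6):
    # Forward-difference gradient with a noise-aware interval h = 2*sqrt(eps_f).
    h = 2.0 * np.sqrt(eps_f)
    f0 = f(x)
    return np.array([(f(x + h * e) - f0) / h for e in np.eye(x.size)])

x0 = np.ones(5)
res = minimize(noisy_quadratic, x0,
               jac=lambda x: fd_grad(noisy_quadratic, x),
               method="BFGS")
print(res.x)  # approaches the origin up to the noise level
```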
“…Several suggested methods have given fair outcomes for computing the gradient vector values numerically. See [63][64][65][66][67].…”
Section: Numerical Differentiation
confidence: 99%