2019
DOI: 10.1137/19m125772x

Trust-Region Methods for the Derivative-Free Optimization of Nonsmooth Black-Box Functions

Abstract: In this paper we study the minimization of a nonsmooth black-box type function, without assuming any access to derivatives or generalized derivatives and without any knowledge about the analytical origin of the function nonsmoothness. Directional methods have been derived for such problems but to our knowledge no model-based method like a trust-region one has yet been proposed. Our main contribution is thus the derivation of derivative-free trust-region methods for black-box type functions. We propose a trust-…

Cited by 20 publications (11 citation statements). References 34 publications.
“…Indeed, in successful iterations a better control on the amplitude of the step must be exercised in order to prevent errors from accumulating too much. This allows us to prove convergence of the series of squared trust-region radii (the analogue of Lemma 2.1 in [15]), which yields convergence to zero of the trust-region radius. Using this property, we manage to show that the proposed modification of the original Basic DFO-TRNS algorithm enjoys almost sure convergence to Clarke stationary points.…”
Section: Trust-Region Methods for Stochastic Non-smooth Functions
confidence: 88%
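A compact restatement of the property invoked above, writing Delta_k for the trust-region radius at iteration k (this notation is an assumption made here for illustration, not taken from the quoted paper): in LaTeX,

\[ \sum_{k=0}^{\infty} \Delta_k^{2} < \infty \;\Longrightarrow\; \lim_{k \to \infty} \Delta_k = 0, \]

since the terms of a convergent series necessarily tend to zero.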
“…After having analyzed a simple stochastic direct-search method, we focus on a stochastic version of the Basic DFO-TRNS presented in [15], and analyze its convergence properties under tail-bound probabilistic conditions like the ones used in Section 3. The main difference between this new version and the original Basic DFO-TRNS algorithm is the presence of an upper bound on the trust-region radius.…”
Section: Trust-Region Methods for Stochastic Non-smooth Functions
confidence: 99%
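The quoted modification hinges on capping the radius from above when an iteration succeeds. The following is a minimal illustrative sketch of such a capped update rule, not the cited authors' algorithm; the names update_radius, gamma_inc, gamma_dec, and delta_max are assumptions made here for illustration.

# Minimal sketch of a trust-region radius update with an explicit upper bound.
# Not the cited authors' code; parameter names are illustrative assumptions.

def update_radius(delta: float, success: bool,
                  gamma_inc: float = 2.0, gamma_dec: float = 0.5,
                  delta_max: float = 1.0) -> float:
    """Return the next trust-region radius.

    On a successful iteration the radius is expanded but capped at delta_max,
    so it cannot grow without bound; on an unsuccessful iteration it shrinks.
    """
    if success:
        return min(gamma_inc * delta, delta_max)
    return gamma_dec * delta


# Example: repeated successes never push the radius beyond delta_max.
radius = 0.3
for successful in (True, True, False, True):
    radius = update_radius(radius, successful)
    print(radius)  # 0.6, 1.0, 0.5, 1.0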
“…See the book [1] for an excellent treatment of this subject. Bundle methods, as well as trust-region bundle methods, have also been considered in the derivative-free setting; see, for instance, [31,42].…”
Section: Literature Review
confidence: 99%
“…DFBBO relies only on a proximal analysis of candidate solutions close to the current incumbent, and requires neither existence nor values of gradients [6,15]. In this paper we focus on the Mads algorithm [5] implemented in the Nomad solver [2], but other techniques exist (see, e.g., [3,8,16,24]). Mads can be applied to a full discretization of Problem (P_Motiv), even if the discontinuities of the original problem are preserved.…”
Section: Introduction (Motivation)
confidence: 99%