Numerical Nonsmooth Optimization 2020
DOI: 10.1007/978-3-030-34910-3_6

Gradient Sampling Methods for Nonsmooth Optimization

Abstract: This paper reviews the gradient sampling methodology for solving nonsmooth, nonconvex optimization problems. An intuitively straightforward gradient sampling algorithm is stated and its convergence properties are summarized. Throughout this discussion, we emphasize the simplicity of gradient sampling as an extension of the steepest descent method for minimizing smooth objectives. We then provide overviews of various enhancements that have been proposed to improve practical performance, as well as of several ex…
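For intuition, here is a minimal Python sketch of the iteration the abstract outlines: gradients are sampled in an eps-ball around the iterate, the minimum-norm element of their convex hull supplies a steepest-descent-like direction, and a backtracking line search sets the step. All names, tolerances, and the box sampling (a stand-in for the eps-ball) are illustrative choices, not the authors' reference implementation.

```python
# A minimal sketch of one gradient sampling iteration, assuming f is
# differentiable almost everywhere and grad(x) returns a gradient
# wherever one is defined.
import numpy as np
from scipy.optimize import minimize

def min_norm_element(G):
    """Minimum-norm point in the convex hull of the rows of G, via the
    QP  min_w ||G^T w||^2  s.t.  w >= 0,  sum(w) = 1."""
    m = G.shape[0]
    res = minimize(
        lambda w: (G.T @ w) @ (G.T @ w),
        np.full(m, 1.0 / m),
        jac=lambda w: 2.0 * (G @ (G.T @ w)),
        bounds=[(0.0, None)] * m,
        constraints=[{"type": "eq", "fun": lambda w: w.sum() - 1.0}],
        method="SLSQP",
    )
    return G.T @ res.x

def gs_step(f, grad, x, eps=0.1, beta=1e-4, tol=1e-8, rng=None):
    """One gradient sampling step: sample, aggregate, backtrack."""
    rng = rng or np.random.default_rng()
    n = x.size
    m = 2 * n + 1                    # theory asks for at least n + 1 samples
    pts = x + eps * rng.uniform(-1.0, 1.0, size=(m, n))
    G = np.vstack([grad(x)[None, :], np.array([grad(p) for p in pts])])
    g = min_norm_element(G)          # approximate min-norm element of the
    if np.linalg.norm(g) <= tol:     # eps-subdifferential; a small norm
        return x, True               # signals approximate stationarity
    d, t = -g, 1.0                   # steepest-descent-like direction
    while f(x + t * d) > f(x) - beta * t * (g @ g):  # Armijo backtracking
        t *= 0.5
        if t < 1e-12:
            break
    return x + t * d, False
```

In a full method, eps itself is driven toward zero as the minimum-norm element shrinks; that coupling is what the convergence theory summarized in the paper rests on.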

Cited by 61 publications (46 citation statements) | References 54 publications
“…Gradient sampling methods are a developing class of algorithms for general non-smooth non-convex optimization; see the recent survey by Burke et al. (2018). These methods attempt to estimate the ε-subdifferential at a point x by evaluating a random sample of gradients in the neighbourhood of x and constructing the convex hull of these gradients.…”
Section: Deterministic Methods For Deterministic Objectives (mentioning)
confidence: 99%
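As a concrete toy instance of this construction (illustrative code, not from the cited works): for f(x) = |x| in one dimension, gradients sampled near 0 are ±1, so their convex hull recovers the subdifferential [-1, 1] at the kink.

```python
import numpy as np

rng = np.random.default_rng(0)
x, eps = 0.0, 0.1
samples = x + eps * rng.uniform(-1.0, 1.0, size=20)
grads = np.sign(samples)         # f'(y) = sign(y) wherever y != 0
print(grads.min(), grads.max())  # almost surely -1.0 1.0: in 1-D the convex
                                 # hull of the sampled gradients is [-1, 1]
```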
“…Central in nonsmooth optimisation are bundle methods, where a subgradient [19] is required at each iterate to construct a linear approximation to the objective function; see [43] for an introduction. A close alternative to bundle methods is the class of gradient sampling methods (see [12] for a recent review by Burke et al.), where the descent direction is determined by sampling gradients in a neighbourhood of the current iterate. Curtis and Que [21] formulated a hybrid of the gradient sampling scheme of [20] and the well-known quasi-Newton method BFGS adapted for nonsmooth problems [49].…”
Section: Related Literature On Nonsmooth Nonconvex Optimisation (mentioning)
confidence: 99%
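To make the contrast with gradient sampling concrete, here is a minimal sketch of the piecewise-linear (cutting-plane) model that bundle methods build from stored iterates and their subgradients, assuming a convex f for simplicity; the names and the example data are illustrative.

```python
import numpy as np

def model_value(y, bundle):
    """Piecewise-linear lower model m(y) = max_i { f(x_i) + g_i^T (y - x_i) }
    built from stored iterates x_i, values f(x_i), and subgradients g_i."""
    return max(fx + g @ (y - x) for x, fx, g in bundle)

# e.g. for f(x) = |x| with cuts collected at x = -1 and x = 2:
bundle = [(np.array([-1.0]), 1.0, np.array([-1.0])),
          (np.array([2.0]),  2.0, np.array([1.0]))]
print(model_value(np.array([0.0]), bundle))  # 0.0 = f(0): the model is tight
                                             # at the kink of |x|
```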
“…continuous on an open dense subset of R^n, which marks a step forward in the field of nonsmooth optimization [8]. Curtis and Overton [9] further generalized this idea to optimization problems with nonsmooth nonlinear constraints or objective functions by introducing an SQP algorithm framework.…”
Section: A Sequential Quadratic Programming Algorithm Combined With Gradient Sampling (mentioning)
confidence: 99%
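As a rough schematic only (the subproblem in the cited Curtis–Overton framework differs in details, e.g. where the penalty parameter enters and how constraint gradients are sampled), the idea is to replace the single linearizations of classical SQP with maxima over linearizations of the objective f and constraints c_j taken at sampled points y_1, …, y_m near the iterate x_k:

$$
\min_{d \in \mathbb{R}^n} \; \max_{1 \le i \le m} \bigl\{ f(y_i) + \nabla f(y_i)^\top d \bigr\}
\;+\; \rho \sum_{j} \max_{1 \le i \le m} \bigl\{ c_j(y_i) + \nabla c_j(y_i)^\top d,\; 0 \bigr\}
\;+\; \tfrac{1}{2}\, d^\top H_k d,
$$

a piecewise-linear-quadratic subproblem that can be rewritten as a smooth quadratic program.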