2022
DOI: 10.1007/s10479-021-04498-y
Essentials of numerical nonsmooth optimization

Abstract: Approximately sixty years ago two seminal findings, the cutting plane and the subgradient methods, radically changed the landscape of mathematical programming. They provided, for the first time, the practical chance to optimize real functions of several variables characterized by kinks, namely by discontinuities in their derivatives. Convex functions, for which a superb body of theoretical research was growing in parallel, naturally became the main application field of choice. The aim of the paper is to give a…

Cited by 2 publications (1 citation statement); References 126 publications (136 reference statements).
“…Rooted in the field of non-smooth optimization [19], the (gradient-based) optimization of discontinuous programs has recently seen major interest across many domains, for example machine learning [20], computer graphics [8] and optimal control [9]. Besides gradient-free approaches such as genetic algorithms or the Nelder-Mead method [15], the state of the art in non-smooth optimization includes bundle methods, which augment the subgradient method through the exploitation of past subgradient information [21] and gradient sampling methods exploiting piecewise differentiability [22].…”
Section: Related Work
confidence: 99%
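The quoted passage notes that bundle methods augment the subgradient method with past subgradient information. As a minimal sketch of the plain subgradient method it builds on, the following applies diminishing steps to a nonsmooth convex function; the objective f(x) = |x − 3| + |x| and the 1/k step rule are assumed illustrative choices, not taken from the cited works:

```python
def f(x):
    # Nonsmooth convex objective with kinks at x = 0 and x = 3;
    # its minimum value is 3, attained on the whole interval [0, 3].
    return abs(x - 3.0) + abs(x)

def subgrad(x):
    # One subgradient of f at x: sum of subgradients of the two |.| terms.
    def sg_abs(t):
        return 1.0 if t > 0 else (-1.0 if t < 0 else 0.0)
    return sg_abs(x - 3.0) + sg_abs(x)

def subgradient_method(x0, iters=2000):
    # The subgradient method is not a descent method, so we track
    # the best iterate seen rather than trusting the last one.
    x, best_x, best_f = x0, x0, f(x0)
    for k in range(1, iters + 1):
        x = x - (1.0 / k) * subgrad(x)   # diminishing step size 1/k
        if f(x) < best_f:
            best_x, best_f = x, f(x)
    return best_x, best_f
```

Starting from, say, x0 = 10.0, the iterates walk down into the flat minimizing interval [0, 3], where every subgradient chosen above is 0 and the method stalls at the optimal value 3, illustrating why refinements such as bundle methods keep and reuse past subgradients.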