2021
DOI: 10.1007/s10957-020-01803-w
An Efficient Descent Method for Locally Lipschitz Multiobjective Optimization Problems

Abstract: We present an efficient descent method for unconstrained, locally Lipschitz multiobjective optimization problems. The method is realized by combining a theoretical result regarding the computation of descent directions for nonsmooth multiobjective optimization problems with a practical method to approximate the subdifferentials of the objective functions. We show convergence to points which satisfy a necessary condition for Pareto optimality. Using a set of test problems, we compare our method with the multiobjective …
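
As context for the abstract, the following is a minimal sketch (not the authors' algorithm) of the standard building block it refers to: a common descent direction obtained as the negative of the minimum-norm element of the convex hull of the objectives' (sub)gradients. The helper name, the toy objectives, and the use of scipy.optimize.minimize are illustrative assumptions; the paper works with approximations of the subdifferentials rather than the single subgradients used here.

```python
import numpy as np
from scipy.optimize import minimize

def common_descent_direction(subgradients):
    """Hypothetical helper: given rows of (approximate) subgradients of the
    objectives at the current point, return a common descent direction as
    the negative minimum-norm element of their convex hull."""
    G = np.asarray(subgradients, dtype=float)
    m = G.shape[0]
    # Minimum-norm element of conv{rows of G}: a small QP in the
    # convex-combination weights lam.
    obj = lambda lam: 0.5 * np.dot(G.T @ lam, G.T @ lam)
    jac = lambda lam: G @ (G.T @ lam)
    cons = ({"type": "eq", "fun": lambda lam: lam.sum() - 1.0},)
    bnds = [(0.0, 1.0)] * m
    res = minimize(obj, np.full(m, 1.0 / m), jac=jac,
                   bounds=bnds, constraints=cons)
    g_star = G.T @ res.x
    # -g_star is a descent direction for all objectives whenever
    # ||g_star|| > 0; a small norm signals approximate criticality.
    return -g_star, np.linalg.norm(g_star)

# Illustrative use with f1(x) = |x1| + x2**2 and f2(x) = (x1 - 1)**2 + |x2|
# at a point where both objectives happen to be differentiable:
x = np.array([0.5, -0.3])
g1 = np.array([np.sign(x[0]), 2 * x[1]])        # gradient of f1 at x
g2 = np.array([2 * (x[0] - 1), np.sign(x[1])])  # gradient of f2 at x
d, crit = common_descent_direction([g1, g2])
```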

Cited by 18 publications (21 citation statements)
References 34 publications
“…The continuation method we present in this paper allows researchers and practitioners with an interest in sparse optimization to systematically calculate the entire set of compromise solutions between sparsity and the main objective, thereby allowing for much more insight as well as informed model selection than weighted-sum approaches or descent methods providing single Pareto optima [21], [40]. The presented approach extends the well-known homotopy methods to the much more complex nonlinear problem setting, where weighting approaches fail due to non-convexity.…”
Section: Discussion (mentioning, confidence 99%)
“…There are various different methods for solving nonsmooth MOPs, see, e.g., [26,15,10]. If both f and g are locally Lipschitz continuous, then the following theorem yields a necessary condition for Pareto optimality based on the Clarke subdifferentials (cf.…”
Section: Multiobjective Optimization (mentioning, confidence 99%)
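
The necessary condition referred to in this statement is, in its standard unconstrained form for locally Lipschitz objectives f_1, ..., f_k (the cited theorem may add terms for the constraint function g), the Clarke-subdifferential analogue of Pareto criticality; a sketch:

```latex
% If $x^*$ is locally (weakly) Pareto optimal for the locally Lipschitz
% objectives $f_1,\dots,f_k$, then
0 \in \operatorname{conv}\Bigl( \bigcup_{i=1}^{k} \partial f_i(x^*) \Bigr),
% where $\partial f_i(x^*)$ is the Clarke subdifferential of $f_i$ at $x^*$.
```

Points satisfying an inclusion of this type are usually called Pareto critical; the abstract's "convergence to points which satisfy a necessary condition for Pareto optimality" is typically understood in this sense.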
“…For these remaining assumptions we consider the following example, where the feasible set is given by continuously differentiable but nonconvex inequality constraints. It is inspired by problem (15) in [41].…”
Section: Examples (mentioning, confidence 99%)
“…What is more, the structure of the Pareto Set can be exploited to find multiple solutions [12,13]. There are also methods for non-smooth problems [14,15] and multiobjective direct-search variants [16,17]. Both scalarization and descent techniques may be included in Evolutionary Algorithms (EA) [18][19][20][21][22].…”
Section: Introduction (mentioning, confidence 99%)