2014
DOI: 10.1590/0101-7438.2014.034.03.0585

A Survey on Multiobjective Descent Methods

Abstract: We present a rigorous and comprehensive survey on extensions to the multicriteria setting of three well-known scalar optimization algorithms. Multiobjective versions of the steepest descent, the projected gradient and the Newton methods are analyzed in detail. At each iteration, the search directions of these methods are computed by solving real-valued optimization problems and, in order to guarantee an adequate objective value decrease, Armijo-like rules are implemented by means of a backtracking procedure. U…
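The abstract's scheme (a search direction from a real-valued subproblem, then an Armijo-like backtracking that enforces decrease in every objective) can be sketched for the two-objective case as follows. This is an illustrative sketch, not the authors' implementation: the names `mo_steepest_descent`, `F`, and `J` are placeholders, and the direction subproblem is solved via its dual, i.e., the minimum-norm convex combination of the two gradients.

```python
import numpy as np

def mo_steepest_descent(F, J, x0, beta=1e-4, tol=1e-8, max_iter=200):
    """Sketch of multiobjective steepest descent for two objectives.

    F(x) returns the two objective values; J(x) returns their gradients.
    The direction d(x) = -(lam*g1 + (1-lam)*g2) minimizes
    ||lam*g1 + (1-lam)*g2|| over lam in [0, 1] (the dual of the min-max
    subproblem with a 1/2*||d||^2 regularizer); the step size comes from
    an Armijo-like backtracking rule that requires a sufficient decrease
    in *every* objective.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g1, g2 = J(x)                      # gradients of the two objectives
        # closed-form minimizer of ||lam*g1 + (1-lam)*g2||^2 on [0, 1]
        diff = g1 - g2
        denom = diff @ diff
        lam = 0.5 if denom == 0 else np.clip(-(g2 @ diff) / denom, 0.0, 1.0)
        d = -(lam * g1 + (1 - lam) * g2)
        if np.linalg.norm(d) < tol:        # x is (approximately) Pareto critical
            break
        t, fx = 1.0, F(x)
        # Armijo-like backtracking: every objective must decrease enough
        while not np.all(F(x + t * d) <= fx + beta * t * np.array([g1 @ d, g2 @ d])):
            t *= 0.5
        x = x + t * d
    return x
```

For instance, with F(x) = (‖x − a‖², ‖x − b‖²) the Pareto set is the segment between a and b, and the iterates stop at a Pareto critical point on that segment.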

Cited by 58 publications (54 citation statements). References 37 publications (45 reference statements).
“…We would like to mention that this method finds one solution at a time, not the whole solution set. It has been noticed by Fukuda and Graña Drummond [24], and by Fliege, Graña Drummond, and Svaiter [22] that we can expect to somehow approximate the solution set by simply performing this method from different initial points.…”
Section: Compromise Problem
confidence: 86%
“…Finally, let us emphasize two advantages of our approach compared to the ones in [7] and [19]. Consider the family of scalar-valued functions ψ_z : ℝ^m → ℝ, for each z ∈ ℝ^m_+ \ {0}, given by (24) ψ_z(y) := ⟨y, z⟩.…”
Section: Final Remarks
confidence: 99%
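The family of linear scalarizations mentioned in this excerpt (pairing with a fixed weight vector z in the nonnegative orthant) can be illustrated with a minimal sketch; the helper name `scalarize` and the quadratic test objective below are hypothetical, not taken from the cited works.

```python
import numpy as np

def scalarize(F, z):
    """Compose a vector objective F with the linear functional y -> <y, z>,
    for a fixed weight vector z in the nonnegative orthant, z != 0.
    Minimizers of the resulting scalar function are weakly Pareto optimal
    for the original vector problem."""
    z = np.asarray(z, dtype=float)
    if np.any(z < 0) or not np.any(z > 0):
        raise ValueError("z must lie in the nonnegative orthant and be nonzero")
    return lambda x: F(x) @ z
```

For example, with F(x) = (x², (x − 1)²) and z = (1, 1), the scalarized objective is x² + (x − 1)², which is minimized at x = 1/2, a Pareto optimal point of F.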
“…Nor do heuristics guarantee convergence. To overcome such drawbacks, extensions of classical real-valued methods to the vector-valued setting have been proposed in recent years [10].…”
Section: Introduction
confidence: 99%
“…From a practical standpoint, in recent years classical scalar optimization procedures have been extended to solve multiobjective problems with good theoretical results. For example, [42] extends the steepest descent method; in [34,35,41] one can find several versions of the projected gradient method; [22] defines proximal-point-type methods; [24] presents a trust-region-based method; [33] proposes a Newton method for the multiobjective case; [37] gives a vector penalty algorithm for solving the problem exactly; and in [36] a complete survey of descent methods can be found. All of these works prove global convergence of the sequences generated by the algorithms under reasonable hypotheses.…”
unclassified