2023
DOI: 10.1016/j.amc.2022.127791
Memory gradient method for multiobjective optimization

Cited by 2 publications (2 citation statements)
References 35 publications
“…The objective function in (7) is proper, closed and strongly convex. Therefore, problem (7) admits a unique optimal solution, referred to as the steepest descent direction (see [12]).…”
Section: Preliminaries; citation type: mentioning; confidence: 99%
“…To our knowledge, several strategies exist for choosing the search direction d k (see [3, 7, 8, 12-14, 22, 24, 33, 40] and references therein). A representative descent algorithm is the work of Fliege and Svaiter [12] in 2000, where t k satisfies the multi-objective Armijo rule and d k is generated by solving the strongly convex quadratic problem (see (7)) or its dual problem (see (10) and (11)). This results in the multi-objective steepest descent (MSD) method.…”
Section: Introduction; citation type: mentioning; confidence: 99%
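The citation statements above describe the steepest descent subproblem (7): minimize over d the worst-case directional derivative max_i ∇f_i(x)ᵀd plus the regularizer ½‖d‖², whose unique solution is the multiobjective steepest descent direction. As a minimal sketch (not the paper's memory gradient method), the two-objective case admits a closed-form solution through the dual: minimize ½‖λg₁ + (1−λ)g₂‖² over λ ∈ [0, 1] and negate the resulting convex combination. The function name `msd_direction` is illustrative, not from the cited works.

```python
import numpy as np

def msd_direction(g1, g2):
    """Two-objective steepest descent direction.

    Solves min_d max_i g_i^T d + 0.5*||d||^2 via its dual:
    min over lam in [0, 1] of 0.5*||lam*g1 + (1-lam)*g2||^2,
    then returns d = -(lam*g1 + (1-lam)*g2).
    """
    diff = g1 - g2
    denom = float(diff @ diff)
    if denom == 0.0:
        # Gradients coincide: any lam gives the same direction.
        lam = 0.0
    else:
        # Unconstrained minimizer of the dual, clipped to [0, 1].
        lam = float(np.clip((g2 @ (g2 - g1)) / denom, 0.0, 1.0))
    return -(lam * g1 + (1.0 - lam) * g2)

# Example: gradients (1, 0) and (0, 1) yield d = (-0.5, -0.5),
# a descent direction for both objectives (g_i^T d = -0.5 < 0).
d = msd_direction(np.array([1.0, 0.0]), np.array([0.0, 1.0]))
```

When d(x) = 0, x is Pareto critical; otherwise d is a common descent direction, which is what the Armijo rule for t_k in the MSD method exploits.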