2004
DOI: 10.1023/b:coap.0000018877.86161.8b
A Projected Gradient Method for Vector Optimization Problems

Cited by 160 publications (106 citation statements)
References 17 publications
“…Meanwhile, they admitted the possibility that F : X → Y takes the value +∞_C (this is made precise in Section 2), where X is a Hilbert space and C is a closed, convex, and pointed cone in Y with int C ≠ ∅, where int C denotes the interior of the set C. Such extensions can be traced back to analogous extensions in the finite-dimensional setting. For example, in ℝⁿ, see the steepest descent method for multiobjective optimization [2], the same method for general finite-dimensional vector optimization [3], and the projected gradient method for convexly constrained vector optimization [4].…”
Section: Introduction and Discussion
confidence: 99%
“…As pointed out in [4], the set argmin{⟨G(x), z⟩ : x ∈ S} in Theorem 2.1 may be empty for some z ∈ C⁺ \ {0}.…”
Section: ∀y ∈ Y, −∞_C
confidence: 98%
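A minimal illustration of why this argmin can be empty (a hypothetical one-dimensional example constructed here, not taken from [4]): if the linear scalarization ⟨G(·), z⟩ is unbounded below on S, no minimizer exists.

```latex
% Hypothetical example: take S = [0, \infty), G(x) = -x,
% C = \mathbb{R}_+ (so C^+ = \mathbb{R}_+), and z = 1.
% Then \langle G(x), z \rangle = -x is unbounded below on S:
\inf_{x \in S} \langle G(x), z \rangle = \inf_{x \ge 0} (-x) = -\infty,
\qquad
\operatorname{argmin}\{\langle G(x), z \rangle : x \in S\} = \emptyset .
```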
“…Because of these applications, a large body of literature has been published studying optimality conditions, duality theories, and topological properties of solutions of multiobjective optimization problems (see, e.g., [9,18,20,25] and the references therein). Recently, some numerical methods for solving convex multiobjective optimization problems have been proposed: the steepest descent method for multiobjective optimization was treated in [14], and extensions of the projected gradient method to convexly constrained vector optimization can be found in [16,17]. Bonnel et al. [5] constructed a vector-valued proximal point algorithm to investigate convex vector optimization problems in Hilbert space, generalizing Rockafellar's famous results [23] from the scalar case to the vector case.…”
Section: Introduction
confidence: 99%
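As a concrete aid to the methods cited above ([4], [16,17]), the projected gradient idea can be sketched for the bi-objective, box-constrained case: compute the multiobjective steepest-descent direction (for two objectives, the closed-form minimizer of a one-dimensional quadratic over convex combinations of the gradients), take a step, and project back onto the feasible box. This is a simplified sketch with assumed names, a fixed step size, and quadratic test objectives chosen here for illustration; the cited algorithms use more careful step rules such as Armijo-type line searches.

```python
import numpy as np

def steepest_descent_direction(g1, g2):
    """Closed-form solution of min_{w in [0,1]} ||w*g1 + (1-w)*g2||^2;
    the negated minimizer is a common descent direction for both objectives."""
    diff = g1 - g2
    denom = diff @ diff
    w = 0.5 if denom == 0.0 else np.clip(-(diff @ g2) / denom, 0.0, 1.0)
    return -(w * g1 + (1.0 - w) * g2)

def project_box(x, lo, hi):
    """Euclidean projection onto the box [lo, hi]^n."""
    return np.clip(x, lo, hi)

def projected_gradient_mo(x0, grads, lo, hi, step=0.1, tol=1e-6, max_iter=500):
    """Fixed-step projected gradient sketch for two objectives on a box."""
    x = project_box(np.asarray(x0, dtype=float), lo, hi)
    for _ in range(max_iter):
        g1, g2 = (g(x) for g in grads)
        d = steepest_descent_direction(g1, g2)
        x_new = project_box(x + step * d, lo, hi)
        if np.linalg.norm(x_new - x) < tol:  # Pareto-critical up to tolerance
            break
        x = x_new
    return x

# Illustrative problem: f1(x) = ||x - a||^2, f2(x) = ||x - b||^2 on [0,1]^2;
# the Pareto set is the segment between a and b.
a, b = np.array([0.2, 0.2]), np.array([0.8, 0.8])
grads = [lambda x: 2.0 * (x - a), lambda x: 2.0 * (x - b)]
x_star = projected_gradient_mo([1.0, 0.0], grads, 0.0, 1.0)
```

From the start point (1, 0), the iterates stay on the anti-diagonal and converge to (0.5, 0.5), the point of the Pareto segment nearest the start.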