2021
DOI: 10.48550/arxiv.2109.11296
Preprint

Conditional gradient method for vector optimization

Abstract: In this paper, we propose an extension of the classical Frank-Wolfe method for solving constrained vector optimization problems with respect to a partial order induced by a closed, convex and pointed cone with nonempty interior. In the proposed method, the construction of the auxiliary subproblem is based on the well-known oriented distance function. Two stepsize strategies, Armijo line search and an adaptive stepsize, are used. It is shown that every accumulation point of the generated sequences sa…
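
To make the iteration described in the abstract concrete, here is a minimal sketch of the classical scalar-valued Frank-Wolfe (conditional gradient) method with Armijo backtracking. It is not the vector-valued variant or the oriented-distance subproblem proposed in the paper; the names frank_wolfe and linear_oracle, and the parameter defaults, are illustrative assumptions.

import numpy as np

def frank_wolfe(grad, linear_oracle, x0, f, max_iter=100, tol=1e-8,
                beta=0.5, sigma=1e-4):
    """Classical scalar-valued Frank-Wolfe with Armijo backtracking.

    grad          : gradient of the objective at x
    linear_oracle : returns argmin_{s in C} <g, s> over the feasible set C
                    (this linear subproblem encodes the constraints)
    f             : objective value, used only in the Armijo test
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        s = linear_oracle(g)      # solve the linear subproblem over C
        d = s - x                 # Frank-Wolfe direction
        gap = -g @ d              # duality gap; stop when it is small
        if gap <= tol:
            break
        # Armijo backtracking: shrink the stepsize until sufficient decrease
        t, fx = 1.0, f(x)
        while f(x + t * d) > fx - sigma * t * gap:
            t *= beta
        x = x + t * d
    return x

# Hypothetical usage: minimize ||x - c||^2 over the probability simplex,
# whose linear oracle simply returns the vertex with the smallest gradient entry.
c = np.array([0.2, 0.7, 0.1])
f = lambda x: np.sum((x - c) ** 2)
grad = lambda x: 2.0 * (x - c)
oracle = lambda g: np.eye(len(g))[np.argmin(g)]
x_star = frank_wolfe(grad, oracle, x0=np.ones(3) / 3, f=f)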

Cited by 1 publication (1 citation statement)
References 32 publications
“…, the (L, C, e)-smoothness reduces to condition (A) in [31]. If C = R_+, the (L, C, e)-smoothness and (µ, C, e)-strong convexity correspond to relative L-smoothness and µ-strong convexity in [21], respectively.…”
Section: Relative Smoothness and Relative Strong Convexity
confidence: 99%
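For context, the scalar notions this citation statement refers to are the standard ones: given a reference function h with Bregman divergence D_h, a function f is L-smooth relative to h if the upper bound below holds, and µ-strongly convex relative to h if the matching lower bound holds. This is only a sketch of the usual scalar definitions; the (L, C, e)- and (µ, C, e)-versions for a cone C and vector e follow the citing paper and are not reproduced here.

\[
D_h(y,x) = h(y) - h(x) - \langle \nabla h(x),\, y - x \rangle,
\]
\[
f(y) \le f(x) + \langle \nabla f(x),\, y - x \rangle + L\, D_h(y,x) \quad \text{(relative $L$-smoothness)},
\]
\[
f(y) \ge f(x) + \langle \nabla f(x),\, y - x \rangle + \mu\, D_h(y,x) \quad \text{(relative $\mu$-strong convexity)}.
\]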