2019
DOI: 10.1007/s10957-019-01582-z

A Linear Scalarization Proximal Point Method for Quasiconvex Multiobjective Minimization

Abstract: In this paper we propose a linear scalarization proximal point algorithm for solving arbitrary lower semicontinuous quasiconvex multiobjective minimization problems. Under some natural assumptions, and using the condition that the proximal parameters are bounded, we prove convergence of the sequence generated by the algorithm; when the objective functions are continuous, we prove convergence to a generalized critical point. Furthermore, if each iteration minimizes the proximal regularized function and …
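To make the method concrete, the display below is a schematic linear scalarization proximal step, a minimal sketch assuming simplex weights z^k and proximal parameters alpha_k; the notation is illustrative and not taken verbatim from the paper.

% Schematic linear scalarization proximal step for
% F = (f_1, ..., f_m) with quasiconvex components f_i.
% The weights z^k and parameters alpha_k are illustrative notation.
\[
  x^{k+1} \in \operatorname*{arg\,min}_{x \in \mathbb{R}^n}
  \left\{ \sum_{i=1}^{m} z_i^k \, f_i(x)
        + \frac{\alpha_k}{2} \, \lVert x - x^k \rVert^2 \right\},
  \qquad
  z^k \in \Delta_m := \Big\{ z \in \mathbb{R}^m_{+} : \sum_{i=1}^{m} z_i = 1 \Big\}.
\]

The boundedness of the proximal parameters mentioned in the abstract is the condition under which each step remains a genuine proximal step and the convergence analysis goes through.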

Cited by 13 publications (5 citation statements); references 15 publications (36 reference statements).
“…The condition T ≠ ∅ has been extensively applied in the convergence analysis of classical method extensions to vector optimization, such as the projected gradient method [13,14,33], the steepest descent method [34], the proximal point method [35,36], and so on. This assumption is related to the completeness of the image of F, which is standard for ensuring the existence of Pareto optimal points for vector optimization problems [37].…”
Section: Algorithm 4.2 (mentioning)
confidence: 99%
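For orientation, in Fejér-monotonicity arguments of this kind T usually denotes the set of points that are at least as good as every iterate; the definition below is a common form and an assumption here, since the citing paper's exact definition is not reproduced in the snippet.

% A typical form of the nonemptiness assumption (illustrative):
\[
  T := \{\, x \in \mathbb{R}^n : F(x) \preceq F(x^k) \ \text{for all } k \in \mathbb{N} \,\} \neq \emptyset,
\]
% where \preceq is the componentwise (or ordering-cone) partial order;
% T \neq \emptyset is what makes the iterate sequence Fejér convergent to T.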
“…This assumption has been widely used to analyze the convergence of many numerical methods for vector optimization problems, for example the projected gradient method [15,17,18,19], the steepest descent method [28,29] and the proximal point method [21,30]. Proof. This result can be obtained from [16, Theorem 1] when the nonmonotone line search rule considered therein reduces to the Armijo-like line search rule (3.1).…”
Section: Convergence Analysis (mentioning)
confidence: 99%
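The snippet above invokes an Armijo-like line search; the sketch below shows a generic multiobjective Armijo backtracking rule for a common descent direction, offered only for orientation. It is not the rule (3.1) of the citing paper, and all names and constants are illustrative.

# Generic Armijo-like backtracking for multiobjective descent
# (illustrative sketch, not rule (3.1) of the citing paper).
# Accept step t once every objective satisfies sufficient decrease.
import numpy as np

def armijo_multiobjective(F, JF, x, d, beta=0.5, sigma=1e-4, t0=1.0, max_iter=50):
    """Return t with F(x + t*d) <= F(x) + sigma*t*JF(x)@d componentwise."""
    Fx = np.asarray(F(x))
    slope = sigma * (np.asarray(JF(x)) @ d)  # directional slope per objective
    t = t0
    for _ in range(max_iter):
        if np.all(np.asarray(F(x + t * d)) <= Fx + t * slope):
            return t
        t *= beta  # shrink the step and try again
    return t

# Toy usage: two convex quadratics; d is a descent direction for both.
F  = lambda x: np.array([x @ x, (x - 1.0) @ (x - 1.0)])
JF = lambda x: np.vstack([2 * x, 2 * (x - 1.0)])
x0 = np.array([2.0, 2.0])
d  = -JF(x0).sum(axis=0)
t  = armijo_multiobjective(F, JF, x0, d)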
“…Afterwards, Fukuda and Graña Drummond established in [19] the full convergence of the sequence to optimal points for convex objective functions. Note that in many real-life applications, such as microeconomics and location theory [20,21], one often encounters situations where the objective function is not convex but, for example, quasiconvex.…”
Section: Introduction (mentioning)
confidence: 99%
“…In view of its extensive applications, research on numerical algorithms for solving multiobjective optimization problems has received a lot of attention, and many iterative methods have been proposed, including the projected gradient method [9,10,11], the steepest descent method [12,13], the proximal point method [14,15,16], the conjugate gradient method [17], Newton's method [18,19], the trust-region method [20,21], and so on.…”
Section: Introduction (mentioning)
confidence: 99%
“…Among these methods for solving multiobjective optimization, scalarization methods [15,22,23] and descent methods [9,13,19,24] are the two main kinds of approaches. Based on the scalarization technique, scalarization methods compute Pareto or weak Pareto optimal solutions by choosing some parameters in advance and reformulating the original vector-valued problem into parameterized scalar-valued ones.…”
Section: Introduction (mentioning)
confidence: 99%
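As a concrete instance of the scalarization technique just described, the weighted-sum reformulation below is the textbook construction, stated here for orientation rather than as the specific scheme of any cited paper.

% Weighted-sum scalarization of min F(x) = (f_1(x), ..., f_m(x)):
% fix weights w in the unit simplex and solve the scalar problem
\[
  \min_{x \in \mathbb{R}^n} \ \sum_{i=1}^{m} w_i \, f_i(x),
  \qquad w_i \ge 0, \quad \sum_{i=1}^{m} w_i = 1.
\]
% Any minimizer is weakly Pareto optimal; if every w_i > 0, it is
% Pareto optimal. For nonconvex (e.g. quasiconvex) problems this
% scheme need not recover every Pareto point.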