2007
DOI: 10.1016/j.cam.2006.05.034

Feasible generalized monotone line search SQP algorithm for nonlinear minimax problems with inequality constraints

Cited by 21 publications (12 citation statements)
References 14 publications
“…So, many authors have applied the idea of SQP method to present effective algorithms for solving the minimax problems, such as in Refs. [4][5][6][7][8][9][10][11][12]. It is a key problem of various SQP methods to overcome the so-called Maratos effect [13] under suitable conditions, for example, to solve one or more additional quadratic programs or systems of linear equations, or compute explicit correction directions.…”
Section: Introduction (mentioning)
confidence: 99%
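The excerpt above refers to the standard way SQP methods are applied to minimax problems: the nonsmooth objective max_i f_i(x) is replaced by an auxiliary bound t that is minimised subject to f_i(x) ≤ t. A minimal sketch of that reformulation follows. It is not the algorithm of the cited paper and makes no attempt to handle the Maratos effect; the toy functions f_1, f_2, g and the use of SciPy's SQP-type solver SLSQP are assumptions made purely for illustration.

```python
# A minimal sketch (not the paper's algorithm): the classical smooth
# reformulation of  min_x max_i f_i(x)  s.t.  g(x) <= 0  as
#   min_{x,t} t   s.t.  f_i(x) <= t,  g(x) <= 0,
# solved with SciPy's SQP-type solver SLSQP.  All problem data are toy values.
import numpy as np
from scipy.optimize import minimize

f = [lambda x: (x[0] - 1.0) ** 2 + x[1] ** 2,        # f_1 (illustrative)
     lambda x: x[0] ** 2 + (x[1] - 2.0) ** 2]        # f_2 (illustrative)
g = lambda x: x[0] + x[1] - 3.0                      # inequality constraint g(x) <= 0

def objective(z):
    # z = (x_1, x_2, t); the objective is just the auxiliary bound t
    return z[-1]

constraints = (
    # SLSQP expects "ineq" constraints of the form fun(z) >= 0
    [{"type": "ineq", "fun": (lambda z, fi=fi: z[-1] - fi(z[:-1]))} for fi in f]
    + [{"type": "ineq", "fun": lambda z: -g(z[:-1])}]
)

z0 = np.array([0.0, 0.0, 10.0])                      # starting point for (x, t)
res = minimize(objective, z0, method="SLSQP", constraints=constraints)
print("x* ~", res.x[:-1], " max_i f_i(x*) ~", res.fun)
```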
“…Thus, 0 ∈ int D(x*) and with the use of Theorems 2.3 and 2.6 one can conclude that x* is a local minimiser of problem (38) at which the first order growth condition holds true. However, let us check that a generalised complete alternance does not exist at x*.…”
Section: Constrained Minimax Problems (mentioning)
confidence: 76%
“…Let us check optimality conditions at the point x* = 0. Firstly, note that x* is not an isolated point of problem (38), since for any t ≥ 0 the point x(t) = (0, 0, t)^T is feasible. One has I(x*) = {1, 2}, ∇g_1(x*) = (0, 1, 0)^T, and…
Section: Constrained Minimax Problems (mentioning)
confidence: 99%
“…Problems with inequality constraints can be reformulated in the above form by introducing slack variables. Moreover, nonlinear constrained programming is significant for solving engineering optimization problems in other forms, for example, the minimax problems (Jian, Quan, and Zhang 2007; Wang and Zhang 2008; Han, Jian, and Li 2011; Jian et al. 2014) and dynamic optimization problems (Hu, Ong, and Teo 2002; Mohammed and Zhang 2013; Liu, Li, and Liu 2015; Zhang et al. 2015). The filter method, as an alternative to merit functions for nonlinear constrained programming, was first proposed by Fletcher and Leyffer (2002).…”
Section: Introduction (mentioning)
confidence: 99%
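The excerpt above mentions recasting inequality constraints as equalities by introducing slack variables. A minimal sketch of that device follows, under illustrative assumptions: the toy objective, the constraint, and the choice of SciPy's SLSQP solver are not taken from the cited works.

```python
# Slack-variable reformulation sketch: the inequality g(x) <= 0 is rewritten
# as the equality g(x) + s**2 = 0 by adding a slack variable s.
# The toy problem and the solver choice (SciPy's SLSQP) are assumptions.
import numpy as np
from scipy.optimize import minimize

obj = lambda x: (x[0] - 2.0) ** 2 + (x[1] - 1.0) ** 2   # objective in the original variables
g = lambda x: x[0] ** 2 + x[1] ** 2 - 1.0               # original inequality g(x) <= 0

def eq_con(y):
    # y = (x_1, x_2, s); the inequality becomes an equality in the augmented variables
    return g(y[:2]) + y[2] ** 2

res = minimize(lambda y: obj(y[:2]),
               np.array([0.5, 0.5, 0.5]),
               method="SLSQP",
               constraints=[{"type": "eq", "fun": eq_con}])
print("x* ~", res.x[:2])   # should lie on the unit circle, close to (2, 1)/sqrt(5)
```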