1997
DOI: 10.1137/s1052623494279122
Newton Methods For Large-Scale Linear Inequality-Constrained Minimization

Abstract: Newton methods of the linesearch type for large-scale minimization subject to linear inequality constraints are discussed. The purpose of the paper is twofold: (i) to give an active-set-type method with the ability to delete multiple constraints simultaneously and (ii) to give a relatively short general convergence proof for such a method. It is also discussed how multiple constraints can be added simultaneously. The approach is an extension of a previous work by the same authors for equality-constrained…
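To make the active-set ingredients named in the abstract concrete, the following is a minimal, hypothetical Python sketch of a generic active-set Newton iteration for minimizing f(x) subject to A x >= b. It is not the algorithm of the paper: the function name, the rule of deleting every working constraint with a negative multiplier at once, the placeholder Armijo backtracking, and the absence of any negative-curvature handling are all illustrative assumptions.

import numpy as np

def active_set_newton(f, grad, hess, A, b, x, max_iter=100, tol=1e-8):
    """Hypothetical active-set Newton sketch for  min f(x)  s.t.  A x >= b.

    Assumes x is feasible and that the reduced Hessian on the working set
    stays positive definite; degeneracy and curvature safeguards are omitted.
    """
    n, m = x.size, A.shape[0]
    # working set: constraints currently treated as equalities
    W = [i for i in range(m) if abs(A[i] @ x - b[i]) < 1e-10]
    for _ in range(max_iter):
        g, H = grad(x), hess(x)
        Aw = A[W].reshape(len(W), n)
        # Newton step on the working set: [H Aw^T; Aw 0][p; y] = [-g; 0], lam = -y
        K = np.block([[H, Aw.T], [Aw, np.zeros((len(W), len(W)))]])
        sol = np.linalg.solve(K, np.concatenate([-g, np.zeros(len(W))]))
        p, lam = sol[:n], -sol[n:]
        if np.linalg.norm(p) < tol:
            if np.all(lam >= -tol):
                return x, W                       # first-order KKT point
            # illustrative rule: drop *all* constraints with negative multipliers
            W = [i for i, l in zip(W, lam) if l >= -tol]
            continue
        # ratio test: largest feasible step; several constraints may block at once
        alpha, blocking = 1.0, []
        for i in range(m):
            if i not in W and A[i] @ p < -1e-12:
                step = (b[i] - A[i] @ x) / (A[i] @ p)
                if step < alpha - 1e-12:
                    alpha, blocking = step, [i]
                elif abs(step - alpha) <= 1e-12:
                    blocking.append(i)
        # simple Armijo backtracking from the capped step (placeholder linesearch)
        t = alpha
        while t > 1e-12 and f(x + t * p) > f(x) + 1e-4 * t * (g @ p):
            t *= 0.5
        x = x + t * p
        if t == alpha:                            # blocking constraints were reached
            W = sorted(set(W) | set(blocking))
    return x, W

For a convex quadratic f, a call such as active_set_newton(f, grad, hess, A, b, x0) with a feasible x0 returns an approximate KKT point together with the final working set; adding several blocking constraints at once and deleting several constraints at once correspond loosely to the "multiple addition/deletion" discussed in the abstract.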

Cited by 35 publications (10 citation statements) · References 22 publications
“…Algorithms which use such negative curvature directions can be made to converge globally to a second-order critical point using either a linesearch (see, e.g. [4,7,8,13,15-17]) or a trust region (see, e.g. [5,18]) approach.…”
Section: Introduction (mentioning)
confidence: 99%
“…At each iteration, such algorithms determine a pair of descent directions, (s_k, d_k), where, loosely speaking, s_k represents a direction calculated from positive curvature information given by the Hessian matrix, and d_k is a negative curvature direction. These two directions are combined to define trajectories of the form given in [7,8]. A new point is determined by taking a "suitable" step along the relevant trajectory.…”
Section: Introduction (mentioning)
confidence: 99%
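The quotation above originally displayed the trajectory itself (a page header has displaced it). As a hedged reconstruction, a common curvilinear form in the negative-curvature literature (for example Moré and Sorensen), not necessarily the exact one used in [7,8], is

x_k(α) = x_k + α² s_k + α d_k,   α ≥ 0,

along which the linesearch chooses a "suitable" step length α; for α small the negative-curvature direction d_k dominates, while for α near 1 the Newton-type direction s_k dominates.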
“…The extension of good box-constraint or linear-constraint solvers to the case in which the objective function has the PHR form is an interesting subject of research. Especially interesting should be the extension of the box-constraint conjugate-gradient solver described in [25], the linearly-constrained minimization algorithm of Forsgren and Murray [21] and the extension of interior point box-constraint approaches. We believe that taking profit in a clever way of second-order information will cause general algorithmic improvements, independently of convergence to second-order criticality.…”
Section: Algencan (mentioning)
confidence: 99%
“…An example of an SQP which uses an active-set method for the subproblem is SNOPT (Gill et al, 2005). Active-set methods for linearly constrained optimization problems with a general nonlinear objective function are presented in the works of Murtagh and Saunders (1978) and Forsgren and Murray (1997).…”
Section: Introduction (mentioning)
confidence: 99%