1999
DOI: 10.1137/s1052623497316828

On the Local Convergence of a Predictor-Corrector Method for Semidefinite Programming

Abstract: An interior point method for monotone linear complementarity problems acting in a wide neighborhood of the central path is presented. The method has O(√n L)-iteration complexity and is superlinearly convergent even when the problem does not possess a strictly complementary solution.
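To make the ingredients of the abstract concrete, the following is a minimal, self-contained sketch of a generic predictor-corrector interior-point iteration for a monotone LCP (find x, s ≥ 0 with s = Mx + q and xᵀs = 0, M positive semidefinite). It is a textbook-style illustration under a strictly feasible starting point; the function name predictor_corrector_lcp, the Mehrotra-type centering heuristic, and the neighborhood parameter gamma are illustrative choices, not the specific algorithm or the O(√n L) analysis of the paper above.

import numpy as np

def max_step(x, s, dx, ds):
    """Largest step in (0, 1] keeping x + a*dx > 0 and s + a*ds > 0 (small safety margin)."""
    a = 1.0
    for v, dv in ((x, dx), (s, ds)):
        neg = dv < 0
        if np.any(neg):
            a = min(a, float(np.min(-v[neg] / dv[neg])))
    return min(1.0, 0.99 * a)

def predictor_corrector_lcp(M, q, x0, s0, gamma=1e-3, tol=1e-9, max_iter=50):
    """Generic predictor-corrector interior-point sketch for the monotone LCP
        s = M x + q,  x >= 0,  s >= 0,  x^T s = 0,
    assuming M is positive semidefinite and (x0, s0) is strictly feasible
    (x0, s0 > 0 with s0 = M x0 + q).  Illustrative only, not the cited algorithm."""
    x, s = x0.astype(float), s0.astype(float)
    n = len(q)
    for _ in range(max_iter):
        mu = x @ s / n
        if mu < tol:
            break
        # Feasible Newton system:  -M dx + ds = 0,  S dx + X ds = r.
        # Eliminating ds = M dx leaves a single linear system (S + X M) dx = r.
        A = np.diag(s) + np.diag(x) @ M

        # Predictor (affine-scaling) direction: target complementarity 0.
        dx_a = np.linalg.solve(A, -x * s)
        ds_a = M @ dx_a
        a_aff = max_step(x, s, dx_a, ds_a)

        # Centering parameter chosen from the predicted progress (Mehrotra-style heuristic).
        mu_aff = (x + a_aff * dx_a) @ (s + a_aff * ds_a) / n
        sigma = (mu_aff / mu) ** 3

        # Corrector direction: recenter toward sigma*mu with a second-order correction term.
        dx = np.linalg.solve(A, sigma * mu - x * s - dx_a * ds_a)
        ds = M @ dx

        # Damped step that stays in the wide neighborhood
        #   N_inf^-(gamma) = {(x, s) > 0 : x_i * s_i >= gamma * mean(x * s)}.
        a = max_step(x, s, dx, ds)
        while a > 1e-12:
            xn, sn = x + a * dx, s + a * ds
            if np.all(xn * sn >= gamma * (xn @ sn) / n):
                break
            a *= 0.5
        x, s = x + a * dx, s + a * ds
    return x, s

# Small synthetic demo: M = B B^T is positive semidefinite, and choosing
# q = s0 - M x0 makes the all-ones starting pair strictly feasible by construction.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    B = rng.standard_normal((5, 5))
    M, x0, s0 = B @ B.T, np.ones(5), np.ones(5)
    q = s0 - M @ x0
    x, s = predictor_corrector_lcp(M, q, x0, s0)
    print("final complementarity x^T s =", float(x @ s))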

Cited by 18 publications (15 citation statements).
References 32 publications (31 reference statements).
“…For example, superlinear and quadratic convergence results for interior-point methods depend on the strict complementarity assumption, e.g. [43,30,3,37,32]. This is also the case for convergence of the central path to the analytic center of the optimal face, [27].…”
Section: Auxiliary Problem and Regularization (mentioning)
confidence: 99%
“…Strict complementarity is a desirable property of an SDP instance; in fact the strict complementarity of an optimal solution is a necessary condition for superlinear convergence of interior-point methods that take Newton-like steps, see [16], and much recent research has explored what conditions in addition to strict complementarity are needed to guarantee superlinear convergence for different interior-point algorithms [9,10,11,12,15]. However, even for linear programming (which must have a strictly complementary solution), there are instances for which the optimal solutions are nearly non-strictly complementary, and can be made arbitrarily badly so.…”
Section: Non-strict Complementarity (mentioning)
confidence: 99%
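For readers following the excerpt above, the strict complementarity condition it refers to is standardly stated as follows (standard SDP notation assumed here, not quoted from the cited papers): a primal-dual optimal pair (X*, S*) is complementary when X*S* = 0, and strictly complementary when in addition

\[ X^* + S^* \succ 0, \qquad \text{equivalently} \qquad \operatorname{rank}(X^*) + \operatorname{rank}(S^*) = n. \]

The superlinear-convergence results discussed in the excerpts assume an optimal pair of this kind exists.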
“…Four particular choices for P and M are: P = M = I, giving the Alizadeh-Haeberly-Overton (AHO) direction [3] [13,19]; and P = W^{-1/2}, M = W^{-1}, where W is the scaling matrix of (12), giving the Nesterov-Todd (NT) direction [27,28,31].…”
Section: The Monteiro-Zhang and Monteiro-Tsuchiya Families (mentioning)
confidence: 99%
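As background for the excerpt above (standard Monteiro-Zhang notation assumed here, not quoted from the source), the family of directions parameterized by a nonsingular matrix P is usually obtained by symmetrizing the complementarity equation XS = μI with the map

\[ H_P(U) \;=\; \tfrac{1}{2}\bigl[\, P U P^{-1} + (P U P^{-1})^{\mathsf T} \,\bigr], \]

and solving H_P(XS) = μI in the Newton system; P = I recovers the AHO direction named in the quote, and P = W^{-1/2}, with W the NT scaling matrix, gives the Nesterov-Todd direction.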
“…Convergence results for long-step algorithms can be found in Monteiro [19] (who proves a bound of O(n^{3/2} ln(1/ε)) iterations for two particular search directions), in Monteiro and Zhang [22] (who extend the results of [19] and establish a bound of O(n ln(1/ε)) iterations for another search direction) and Monteiro and Tsuchiya [21] (who give a bound of O(n^{3/2} ln(1/ε)) iterations for all directions in a subclass of the Monteiro-Tsuchiya family). Ji, Potra, and Sheng [12] describe the literature on local convergence and prove convergence of Q-order 1.5 or 2 for predictor-corrector algorithms using certain search directions from the Monteiro-Zhang family. Nor do we discuss the amount of computational work involved in computing our search directions in much detail; see Monteiro and Zanjacomo [24] and Toh [32] for flop counts for some of these directions.…”
Section: Introduction (mentioning)
confidence: 99%