2014
DOI: 10.1016/j.compchemeng.2014.09.013
An interior-point method for efficient solution of block-structured NLP problems using an implicit Schur-complement decomposition

Cited by 46 publications (32 citation statements)
References 18 publications (21 reference statements)
“…if the original KKT system and each K_s block satisfy the inertia condition for descent [11,19]. This property enables the use of a PCG procedure to solve the Schur system [11], leading to the implicit Schur-complement method. This approach avoids both the explicit formation and the factorization of the dense Schur complement matrix.…”
Section: Efficient Parallel Schur Complement Methods for Stochastic Programming
confidence: 99%
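The quoted passage describes solving the Schur system with PCG while only ever applying the Schur complement to vectors. A minimal sketch of that idea, using hypothetical random block data (the names `K`, `B`, `C`, and the block sizes are illustrative assumptions, not the cited paper's implementation, and plain CG stands in for the preconditioned variant):

```python
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import LinearOperator, cg, splu

# Hypothetical block data standing in for a structured KKT system:
# scenario blocks K_s (made symmetric positive definite here so that
# CG applies), coupling blocks B_s, and a first-stage block C.
rng = np.random.default_rng(0)
n, m, num_s = 8, 3, 4
A = [rng.standard_normal((n, n)) for _ in range(num_s)]
K = [sp.csc_matrix(4.0 * np.eye(n) + 0.05 * (a + a.T)) for a in A]
B = [rng.standard_normal((n, m)) for _ in range(num_s)]
C = 50.0 * np.eye(m)
factors = [splu(Ks) for Ks in K]  # one factorization per block, reusable

def schur_matvec(x):
    """Apply S = C - sum_s B_s^T K_s^{-1} B_s to x without forming S."""
    y = C @ x
    for Bs, f in zip(B, factors):
        y -= Bs.T @ f.solve(Bs @ x)
    return y

S_op = LinearOperator((m, m), matvec=schur_matvec)
b = rng.standard_normal(m)
x, info = cg(S_op, b)  # conjugate gradients on the implicit Schur system
```

Each CG iteration costs one back-solve per block (embarrassingly parallel across scenarios), which is what lets the method avoid ever forming or factorizing the dense Schur complement.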
“…There is, to the best of the author's knowledge, no efficient modeling language supporting parallel evaluation of functions and gradients for general NLP problems. However, for structured problems such as stochastic programs, Kang et al. [11] and Zavala et al. [10] build a single AMPL [14] model instance for each scenario and evaluate all these instances in parallel. Several packages (e.g., PySP [15], StochJuMP [16]) have also been developed to support the parallel evaluation of functions and gradients for structured NLP problems.…”
Section: Introduction
confidence: 99%
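The one-model-instance-per-scenario pattern described above can be sketched in a few lines. This is a toy stand-in, not AMPL, PySP, or StochJuMP: `evaluate_scenario` is a hypothetical function playing the role of one scenario's model instance, and a thread pool stands in for the processes or MPI ranks those packages use:

```python
from concurrent.futures import ThreadPoolExecutor

# Toy stand-in for per-scenario model instances: in the cited approach each
# scenario owns its own model instance; here a plain function plays that
# role, returning a made-up objective value and gradient for scenario s.
def evaluate_scenario(args):
    s, x = args
    f = sum((xi - s) ** 2 for xi in x)   # f_s(x) = ||x - s*1||^2
    g = [2.0 * (xi - s) for xi in x]     # its gradient
    return s, f, g

def evaluate_all(x, scenarios):
    # Evaluate every scenario instance concurrently; the interior-point
    # solver then assembles the structured KKT system from these results.
    with ThreadPoolExecutor() as pool:
        out = pool.map(evaluate_scenario, [(s, x) for s in scenarios])
        return {s: (f, g) for s, f, g in out}

results = evaluate_all([1.0, 2.0], scenarios=range(3))
# results[0] → (5.0, [2.0, 4.0])
```

Because the scenario evaluations share no state, they scale across workers; the serial bottleneck is only the coupling step that follows.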
“…If the solution vector (q_0, q_S) does not exactly solve (15), it will induce a residual vector that we define as r:…”
Section: Clustering-based Preconditioner
confidence: 99%
“…More recently, Zavala et al [19] have demonstrated a parallel primal-dual interior-point approach to tackle discretized multi-period dynamic optimization formulations. This has ultimately led to general interior-point approaches to handle discretized nominal dynamic optimization formulations [20] and structured NLP formulations [21]. All of these previously noted studies have been on structured NLP techniques involving explicit objective and constraint functionals; however, our particular interest in the present paper is on solution techniques involving implicit or embedded functionals, which require a secondary solution algorithm for evaluation within the NLP constraints.…”
Section: Problem Statement
confidence: 99%