2015
DOI: 10.1007/s10589-015-9813-x

Clustering-based preconditioning for stochastic programs

Abstract: We present a clustering-based preconditioning strategy for KKT systems arising in stochastic programming within an interior-point framework. The key idea is to perform adaptive clustering of scenarios (inside-the-solver) based on their influence on the problem at hand. This approach thus contrasts with existing (outside-the-solver) approaches that cluster scenarios based on problem data alone. We derive spectral and error properties for the preconditioner and demonstrate that scenario compression rates of up t…

Cited by 21 publications (26 citation statements)
References 26 publications
“…This problem is equivalent to P_m and, because {d_ξ}_{ξ=1}^m is i.i.d., P_m is a statistical approximation of P_∞. In the SP literature, problem (5) is known as a sample average approximation (SAA). We define the solution set of P_m as S_m ⊆ W. A solution of P_m is used to update the targets for the next period w_{m+1} = (x_{0,m+1}, η_{m+1}).…”
Section: Retroactive Optimization
confidence: 99%
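The SAA idea quoted above can be shown with a minimal sketch. The newsvendor cost function, the uniform demand model, and all parameter values below are hypothetical, chosen only to illustrate how the sampled problem P_m stands in for the expectation problem P_∞:

```python
import random

def saa_newsvendor(order_cost=1.0, sale_price=2.0, m=10_000, seed=0):
    """Sample average approximation: replace the expectation over demand
    with an average over m i.i.d. sampled demands d_1, ..., d_m."""
    rng = random.Random(seed)
    demands = [rng.uniform(50, 150) for _ in range(m)]  # hypothetical demand model

    def sampled_cost(x):
        # (1/m) * sum of per-scenario costs approximates E[cost(x, d)]
        return sum(order_cost * x - sale_price * min(x, d) for d in demands) / m

    # a crude grid search over order quantities stands in for a real solver
    return min(range(50, 151), key=sampled_cost)
```

For these parameters the critical ratio is (p − c)/p = 0.5, so the true optimum is the median demand, 100; the SAA solution lands near it, with sampling error shrinking as m grows.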
“…This, in fact, is also a limitation of statistical approximation schemes for SP. To circumvent this issue, one can use clustering techniques that seek to compress the realization space to maintain a tractable approximation [5]. Such techniques are based on the observation that data realizations tend to be redundant and only a small subset actually impacts the cost.…”
Section: Extensions To Nonlinear Systems
confidence: 99%
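The compression idea referenced here can be sketched with a simple Lloyd-style (k-means) grouping of scalar scenario realizations. This is a generic illustration only, not the adaptive inside-the-solver scheme of [5]; each cluster is replaced by one weighted representative:

```python
import random

def compress_scenarios(scenarios, k, iters=20, seed=0):
    """Group redundant scenario realizations into k clusters and return
    one (center, weight) representative per cluster, weight = cluster size."""
    rng = random.Random(seed)
    centers = rng.sample(scenarios, k)
    groups = [[] for _ in range(k)]
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for s in scenarios:
            j = min(range(k), key=lambda i: abs(s - centers[i]))
            groups[j].append(s)
        centers = [sum(g) / len(g) if g else centers[i]
                   for i, g in enumerate(groups)]
    return [(c, len(g)) for c, g in zip(centers, groups)]

# 300 redundant realizations drawn around three hypothetical modes
rng = random.Random(1)
scens = [rng.gauss(mu, 0.1) for mu in (0.0, 5.0, 10.0) for _ in range(100)]
reps = compress_scenarios(scens, k=3)
```

The compressed problem then weights each representative scenario by its cluster size, so the 300 realizations are carried by only 3 scenario blocks.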
“…Lubin et al. [12] form the Schur system as a by-product of a sparse factorization and factorize the Schur system in parallel. Cao et al. [13] perform adaptive clustering of scenarios inside-the-solver and form a sparse compressed representation of the large Karush-Kuhn-Tucker (KKT) system as a preconditioner. The matrix that needs to be factorized in this approach is much smaller than the full-space KKT system and sparser than the Schur system.…”
Section: Introduction
confidence: 99%
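The Schur-system approach mentioned above can be sketched on a toy arrowhead (block-angular) system: scenario blocks K_i on the diagonal, coupling blocks A_i against a first-stage block B. All matrices and sizes below are synthetic, chosen only to show the elimination; the dense solves stand in for the sparse factorizations of [12]:

```python
import numpy as np

rng = np.random.default_rng(0)
n_s, n_1, N = 4, 3, 5  # per-scenario vars, first-stage vars, scenarios (toy sizes)

# Scenario blocks K_i (made symmetric positive definite), coupling blocks A_i,
# and first-stage block B of an arrowhead KKT-style system.
K = [np.eye(n_s) + 0.1 * rng.standard_normal((n_s, n_s)) for _ in range(N)]
K = [0.5 * (Ki + Ki.T) + n_s * np.eye(n_s) for Ki in K]
A = [rng.standard_normal((n_s, n_1)) for _ in range(N)]
B = n_1 * np.eye(n_1)

r_s = [rng.standard_normal(n_s) for _ in range(N)]   # scenario right-hand sides
r_1 = rng.standard_normal(n_1)                       # first-stage right-hand side

# Schur complement on the first-stage variables: S = B - sum_i A_i^T K_i^{-1} A_i
S = B - sum(Ai.T @ np.linalg.solve(Ki, Ai) for Ki, Ai in zip(K, A))
rhs = r_1 - sum(Ai.T @ np.linalg.solve(Ki, ri) for Ki, Ai, ri in zip(K, A, r_s))
x1 = np.linalg.solve(S, rhs)                          # first-stage step
xs = [np.linalg.solve(Ki, ri - Ai @ x1)               # per-scenario back-solves,
      for Ki, Ai, ri in zip(K, A, r_s)]               # embarrassingly parallel
```

The per-scenario solves are independent, which is what makes the parallel factorization of [12] natural; the clustering preconditioner of [13] instead compresses the N scenario blocks before any factorization.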
“…It is worth noting that computing approximate Newton directions by PCG does not destroy the good convergence properties of IPMs, as shown in [20]. There is an extensive literature on the use of PCG within IPMs for other types of problems (e.g., [3,4,9,18,25,28] to mention a few).…”
confidence: 98%
“…However, it is not yet clear why for some classes of block-angular problems the above approach may be very efficient (see, for instance, the results of [13,12]), while it may need a large number of PCG iterations in others. The spectral radius may be used to monitor the good or bad behaviour, but an ex-ante explanation has to be found in the structural information of the block-angular constraints matrix.…”
confidence: 99%
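The PCG iterations discussed in these excerpts follow the standard preconditioned conjugate-gradient recurrence; the iteration count depends on how well the preconditioner clusters the spectrum. The sketch below uses a Jacobi (diagonal) preconditioner on a synthetic SPD matrix purely for illustration, not the clustering preconditioner of the paper:

```python
import numpy as np

def pcg(A, b, M_inv, tol=1e-8, maxiter=500):
    """Preconditioned conjugate gradients; returns (solution, iteration count).
    M_inv applies the inverse of the preconditioner to a residual vector."""
    x = np.zeros_like(b)
    r = b - A @ x
    z = M_inv(r)
    p = z.copy()
    rz = r @ z
    for k in range(1, maxiter + 1):
        Ap = A @ p
        alpha = rz / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        if np.linalg.norm(r) < tol * np.linalg.norm(b):
            return x, k
        z = M_inv(r)
        rz_new = r @ z
        p = z + (rz_new / rz) * p
        rz = rz_new
    return x, maxiter

rng = np.random.default_rng(0)
n = 200
Q = rng.standard_normal((n, n))
A = Q @ Q.T + n * np.eye(n) + np.diag(rng.uniform(1, 1000, n))  # SPD test matrix
b = rng.standard_normal(n)

d = np.diag(A)
x_prec, k_prec = pcg(A, b, lambda r: r / d)       # Jacobi preconditioning
x_none, k_none = pcg(A, b, lambda r: r.copy())    # unpreconditioned CG
```

Monitoring `k_prec` across problem instances is the practical counterpart of watching the spectral radius mentioned above: a well-clustered preconditioned spectrum shows up directly as a low, stable iteration count.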