2017
DOI: 10.1007/s12532-017-0130-5

Parallelizing the dual revised simplex method

Abstract: This paper introduces the design and implementation of two parallel dual simplex solvers for general large scale sparse linear programming problems. One approach, called PAMI, extends a relatively unknown pivoting strategy called suboptimization and exploits parallelism across multiple iterations. The other, called SIP, exploits purely single iteration parallelism by overlapping computational components when possible. Computational results show that the performance of PAMI is superior to that of the leading op…
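As a rough illustration of the single-iteration parallelism the abstract attributes to SIP, the sketch below overlaps two independent solves against the same basis factorization: the FTRAN for the entering column and the extra forward solve used to update dual steepest-edge weights. This is only a hedged sketch, not the authors' implementation; the dense LU factorization, the thread pool, the toy data and all variable names are assumptions made for illustration (the real solvers use sparse LU factors with updates and overlap further components such as PRICE).

```python
# Minimal sketch of SIP-style single-iteration parallelism: two independent
# solves with the same basis factorization are overlapped on separate threads.
# The dense LU, the toy data and the variable names are assumptions for
# illustration only, not the paper's implementation.
from concurrent.futures import ThreadPoolExecutor

import numpy as np
from scipy.linalg import lu_factor, lu_solve

rng = np.random.default_rng(0)
m = 500
B = rng.standard_normal((m, m)) + m * np.eye(m)   # stand-in for the basis matrix
lu = lu_factor(B)                                  # factorize once, reuse for all solves

a_q = rng.standard_normal(m)        # entering column (hypothetical)
e_r = np.zeros(m)
e_r[3] = 1.0                        # unit vector for the chosen leaving row (hypothetical)

# BTRAN: rho_r = B^{-T} e_r, needed before pricing the pivotal row.
rho_r = lu_solve(lu, e_r, trans=1)

# The two forward solves below are independent of each other, so they can be
# overlapped; in CPython the overlap works because SciPy's LAPACK calls
# release the GIL during the solve.
with ThreadPoolExecutor(max_workers=2) as pool:
    ftran = pool.submit(lu_solve, lu, a_q)        # FTRAN: B^{-1} a_q, updates primal values
    ftran_dse = pool.submit(lu_solve, lu, rho_r)  # FTRAN-DSE: B^{-1} rho_r, updates DSE weights
    alpha_q = ftran.result()
    tau_r = ftran_dse.result()
```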

Cited by 127 publications (67 citation statements)
References 16 publications
“…Two other valuable attempts are presented in [26], [27], following the primal-dual simplex method and the sparse simplex method, and they have led to satisfactory results for large-scale problems. Until recently, no other valuable attempts had been made to parallelize the classical revised simplex method, which makes the one presented by Huangfu and Hall [28], [29] stand out. The authors of [28], [29] designed and implemented a very efficient parallelization scheme for the dual revised simplex method using the suboptimization technique, and they obtained speedup values comparable to those of the best commercial simplex solvers.…”
Section: Related Work
confidence: 99%
“…Until recently, no other valuable attempts had been made to parallelize the classical revised simplex method, which makes the one presented by Huangfu and Hall [28], [29] stand out. The authors of [28], [29] designed and implemented a very efficient parallelization scheme for the dual revised simplex method using the suboptimization technique, and they obtained speedup values comparable to those of the best commercial simplex solvers. A relevant survey that adequately covers all the recent advances in simplex parallelization can be found in [12].…”
Section: Related Work
confidence: 99%
“…If this linear programming problem is found to be unbounded, then we know that there must exist some x satisfying the conditions of an SDAS. We used the "highs" method (Huangfu and Hall, 2018) to confirm the existence of SDASs and the "revised simplex" method (Bertsimas and Tsitsiklis, 1997) to enumerate all reactions of an SDAS. Once an SDAS was confirmed to exist, we ran the integer programming process described in the next section to find the autocatalytic cores subject to further specific constraints within the SDAS.…”
Section: Detecting Seed-dependent Autocatalytic Systems (SDASs) By Linear Programming
confidence: 99%
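The unboundedness test described in the excerpt above maps naturally onto SciPy's linprog with the "highs" backend, which wraps the HiGHS solver descended from this line of work. The sketch below is a hedged illustration only: the stoichiometric matrix, objective and bounds are toy placeholders, not the constraints the cited study actually uses to characterise an SDAS.

```python
# Hedged sketch of the unboundedness check described above, using SciPy's
# "highs" method. S, c and the bounds are toy placeholders (assumptions),
# not the SDAS constraints of the cited study.
import numpy as np
from scipy.optimize import linprog

# Toy network: three reactions in a cycle, so flux can grow without bound.
S = np.array([[1.0, -1.0,  0.0],
              [0.0,  1.0, -1.0]])
c = -np.ones(S.shape[1])            # maximize total flux <=> minimize -sum(x)

res = linprog(c, A_eq=S, b_eq=np.zeros(S.shape[0]),
              bounds=[(0, None)] * S.shape[1], method="highs")

# linprog reports status 3 when the LP is unbounded; in the excerpt's logic
# that certifies the existence of some x satisfying the SDAS conditions.
if res.status == 3:
    print("LP unbounded: an SDAS candidate exists")
else:
    print(f"LP status {res.status}: no unboundedness certificate")
```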
“…The effectiveness of the Idiot crash is assessed via experiments with Clp (Version 1.16.10), using a set of 30 representative LP test problems in Table 1. This is the set used by Huangfu and Hall in [9], with qap15 replacing dcp2, since QAP problems are of particular interest and dcp2 is not a public test problem, and with nug15 replacing nug12 for consistency with the choice of QAP problems used by Mittelmann [11]. The three problems nug15, qap12 and qap15 are linearizations of quadratic assignment problems, where nug15 and qap15 differ only by row and column permutations.…”
Section: Preliminary Experiments
confidence: 99%