2015
DOI: 10.1016/j.compchemeng.2015.02.013

Improved Big-M reformulation for generalized disjunctive programs

Abstract: In this work, we present a new Big-M reformulation for Generalized Disjunctive Programs. The proposed MINLP reformulation is stronger than the traditional Big-M, and it does not require additional variables or constraints. We present the new Big-M, and analyze the strength of its continuous relaxation compared to that of the traditional Big-M. The new formulation is tested by solving several instances of process networks and multi-product batch plant problems with an NLP-based branch-and-bound method. The result…
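The traditional big-M reformulation the abstract refers to replaces each disjunct's constraints with inequalities that are relaxed by a constant M whenever that disjunct is not selected. A minimal sketch in Python (the function name and the tiny two-disjunct instance are illustrative, not taken from the paper):

```python
# Traditional big-M reformulation of a two-term disjunction:
#   [g1(x) <= 0]  OR  [g2(x) <= 0]
# becomes, with binaries y1 + y2 = 1:
#   g1(x) <= M * (1 - y1)
#   g2(x) <= M * (1 - y2)
# When y_i = 0 the corresponding constraint is relaxed by M.

def big_m_feasible(x, disjuncts, y, M):
    """Check the big-M reformulated constraints at a point x.

    disjuncts: list of functions g_i(x); y: 0/1 selector per disjunct.
    """
    assert sum(y) == 1, "exactly one disjunct must be selected"
    return all(g(x) <= M * (1 - yi) for g, yi in zip(disjuncts, y))

# Example: x must lie in [0, 1] OR in [3, 4].
g_low  = lambda x: max(0 - x, x - 1)   # <= 0 iff x in [0, 1]
g_high = lambda x: max(3 - x, x - 4)   # <= 0 iff x in [3, 4]

M = 10.0
print(big_m_feasible(0.5, [g_low, g_high], [1, 0], M))  # True: x in [0, 1]
print(big_m_feasible(3.5, [g_low, g_high], [0, 1], M))  # True: x in [3, 4]
print(big_m_feasible(2.0, [g_low, g_high], [1, 0], M))  # False: x in neither interval
```

The strength of the continuous relaxation (the paper's central concern) depends directly on M: any valid M yields a correct MINLP, but smaller values give a tighter relaxation of the feasible region when the binaries are relaxed to [0, 1].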

Cited by 57 publications (28 citation statements)
References 23 publications
“…Disjunctive constraints can be linearized using a set of disjunctive parameters. This type of reformulation (also known as big-M technique) was well studied in (Lee and Grossmann 2000) and (Trespalacios and Grossmann 2015). The choice of disjunctive parameters is critical for linear reformulation of disjunctive constraints.…”
Section: The Linearization of Disjunctive Constraints (6d)
confidence: 99%
“…The literature provides several methodologies for tuning the disjunctive parameter; these can be found in (Trespalacios and Grossmann 2015) and (Hooker 2011). The methodologies are proven to provide good approximations of the disjunctive parameters under certain conditions, but an additional large-scale optimization problem needs to be solved for each case, and optimality of the outcome still cannot be guaranteed.…”
Section: Introduction
confidence: 99%
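The tuning problem these statements describe can be illustrated in the linear case: for a constraint a·x ≤ b over simple variable bounds, the tightest valid big-M is the worst-case violation of the constraint over the box, which interval arithmetic gives in closed form. A minimal sketch (the helper name and the instance are illustrative, not the procedure from the cited methodologies):

```python
# One common way to tune big-M for a linear constraint a.x <= b:
# the tightest valid M is max_{l <= x <= u} (a.x - b), which over a
# box domain decomposes term by term via interval arithmetic.
# (Illustrative helper; general nonlinear disjuncts require solving
# an optimization problem instead, as the citing papers note.)

def tightest_big_m(a, b, lower, upper):
    """Smallest M such that a.x - b <= M for all x in [lower, upper]."""
    worst = -b
    for ai, li, ui in zip(a, lower, upper):
        worst += ai * (ui if ai >= 0 else li)  # pick the bound maximizing each term
    return worst

# Constraint: 2*x1 - 3*x2 <= 4 with 0 <= x1 <= 5 and 0 <= x2 <= 5.
M = tightest_big_m([2, -3], 4, [0, 0], [5, 5])
print(M)  # 2*5 - 3*0 - 4 = 6
```

This closed form is exact only for linear constraints over box domains; that is why, for general disjuncts, the citing papers observe that an additional optimization problem must be solved per constraint and optimality still cannot be guaranteed.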
“…Note that the relaxation may be improved by selecting unique values of M for each constraint [48]. For a tighter continuous relaxation, the HR formulation found in Appendix B may be used.…”
Section: Solution Strategies
confidence: 99%
“…More recently, Pyomo.GDP [46] has emerged as an open-source ecosystem for GDP modeling and development, built on top of the Pyomo algebraic modeling language [47] in Python. As an open-source platform, it has been able to incorporate recent innovations in reformulation strategies [48] and logic-based solution algorithms [22]. Powerful options now exist for formulating and solving GDP models.…”
Section: Introduction
confidence: 99%
“…The literature provides several methods to tune the big-M parameter; these can be found in [35,36]. The methods are proven to provide good approximations of the big-M parameters under certain conditions, but additional large-scale optimization problems must be solved for each case, and optimality still cannot be guaranteed.…”
Section: Strong Duality Condition
confidence: 99%