2015
DOI: 10.1287/ijoc.2014.0610
Algorithmic Approach for Improved Mixed-Integer Reformulations of Convex Generalized Disjunctive Programs

Abstract: In this work, we propose an algorithmic approach to improve mixed-integer models that are originally formulated as convex Generalized Disjunctive Programs (GDP). The algorithm seeks to obtain an improved continuous relaxation of the MILP/MINLP reformulation of the GDP while limiting the growth in the problem size. There are three main stages that form the basis of the algorithm. The first one is a pre-solve, a consequence of the logical nature of GDP, which allows us to reduce the problem size, find good relaxati…
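For context, the big-M reformulation whose continuous relaxation the paper's algorithm seeks to improve can be sketched for a single two-term convex disjunction (a standard textbook construction, not the paper's exact notation; the symbols $g_j$, $M_j$, and $y_j$ are illustrative):

```latex
% Disjunction over Boolean variables Y_1, Y_2:
\left[\begin{array}{c} Y_1 \\ g_1(x) \le 0 \end{array}\right]
\vee
\left[\begin{array}{c} Y_2 \\ g_2(x) \le 0 \end{array}\right]
% Big-M reformulation with binary variables y_j:
g_j(x) \le M_j\,(1 - y_j), \quad j = 1,2,
\qquad y_1 + y_2 = 1, \qquad y_j \in \{0,1\}
```

When $y_j = 1$ the constraint $g_j(x) \le 0$ is enforced; when $y_j = 0$ it is relaxed by the constant $M_j$. Smaller valid values of $M_j$ yield a tighter continuous relaxation, which is the kind of improvement the abstract refers to.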

Cited by 14 publications
(10 citation statements)
References 33 publications
“…We note that the general concept of this reformulation was presented before [26]. More recently, a similar reformulation was presented for linear problems in the context of MILP formulation techniques [27].…”
Section: New Big-M Reformulation (mentioning)
confidence: 89%
“…There has been work suggesting which basic steps to apply [55,81,82], and to algorithmically improve formulations while limiting the problem size growth [83].…”
[Figure 5: Illustration of (HR) (a) before and (b) after the application of a basic step]
Section: Improving Bound Tightening Through Basic Steps (mentioning)
confidence: 99%
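As background for the snippet above, the hull reformulation (HR) and the effect of a basic step can be sketched as follows (standard GDP notation from the literature, not the cited papers' exact symbols; $\nu^j$, $U$, $F_i$, and $G_k$ are illustrative):

```latex
% HR of \bigvee_j [Y_j, g_j(x) <= 0]: disaggregate x into copies \nu^j
x = \sum_j \nu^j,
\qquad y_j\, g_j\!\left(\nu^j / y_j\right) \le 0,
\qquad 0 \le \nu^j \le y_j U,
\qquad \sum_j y_j = 1
% A basic step intersects two disjunctions before reformulating:
(F_1 \vee F_2) \wedge (G_1 \vee G_2)
\;\Longrightarrow\;
\bigvee_{i,k} \left(F_i \wedge G_k\right)
```

The HR of the intersected disjunction is at least as tight as the HR applied to each disjunction separately, at the cost of more disaggregated variables and constraints, which is the size/tightness trade-off the cited algorithm [83] manages.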
“…The test instances are 19 benchmark MINLP models from Trespalacios and Grossmann [31] and minlplib [32].…”
Section: Numerical Experiments (mentioning)
confidence: 99%
“…For example, when solving the convex MINLP problem CLAY55M with DICOPT [31], the number of iterations and CPU time can be very large, as shown in Figure 1, due to the fact that there are many infeasible subproblems.…”
(mentioning)
confidence: 99%