2016
DOI: 10.1016/j.ejor.2015.12.018

Global optimization advances in Mixed-Integer Nonlinear Programming, MINLP, and Constrained Derivative-Free Optimization, CDFO

Abstract: This manuscript reviews recent advances in deterministic global optimization for Mixed-Integer Nonlinear Programming (MINLP), as well as Constrained Derivative-Free Optimization (CDFO). This work provides a comprehensive and detailed literature review in terms of significant theoretical contributions, algorithmic developments, software implementations and applications for both MINLP and CDFO. Both research areas have experienced rapid growth, with a common aim to solve a wide range of real-world problems. We sh…

Cited by 193 publications (107 citation statements)
References 445 publications
“…38 Flash for Reduced-Space Global Optimization. A fully equation-oriented flash model like (1)–(9), where (6)–(9) are replaced with suitable submodels, can be directly included in formulation (FS) for global optimization by considering all equations of the flash model as equality constraints and all variables occurring therein as optimization variables.…”
Section: Basic Approach
confidence: 99%
“…For a reduced-space flash model to be used in (RS), we instead seek to construct a function that contains as many of the model variables and equations as possible, thus moving them out of the optimization problem, while still admitting explicit evaluation. The extent to which such an elimination of optimization variables is possible depends on the submodels in (6)–(9).…”
Section: Basic Approach
confidence: 99%
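The two excerpts above contrast a full-space (FS) formulation, where every model equation enters the optimizer as an equality constraint over all model variables, with a reduced-space (RS) formulation, where variables are eliminated by solving the equations explicitly. A minimal sketch of that distinction, using a hypothetical toy model y = x² rather than the flash model of the cited paper (all names here are illustrative):

```python
# Toy contrast of full-space (FS) vs reduced-space (RS) formulations.
# Hypothetical model: minimize (x - 1)^2 + y  subject to  y - x^2 = 0.

def objective(x, y):
    return (x - 1.0) ** 2 + y

def equality_residual(x, y):
    # FS: the model equation y - x^2 = 0 stays in the optimization
    # problem as an equality constraint; both x and y are variables.
    return y - x ** 2

def eliminated(x):
    # RS: the same equation is solved explicitly for y, removing y
    # (and the constraint) from the optimization problem entirely.
    return x ** 2

def solve_reduced_space(grid):
    # Coarse grid search over x only; y is evaluated, not optimized.
    return min(grid, key=lambda x: objective(x, eliminated(x)))

def solve_full_space(grid):
    # Coarse grid search over (x, y) pairs, keeping only pairs that
    # satisfy the equality constraint.
    feasible = [(x, y) for x in grid for y in grid
                if abs(equality_residual(x, y)) < 1e-9]
    return min(feasible, key=lambda p: objective(*p))

if __name__ == "__main__":
    grid = [i / 100.0 for i in range(-200, 201)]
    print(solve_reduced_space(grid))   # RS minimizer over x alone
    print(solve_full_space(grid))      # FS minimizer over (x, y)
```

Both searches locate the same minimizer, but the RS search explores a one-dimensional space while the FS search must enumerate constrained (x, y) pairs, which is the motivation for the variable elimination described in the excerpt.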
“…For example, the approximation of a non-linear neural network can in some cases appear very appealing and fit the data set at hand, but the purpose of the approximation is defeated if the complexity reduction relative to the original model is very small. The same holds for the gray- and black-box approaches mentioned earlier [49,50]. Given the application of receding-horizon multi-parametric programming based techniques that follow the approximation step, the choice of model structure becomes very limited.…”
Section: B24 Choosing the Model Structure
confidence: 99%
“…New techniques for MINLP can be found in [1,2,3,4,5]. The recent work presented in [6] is a detailed literature review covering theoretical contributions, algorithmic developments, software implementations and applications for MINLP. The area of nonconvex MINLP has received a lot of attention from researchers working with heuristics, such as genetic algorithms [7], ant colony optimization [8], evolutionary algorithms [9], pattern search algorithms [10], the multistart Hooke-and-Jeeves algorithm [11,12] and differential evolution [13].…”
Section: Introduction
confidence: 99%