2016 IEEE Congress on Evolutionary Computation (CEC)
DOI: 10.1109/cec.2016.7744347

Learning and exploiting mixed variable dependencies with a model-based EA

Abstract: Mixed-integer optimization considers problems with both discrete and continuous variables. The ability to learn and process problem structure can be of paramount importance for optimization, particularly when faced with black-box optimization (BBO) problems, where no structural knowledge is known a priori. For such cases, model-based Evolutionary Algorithms (EAs) have been very successful in the fields of discrete and continuous optimization. In this paper, we present a model-based EA which integrates…

Cited by 5 publications (6 citation statements) | References 10 publications (13 reference statements)
“…Here, we give a brief outline of GAMBIT. More details can be found in the literature [12]. GAMBIT is a parameter-free model-based EA capable of learning and exploiting different types of variable dependencies through a clustering mechanism and an integrated dependency-models mechanism.…”
Section: GAMBIT
confidence: 99%
“…The same number of continuous subsets are added to the FOS, each containing every continuous problem variable. Additionally, l_c + l_d mixed subsets are added to the FOS by building another linkage tree constrained to merge discrete and continuous variables using a mixed mutual information metric, described in detail in [12]. Such mixed subsets allow for the consideration of discrete and continuous variables together, resulting in the ability to exploit potential mixed variable dependencies.…”
Section: GAMBIT
confidence: 99%
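
The quoted passage describes the core linkage-learning step: pairwise variable dependencies are estimated with a (mixed) mutual information metric, and a linkage tree is built over variable clusters to form the FOS. The exact metric and merging constraint are specified only in [12]; the sketch below is an illustrative Python approximation in which continuous variables are discretized before estimating mutual information, and the "mixed" tree only merges a cluster containing discrete variables with one containing only continuous variables. The function names and parameters (pairwise_mi, linkage_tree_fos, bins, mixed_only) are illustrative assumptions, not taken from the paper.

import numpy as np
from itertools import combinations


def pairwise_mi(population, bins=8):
    """Estimate mutual information between every pair of variables.

    Columns with many distinct values are treated as continuous and binned
    into `bins` equal-width bins first; this is one simple way to obtain a
    'mixed' MI between a discrete and a continuous variable (the metric used
    in [12] may differ).
    """
    n, d = population.shape
    codes = np.empty((n, d), dtype=int)
    for j in range(d):
        col = population[:, j]
        if len(np.unique(col)) > bins:                      # heuristic: continuous column
            edges = np.histogram_bin_edges(col, bins=bins)
            codes[:, j] = np.digitize(col, edges[1:-1])     # codes in 0..bins-1
        else:                                               # discrete column
            _, codes[:, j] = np.unique(col, return_inverse=True)

    mi = np.zeros((d, d))
    for i, j in combinations(range(d), 2):
        joint, _, _ = np.histogram2d(
            codes[:, i], codes[:, j],
            bins=(codes[:, i].max() + 1, codes[:, j].max() + 1))
        pxy = joint / joint.sum()
        px = pxy.sum(axis=1, keepdims=True)
        py = pxy.sum(axis=0, keepdims=True)
        nz = pxy > 0
        mi[i, j] = mi[j, i] = np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz]))
    return mi


def linkage_tree_fos(mi, is_discrete=None, mixed_only=False):
    """Build a FOS by greedily merging the two most similar clusters (UPGMA-style).

    If `mixed_only` is True, a merge is only allowed when one cluster contains
    at least one discrete variable and the other contains none, loosely
    mimicking the constrained mixed linkage tree described in the quote above.
    """
    d = mi.shape[0]
    clusters = [[i] for i in range(d)]
    fos = [list(c) for c in clusters]                       # start with singleton subsets
    while len(clusters) > 1:
        best, best_pair = -np.inf, None
        for a, b in combinations(range(len(clusters)), 2):
            if mixed_only and is_discrete is not None:
                a_disc = any(is_discrete[v] for v in clusters[a])
                b_disc = any(is_discrete[v] for v in clusters[b])
                if a_disc == b_disc:                        # require one discrete-side, one continuous-only cluster
                    continue
            score = np.mean([mi[i, j] for i in clusters[a] for j in clusters[b]])
            if score > best:
                best, best_pair = score, (a, b)
        if best_pair is None:                               # no admissible merge left
            break
        a, b = best_pair
        merged = clusters[a] + clusters[b]
        clusters = [c for k, c in enumerate(clusters) if k not in (a, b)] + [merged]
        fos.append(merged)                                  # every merge adds a subset to the FOS
    return fos

As a usage example, for a population array X and a boolean mask is_discrete marking the discrete columns, fos = linkage_tree_fos(pairwise_mi(X), is_discrete, mixed_only=True) would yield mixed subsets in addition to the singletons; this only approximates the FOS construction the citing papers attribute to [12].
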
“…Optimization where discrete and continuous variables are present simultaneously is explored relatively less and is referred to as mixed-integer optimization. A recently introduced Genetic Algorithm for Model-Based mixed-Integer opTimization (GAMBIT) has been shown to be an effective approach to single-objective optimization in the mixed-integer domain [75], especially in the case of black-box optimization, meaning that no internal structure of the problem is assumed to be known in advance. This makes GAMBIT flexible and easily adaptable for our multiobjective approach of BT pre-planning optimization.…”
Section: COLS
confidence: 99%
“…The same number of continuous subsets are added to the FOS, each containing every continuous problem variable. Additionally, l_c + l_d mixed subsets are added to the FOS by building another linkage tree constrained to merge discrete and continuous variables using a mixed mutual information metric, described in detail in [75]. Such mixed subsets allow for the consideration of discrete and continuous variables together, resulting in the ability to exploit potential mixed variable dependencies.…”
Section: GAMBIT
confidence: 99%