2020
DOI: 10.1016/j.neucom.2019.11.004

Dealing with categorical and integer-valued variables in Bayesian Optimization with Gaussian processes

Abstract: Bayesian Optimization (BO) is useful for optimizing functions that are expensive to evaluate, lack an analytical expression and whose evaluations can be contaminated by noise. These methods rely on a probabilistic model of the objective function, typically a Gaussian process (GP), upon which an acquisition function is built. The acquisition function guides the optimization process and measures the expected utility of performing an evaluation of the objective at a new point. GPs assume continuous input variables. […]

Cited by 159 publications (110 citation statements)
References 11 publications

“…Integers and categorical hyperparameters require special treatment but can be integrated fairly easily into regular Bayesian optimization by small adaptations of the kernel and the optimization procedure (see Sect. 12.1.2 of [58], as well as [42]). Other models, such as factorization machines and random forests, can also naturally handle these data types.…”
Section: Configuration Space Description
confidence: 99%
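The "small adaptations of the kernel" mentioned in this snippet can be made concrete. Below is a minimal numpy sketch, assuming an RBF base kernel: integer dimensions are rounded and each one-hot categorical block is snapped to its largest entry before distances are computed, in the spirit of the transformation described in [42]. The helper names (transform, rbf_kernel) and the snapping rule are illustrative, not the exact construction of any cited paper.

```python
import numpy as np

def transform(X, int_dims, cat_groups):
    """Map inputs onto valid mixed-variable points before the kernel sees
    them: round integer dimensions; snap each one-hot block encoding a
    categorical variable to its largest entry (illustrative choice)."""
    Z = np.array(X, dtype=float)
    Z[:, int_dims] = np.round(Z[:, int_dims])
    for group in cat_groups:  # group = column indices of one one-hot block
        block = Z[:, group]
        hard = np.zeros_like(block)
        hard[np.arange(len(block)), block.argmax(axis=1)] = 1.0
        Z[:, group] = hard
    return Z

def rbf_kernel(X1, X2, lengthscale=1.0, variance=1.0,
               int_dims=(), cat_groups=()):
    """Squared-exponential kernel evaluated on transformed inputs, making
    the GP constant over inputs that map to the same configuration."""
    T1 = transform(X1, list(int_dims), list(cat_groups))
    T2 = transform(X2, list(int_dims), list(cat_groups))
    d2 = ((T1[:, None, :] - T2[None, :, :]) ** 2).sum(-1)
    return variance * np.exp(-0.5 * d2 / lengthscale ** 2)
```

Because all continuous points in a rounding cell map to the same transformed input, the GP posterior, and hence the acquisition function, is flat inside each cell, so the acquisition optimizer is never rewarded for fractional values that cannot actually be evaluated.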
“…These methods use probability theory to determine the most promising candidate point according to the surrogate model. However, when the variables are discrete, the typical approach is to relax the integer constraints, which often leads to suboptimal solutions [7]. The authors in [7] tackled this problem by modifying the covariance function used in the surrogate model.…”
Section: Problem Description and Related Work
confidence: 99%
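For contrast, here is a minimal sketch of the relax-and-round baseline that [7] argues is suboptimal: the acquisition is optimized over a continuous relaxation, and integer dimensions are rounded only afterwards, so the returned point need not optimize the acquisition among integer-feasible points. The function name and multi-start scheme are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import minimize

def propose_relax_and_round(acquisition, bounds, int_dims,
                            n_restarts=10, seed=0):
    """Optimize the acquisition (to be minimized) over the box `bounds`,
    ignoring integrality, then round integer dimensions post hoc."""
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds, dtype=float).T
    best_x, best_val = None, np.inf
    for _ in range(n_restarts):
        x0 = rng.uniform(lo, hi)
        res = minimize(acquisition, x0, bounds=bounds, method="L-BFGS-B")
        if res.fun < best_val:
            best_x, best_val = res.x, res.fun
    x = best_x.copy()
    x[int_dims] = np.round(x[int_dims])  # post-hoc rounding
    return x
```

With the transformed kernel sketched above, the rounding instead happens inside the model, so the same optimizer returns points whose acquisition value is exact for the configuration that will actually be evaluated.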
“…However, it is still an on-going research question on how these techniques can be applied effectively to combinatorial optimization problems. A common approach is to simply round to the nearest integer, a method that is known to be sub-optimal in traditional optimization, and also in black-box optimization [7]. Another option is to use discrete surrogate models from machine learning such as regression trees [8] or linear model trees [9].…”
Section: Introduction
confidence: 99%
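As an illustration of the discrete-surrogate alternative, the sketch below scores candidate points with a random forest, using the spread across individual trees as a crude uncertainty proxy in a lower-confidence-bound score. This is a common heuristic assumed here for concreteness; it is not the specific estimator of [8] or [9].

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

def forest_lcb(X_train, y_train, X_cand, kappa=1.0, seed=0):
    """Fit a random-forest surrogate (handles integer/categorical
    encodings natively) and return a lower-confidence-bound score for
    minimization; pick the argmin candidate as the next evaluation."""
    rf = RandomForestRegressor(n_estimators=100, random_state=seed)
    rf.fit(X_train, y_train)
    per_tree = np.stack([t.predict(X_cand) for t in rf.estimators_])
    mu, sigma = per_tree.mean(axis=0), per_tree.std(axis=0)
    return mu - kappa * sigma
```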
“…Thus, this problem involves continuous and discrete variables in the optimization task whereas classical BO assumes continuous variables only. To overcome this restriction, a modified version of BO is used where the kernel function is transformed in a way such that integer-valued inputs are properly included [25]. Based on previous evaluations of the cost function, BO updates the prior and minimizes the acquisition function.…”
Section: Model Optimization
confidence: 99%
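The loop step this snippet describes (update the GP posterior, then minimize the acquisition) can be sketched as follows, assuming a kernel callable that already applies the integer transform, e.g. `lambda A, B: rbf_kernel(A, B, int_dims=[0])` from the earlier sketch. Expected improvement and multi-start L-BFGS-B are illustrative choices, not necessarily those of [25].

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

def bo_step(X, y, kernel, bounds, noise=1e-6, seed=0):
    """One BO iteration: exact GP posterior via Cholesky, then minimize a
    negative expected-improvement acquisition over the search box."""
    K = kernel(X, X) + noise * np.eye(len(X))
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    y_best = y.min()

    def neg_ei(x):
        x = np.atleast_2d(x)
        k = kernel(x, X)                 # cross-covariances, shape (1, n)
        mu = k @ alpha                   # posterior mean
        v = np.linalg.solve(L, k.T)
        var = np.clip(kernel(x, x).diagonal() - (v ** 2).sum(0), 1e-12, None)
        s = np.sqrt(var)
        z = (y_best - mu) / s
        return -(s * (z * norm.cdf(z) + norm.pdf(z))).item()  # minimize -EI

    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds, dtype=float).T
    starts = rng.uniform(lo, hi, size=(10, len(lo)))
    best = min((minimize(neg_ei, x0, bounds=bounds, method="L-BFGS-B")
                for x0 in starts), key=lambda r: r.fun)
    return best.x  # map through transform() to get the point to evaluate
```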