2021
DOI: 10.48550/arxiv.2111.14991
Preprint

Bayesian Optimization for auto-tuning GPU kernels

Abstract: Finding optimal parameter configurations for tunable GPU kernels is a non-trivial exercise for large search spaces, even when automated. This poses an optimization task on a non-convex search space, using an expensive-to-evaluate function with unknown derivative. These characteristics make it a good candidate for Bayesian Optimization, which has not been applied to this problem before. However, the application of Bayesian Optimization to this problem is challenging. We demonstrate how to deal with the rough, discre…
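The abstract describes optimizing a discrete, non-convex, expensive-to-evaluate objective with a surrogate model. Below is a minimal, generic Bayesian Optimization sketch over an enumerated GPU-kernel parameter grid; it is not the paper's implementation, and the parameter names, the scikit-learn Gaussian-process surrogate, and the benchmark_kernel timing stub are all illustrative assumptions.

# Minimal Bayesian Optimization sketch over a discrete GPU-kernel search space.
# NOT the paper's implementation; benchmark_kernel is a hypothetical stand-in
# for compiling and timing a real kernel configuration.
import itertools
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

# Hypothetical tunable parameters of a GPU kernel.
param_grid = {
    "block_size_x": [16, 32, 64, 128],
    "block_size_y": [1, 2, 4, 8],
    "unroll_factor": [1, 2, 4],
}
# Enumerate the full discrete search space as rows of a design matrix.
candidates = np.array(list(itertools.product(*param_grid.values())), dtype=float)

def benchmark_kernel(config):
    """Hypothetical objective: run time (ms) of one kernel configuration."""
    x, y, u = config
    # Synthetic non-convex placeholder for a real compile-and-time measurement.
    return (np.log2(x) - 5) ** 2 + (np.log2(y) - 2) ** 2 + 0.5 / u

def expected_improvement(mu, sigma, best):
    """Expected improvement for minimization, guarding against zero variance."""
    sigma = np.maximum(sigma, 1e-9)
    z = (best - mu) / sigma
    return (best - mu) * norm.cdf(z) + sigma * norm.pdf(z)

rng = np.random.default_rng(0)
# Seed the surrogate with a handful of random evaluations.
evaluated = list(rng.choice(len(candidates), size=5, replace=False))
times = [benchmark_kernel(candidates[i]) for i in evaluated]

gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)
for _ in range(20):  # fixed evaluation budget
    gp.fit(candidates[evaluated], np.array(times))
    mu, sigma = gp.predict(candidates, return_std=True)
    ei = expected_improvement(mu, sigma, min(times))
    ei[evaluated] = -np.inf          # never re-measure a known configuration
    nxt = int(np.argmax(ei))
    evaluated.append(nxt)
    times.append(benchmark_kernel(candidates[nxt]))

best = evaluated[int(np.argmin(times))]
print("best configuration:", dict(zip(param_grid, candidates[best])), "time:", min(times))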

Cited by 2 publications (2 citation statements)
References 29 publications (39 reference statements)
“…They can provide levels of uncertainty for predictions. However, Gaussian processes are computationally intensive and may not scale well to large datasets, and they offer a probabilistic interpretation [96,97]. Multivariate adaptive regression splines are non-linear models that can capture complex relationships between variables and they are useful when the dataset is large and computation time is not an issue [98].…”
Section: Comparison of Algorithms
confidence: 99%
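As a minimal, assumed illustration of the uncertainty estimates mentioned in the statement above (not code from the cited works), a scikit-learn Gaussian process reports a predictive standard deviation alongside each prediction:

# Illustration only: per-prediction uncertainty from a Gaussian process.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

X_train = np.array([[0.0], [1.0], [3.0], [4.0]])
y_train = np.sin(X_train).ravel()

# Exact GP inference costs O(n^3) in the number of training points, which is
# the scaling concern raised above.
gp = GaussianProcessRegressor(kernel=RBF(length_scale=1.0), normalize_y=True)
gp.fit(X_train, y_train)

X_test = np.array([[0.5], [2.0], [5.0]])
mean, std = gp.predict(X_test, return_std=True)
for x, m, s in zip(X_test.ravel(), mean, std):
    # The predictive std grows away from the training data (e.g. at x = 5.0).
    print(f"x={x:.1f}  prediction={m:.3f} +/- {s:.3f}")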
“…The first line involves the development of new acquisition functions in the GP surrogate model. Certain designs of acquisition functions aim to solve specific problems encountered in discrete or categorical space (Deshwal et al, 2021;Oh et al, 2021;Willemsen et al, 2021). The advantage of this line of methods is that the BO framework is intact and the GP surrogate is preserved.…”
Section: Literature Review
confidence: 99%
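For illustration only (not the cited works' methods), a common baseline that such acquisition-function designs improve on is to one-hot encode categorical parameters so a standard GP surrogate can ingest them, and to maximize the acquisition function by enumerating the discrete candidates; the parameter names below are hypothetical:

# Baseline sketch: one-hot encode a categorical kernel parameter for a GP.
import numpy as np

layouts = ["row", "col", "tiled"]   # a hypothetical categorical tuning parameter
unrolls = [1, 2, 4]                 # a hypothetical integer tuning parameter

def encode(layout, unroll):
    """One-hot the categorical part, keep the numeric part as a float."""
    onehot = np.eye(len(layouts))[layouts.index(layout)]
    return np.concatenate([onehot, [float(unroll)]])

# Every discrete configuration becomes one GP-ready feature row; an acquisition
# function can then simply be maximized over these rows.
X = np.array([encode(l, u) for l in layouts for u in unrolls])
print(X.shape)  # (9, 4)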