Proceedings of the Sixteenth European Conference on Computer Systems 2021
DOI: 10.1145/3447786.3456245

RubberBand

Abstract: Hyperparameter tuning is essential to achieving state-of-the-art accuracy in machine learning (ML), but requires substantial compute resources to perform. Existing systems primarily focus on effectively allocating resources for a hyperparameter tuning job under fixed resource constraints. We show that the available parallelism in such jobs changes dynamically over the course of execution and, therefore, presents an opportunity to leverage the elasticity of the cloud. In particular, we address the problem of minimizi…
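The abstract's central observation, that available parallelism shrinks as a tuning job proceeds, is easiest to see in elimination-based tuners such as successive halving, where each round discards a fraction of the trials. The sketch below (plain Python with hypothetical names and a mocked training function; it is not RubberBand's implementation) prints how the number of usefully busy workers drops round by round, which is exactly the slack an elastic cloud scheduler could reclaim.

```python
import random

def train_and_eval(config):
    # Stand-in for a real training run; returns a mock validation score
    # that favours learning rates near 0.01. Purely illustrative.
    return random.random() - abs(config["lr"] - 0.01)

def successive_halving(num_trials=32, eta=2):
    # Sample one hypothetical hyperparameter configuration per trial.
    trials = [{"lr": 10 ** random.uniform(-4, -1)} for _ in range(num_trials)]
    while len(trials) > 1:
        # Usable parallelism this round equals the number of surviving
        # trials, so an elastic scheduler could release idle cloud workers.
        print(f"{len(trials)} live trials -> at most {len(trials)} busy workers")
        ranked = sorted(trials, key=train_and_eval, reverse=True)
        trials = ranked[: max(1, len(trials) // eta)]  # keep the top 1/eta
    return trials[0]

print("best config:", successive_halving())
```

With 32 trials and eta = 2, the useful worker count falls 32, 16, 8, 4, 2; under fixed provisioning most machines would sit idle in the later rounds.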

Cited by 15 publications (10 citation statements). References 20 publications.
“…By repeating and varying the configuration, it is expected that the model can be optimally adjusted so that it can provide accurate predictions on new data. Through the hyperparameter tuning process, the model can be optimised and can produce better performance than with the default configuration [18].…”
Section: Hyperparameter Tuning
confidence: 99%
“…Hyperparameter Tuning (i.e., Hyperparameter Optimization, HPO) aims to identify the optimal hyperparameters via massive configuration exploration [112,118]. In the general workflow of an HPO job: (1) the user designates a search space of hyperparameters to explore; (2) the tuning algorithm creates a set of training trials and each trial contains one unique hyperparameter configuration sampled from the search space;…”
Section: Hyperparameter Tuning
confidence: 99%
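To make the quoted two-step workflow concrete, here is a minimal sketch assuming a random-sampling tuner: the user-designated search space is a dictionary of samplers, and each trial receives one configuration drawn from it. The hyperparameter names and ranges are hypothetical, not taken from the cited paper.

```python
import random

# Hypothetical search space: each hyperparameter maps to a sampler
# (step 1 of the quoted workflow: the user designates the space).
search_space = {
    "learning_rate": lambda: 10 ** random.uniform(-5, -1),
    "batch_size":    lambda: random.choice([32, 64, 128, 256]),
    "dropout":       lambda: random.uniform(0.0, 0.5),
}

def sample_trial(space):
    # Step 2: draw one configuration from the space for a new trial.
    return {name: draw() for name, draw in space.items()}

trials = [sample_trial(search_space) for _ in range(8)]
for i, config in enumerate(trials):
    print(f"trial {i}: {config}")
```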
“…Existing research typically optimizes HPO efficiency from the tuning-algorithm [132-140] or system [7,32,112,118,119,127,141,142] aspects:…”
Section: Hyperparameter Tuning
confidence: 99%