Proceedings of the 4th ACM SIGSOFT International Workshop on Software Analytics 2018
DOI: 10.1145/3278142.3278145
Is one hyperparameter optimizer enough?

Abstract: Hyperparameter tuning is the black art of automatically finding a good combination of control parameters for a data miner. Although such tuning is widely applied in empirical software engineering, there has been little discussion of which hyperparameter tuner is best for software analytics. To address this gap in the literature, this paper applied a range of hyperparameter optimizers (grid search, random search, differential evolution, and Bayesian optimization) to a defect prediction problem. Surprisingly, no hyperparameter …
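The abstract compares several hyperparameter optimizers, including grid search and random search. The two simplest can be contrasted in a few lines. The sketch below is purely illustrative: the `score` function and the `depth`/`leaf` parameter names are hypothetical stand-ins for training and evaluating a data miner, not the paper's actual experiment.

```python
import itertools
import random

# Toy stand-in for "train a data miner at these settings and score it".
# The optimum is placed at (depth=6, leaf=20); higher scores are better.
def score(depth, leaf):
    return -((depth - 6) ** 2 + (leaf - 20) ** 2)

# Grid search: exhaustively evaluate every combination on a fixed grid.
grid = list(itertools.product(range(1, 11), range(5, 55, 5)))
best_grid = max(grid, key=lambda p: score(*p))

# Random search: spend the same evaluation budget on random samples.
random.seed(0)
samples = [(random.randint(1, 10), random.randint(5, 50))
           for _ in range(len(grid))]
best_rand = max(samples, key=lambda p: score(*p))

print("grid search best:", best_grid)
print("random search best:", best_rand)
```

With the same evaluation budget, grid search is guaranteed to hit any optimum that lies on its grid, while random search explores settings between grid points; which wins in practice depends on the response surface, which is part of why the paper asks whether any one optimizer is enough.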


Cited by 13 publications (5 citation statements)
References 41 publications
“…learning on static code attributes such as CK and McCabe metrics) [1], [3], [18], [20], [33], [36], [44], [57], [64], [65], [72], [74], [80], [96] that are more granular and high-dimensional.…”
Section: Future Work
confidence: 99%
“…The problem with hyperparameter optimization is finding enough CPU. The cost of running a data miner through all those options is very high, requiring days to weeks to decades of CPU [56], [57], [58], [59], [62], [64]. For many years, we have addressed these long CPU times via cloud-based CPU farms.…”
Section: Introduction
confidence: 99%
“…-Some initial results suggest it may not be enough to just use AutoML [97] (in summary, given N hyperparameter optimizers, AutoML was "best" for a minority of goals and datasets). -In terms of training ourselves on how easy it is to combine optimizers and data miners, there is no substitute for "rolling your own" implementation (at least once, then perhaps moving on to tools like AutoML).…”
Section: Research Direction
confidence: 99%
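The last citation statement recommends "rolling your own" optimizer at least once. Differential evolution, one of the optimizers the paper tested, is small enough to write by hand. The sketch below is a minimal, hypothetical illustration: the `loss` function stands in for evaluating a learner at two continuous hyperparameter settings and is not the paper's experiment.

```python
import random

# Toy objective standing in for "evaluate a learner at settings (x, y)".
# The optimum is at (0.5, 0.5); lower loss is better.
def loss(x, y):
    return (x - 0.5) ** 2 + (y - 0.5) ** 2

def differential_evolution(loss, dim=2, pop_size=20, f=0.8, cr=0.9,
                           generations=50, seed=1):
    rng = random.Random(seed)
    # Start from a random population of candidate settings in [0, 1).
    pop = [[rng.random() for _ in range(dim)] for _ in range(pop_size)]
    for _ in range(generations):
        for i, target in enumerate(pop):
            # Pick three distinct members other than the target.
            a, b, c = rng.sample([p for j, p in enumerate(pop) if j != i], 3)
            # Mutation + binomial crossover builds a trial candidate;
            # j_rand guarantees at least one mutated coordinate.
            j_rand = rng.randrange(dim)
            trial = [a[k] + f * (b[k] - c[k])
                     if (rng.random() < cr or k == j_rand) else target[k]
                     for k in range(dim)]
            # Greedy selection: keep whichever candidate scores better.
            if loss(*trial) < loss(*target):
                pop[i] = trial
    return min(pop, key=lambda p: loss(*p))

best = differential_evolution(loss)
print("best settings:", best, "loss:", loss(*best))
```

The whole optimizer fits in about twenty lines, which supports the statement's point: implementing one optimizer yourself is cheap, and makes it much easier to judge what tools like AutoML are doing on your behalf.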