2020
DOI: 10.48550/arxiv.2007.15745
Preprint

On Hyperparameter Optimization of Machine Learning Algorithms: Theory and Practice

Li Yang,
Abdallah Shami

Abstract: Machine learning algorithms have been used widely in various applications and areas. To fit a machine learning model to different problems, its hyperparameters must be tuned. Selecting the best hyperparameter configuration for machine learning models has a direct impact on the model's performance. It often requires deep knowledge of machine learning algorithms and appropriate hyperparameter optimization techniques. Although several automatic optimization techniques exist, they have different strengths and …
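The abstract describes selecting a hyperparameter configuration to maximize model performance. A minimal sketch of the simplest strategy the paper surveys, exhaustive grid search, is shown below; the `validation_loss` function is a toy surrogate standing in for real model training, and the hyperparameter names are illustrative assumptions, not taken from the paper.

```python
# Hypothetical grid-search sketch. The "validation loss" is a toy
# surrogate (quadratic bowl with its optimum at lr=0.01, layers=3),
# not an actual trained model.
from itertools import product

def validation_loss(learning_rate, num_layers):
    # Toy surrogate: lower is better, minimized at (0.01, 3).
    return (learning_rate - 0.01) ** 2 + (num_layers - 3) ** 2

grid = {
    "learning_rate": [0.001, 0.01, 0.1],
    "num_layers": [1, 2, 3, 4],
}

best_config, best_loss = None, float("inf")
for lr, layers in product(grid["learning_rate"], grid["num_layers"]):
    loss = validation_loss(lr, layers)
    if loss < best_loss:
        best_config, best_loss = {"learning_rate": lr, "num_layers": layers}, loss

print(best_config)  # {'learning_rate': 0.01, 'num_layers': 3}
```

Grid search evaluates every combination, so its cost grows exponentially with the number of hyperparameters; the paper's point is that smarter techniques (random search, Bayesian optimization, etc.) trade exhaustiveness for efficiency.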



Cited by 5 publications (6 citation statements)
References 85 publications
“…It is important to note that in the larger body of research that employs neural networks as the primary algorithm, there is a lack of standardization regarding the design. Furthermore, the vast majority of research focused on neural network design is more focused on hyperparameter optimization via algorithms rather than establishing empirical standards that researchers can utilize as proven baselines to effectively compare and further develop different neural networks (Yang & Shami, 2020). In this section, we will go over how each neural network design decision was made.…”
Section: Neural Network Design
confidence: 99%
“…Developers often optimize ANN hyperparameters by experimenting with a range of heuristic values. Hyperparameter optimization algorithms [1335]–[1340] can automate optimizer hyperparameter selection. However, automatic hyperparameter optimizers may not yield sufficient performance improvements relative to well-established heuristics to justify their use, especially in initial stages of development.…”
Section: /98
confidence: 99%
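The statement above contrasts automated hyperparameter optimizers with fixed heuristic defaults. A minimal sketch of that comparison using random search is given below; the loss surrogate, the heuristic default of 0.01, and the log-uniform sampling range are all illustrative assumptions, not values from the cited work.

```python
# Hypothetical sketch: automated random search vs. a heuristic default.
# The loss is a toy surrogate with its optimum at lr = 0.05.
import random

def validation_loss(lr):
    return (lr - 0.05) ** 2  # toy surrogate, not a trained model

heuristic_lr = 0.01  # a "well-established heuristic" default (assumed)

random.seed(0)  # fixed seed so the sketch is reproducible
# Sample 50 learning rates log-uniformly in [1e-4, 1].
candidates = [10 ** random.uniform(-4, 0) for _ in range(50)]
best_lr = min(candidates, key=validation_loss)

print(f"heuristic loss: {validation_loss(heuristic_lr):.6f}")
print(f"random-search loss: {validation_loss(best_lr):.6f}")
```

Whether the 50 extra evaluations are worth the (often modest) improvement over the heuristic is exactly the trade-off the citing authors raise for early-stage development.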
“…The secondary goal is to determine which factors significantly influence this. Model hyperparameters [7] are investigated first, as the relationship between these and training time is often obvious. We aim not only to identify them but to model their influence on training time.…”
Section: Introduction
confidence: 99%