2023
DOI: 10.1038/s41598-023-32027-3
An improved hyperparameter optimization framework for AutoML systems using evolutionary algorithms

Abstract: For any machine learning model, finding the optimal hyperparameter setting has a direct and significant impact on the model's performance. In this paper, we discuss different types of hyperparameter optimization techniques. We compare the performance of some of the hyperparameter optimization techniques on image classification datasets with the help of AutoML models. In particular, the paper studies Bayesian optimization in depth and proposes the use of genetic algorithm, differential evolution and covariance …

Cited by 24 publications (3 citation statements)
References 61 publications
“…As one of the most widely used evolutionary algorithms, GA applies selection, crossover, and mutation operations to search for the global optimum [40]. It usually relies on random initialization to generate the initial population of candidate hyperparameter configurations [41], requiring little prior knowledge about the initial hyperparameter setting. After successive generations of evolution, the individual with the best fitness is taken as the globally optimal solution.…”
Section: Hyperparameter Tuning Of Deep Models
Confidence: 99%
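The GA loop described in the statement above (random initialization, then repeated selection, crossover, and mutation) can be sketched as follows. The search space and the fitness function are illustrative assumptions standing in for validation accuracy, not details from the cited papers:

```python
# Minimal GA sketch for hyperparameter search: random initialization,
# tournament selection, uniform crossover, and Gaussian mutation.
# SPACE and fitness() are hypothetical stand-ins for a real model's
# hyperparameters and validation score.
import random

SPACE = {"lr": (1e-4, 1e-1), "dropout": (0.0, 0.5)}  # assumed search space

def fitness(ind):
    # Stand-in for validation accuracy; peaks at lr=0.01, dropout=0.2.
    return -((ind["lr"] - 0.01) ** 2 * 1e4 + (ind["dropout"] - 0.2) ** 2)

def random_individual(rng):
    # Random initialization: no prior knowledge of good configurations.
    return {k: rng.uniform(lo, hi) for k, (lo, hi) in SPACE.items()}

def crossover(a, b, rng):
    # Uniform crossover: each gene comes from either parent.
    return {k: a[k] if rng.random() < 0.5 else b[k] for k in SPACE}

def mutate(ind, rng, rate=0.2):
    # Gaussian perturbation, clipped back into the valid range.
    out = dict(ind)
    for k, (lo, hi) in SPACE.items():
        if rng.random() < rate:
            out[k] = min(hi, max(lo, out[k] + rng.gauss(0, (hi - lo) * 0.1)))
    return out

def ga_search(pop_size=20, generations=30, seed=0):
    rng = random.Random(seed)
    pop = [random_individual(rng) for _ in range(pop_size)]
    for _ in range(generations):
        def select():
            # Tournament selection: keep the fitter of two random parents.
            a, b = rng.sample(pop, 2)
            return a if fitness(a) > fitness(b) else b
        pop = [mutate(crossover(select(), select(), rng), rng)
               for _ in range(pop_size)]
    # The individual with the best fitness is the returned solution.
    return max(pop, key=fitness)

best = ga_search()
```

In practice the fitness call would train and validate the model, which is why GA-based tuning is expensive per generation but needs little prior knowledge.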
“…The gait data were separated into training and testing sets by subject for the traditional machine learning models, in the same manner as described above for the LSTM approach. The machine learning models were tuned using a Bayesian algorithm with 5-fold cross-validation [53]-[55]. For missing-value imputation, we grouped the gait data by subject and assigned the mean of the gait feature to the missing value [56].…”
Section: Standard Machine Learning Models Using the Extracted Gait Ch...
Confidence: 99%
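The per-subject mean imputation described in this statement can be sketched as below. The record layout and feature name (`stride_len`) are assumptions for illustration, not taken from the cited study:

```python
# Minimal sketch of subject-grouped mean imputation: missing gait-feature
# values (None) are replaced by the mean of that feature computed over the
# same subject's observed values. The toy records are hypothetical.
from collections import defaultdict

def impute_by_subject(records, feature):
    # Accumulate per-subject sums and counts over observed values only.
    sums, counts = defaultdict(float), defaultdict(int)
    for r in records:
        if r[feature] is not None:
            sums[r["subject"]] += r[feature]
            counts[r["subject"]] += 1
    means = {s: sums[s] / counts[s] for s in sums}
    # Fill each missing value with its subject's mean.
    return [dict(r, **{feature: r[feature] if r[feature] is not None
                       else means[r["subject"]]}) for r in records]

gait = [
    {"subject": "S1", "stride_len": 1.2},
    {"subject": "S1", "stride_len": None},
    {"subject": "S1", "stride_len": 1.4},
    {"subject": "S2", "stride_len": 0.9},
    {"subject": "S2", "stride_len": None},
]
filled = impute_by_subject(gait, "stride_len")
# S1's missing value becomes (1.2 + 1.4) / 2; S2's becomes 0.9.
```

Grouping by subject before imputing keeps each subject's fill values within that subject, which also matters for the subject-wise train/test split the authors describe: no statistics leak across subjects.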
“…Achieving optimal model performance necessitates hyperparameter tuning, which typically involves multiple rounds of model training and validation, further compounding the computational cost. However, forward propagation (i.e., prediction) is typically fast once these models are adequately trained [58,59]. Vegetation ecological process models, on the other hand, are rooted in explicit physiological and environmental processes, typically involving a set of differential equations.…”
Section: Integrating Machine Learning and Ecological Process Models F...
Confidence: 99%