2021
DOI: 10.3390/electronics10161973

Greed Is Good: Rapid Hyperparameter Optimization and Model Selection Using Greedy k-Fold Cross Validation

Abstract: Selecting a final machine learning (ML) model typically occurs after a process of hyperparameter optimization in which many candidate models with varying structural properties and algorithmic settings are evaluated and compared. Evaluating each candidate model commonly relies on k-fold cross validation, wherein the data are randomly subdivided into k folds, with each fold being iteratively used as a validation set for a model that has been trained using the remaining folds. While many research studies have sou…
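
As a minimal sketch of the k-fold procedure the abstract describes: the data are shuffled, split into k folds, and each fold in turn serves as the validation set for a model trained on the remaining folds. The classifier and synthetic data below are placeholders, not anything taken from the paper.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import KFold

# Placeholder data; in practice X and y come from the task at hand.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
y = (X[:, 0] + rng.normal(scale=0.5, size=200) > 0).astype(int)

k = 5
scores = []
# Randomly subdivide the data into k folds.
for train_idx, val_idx in KFold(n_splits=k, shuffle=True, random_state=0).split(X):
    # Train on the remaining k - 1 folds...
    model = LogisticRegression(max_iter=1000).fit(X[train_idx], y[train_idx])
    # ...and validate on the held-out fold.
    scores.append(accuracy_score(y[val_idx], model.predict(X[val_idx])))

print(f"mean {k}-fold accuracy: {np.mean(scores):.3f}")
```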


Cited by 33 publications (18 citation statements)
References 26 publications (35 reference statements)
“…Stochastic gradient descent (SGD) has been used to train the parameters of all the proposed models, and a greedy k-fold cross-validation approach [31] has been used to estimate the hyper-parameters of the optimiser (see Table I).…”
Section: Results
Mentioning confidence: 99%
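
The excerpt does not reproduce Soper's algorithm, so the sketch below assumes one plausible reading of "greedy k-fold cross validation": give every candidate one fold evaluation, then greedily spend a fixed budget of further fold evaluations on whichever candidate currently has the best mean score, rather than running all k folds for every candidate. The SGD hyper-parameter grid, data, and budget are all hypothetical.

```python
import numpy as np
from sklearn.linear_model import SGDClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import KFold

# Hypothetical data and candidate grid; the excerpt specifies neither.
rng = np.random.default_rng(1)
X = rng.normal(size=(300, 8))
y = (X[:, 0] - X[:, 1] > 0).astype(int)

candidates = [{"alpha": a, "eta0": e, "learning_rate": "constant"}
              for a in (1e-4, 1e-3) for e in (0.01, 0.1)]

k = 10
folds = list(KFold(n_splits=k, shuffle=True, random_state=1).split(X))
scores = {i: [] for i in range(len(candidates))}

def run_fold(i):
    """Train candidate i on k - 1 folds and score it on its next unused fold."""
    train_idx, val_idx = folds[len(scores[i])]
    model = SGDClassifier(**candidates[i], random_state=0)
    model.fit(X[train_idx], y[train_idx])
    scores[i].append(accuracy_score(y[val_idx], model.predict(X[val_idx])))

# Seed every candidate with one fold, then greedily spend the remaining
# budget on whichever candidate currently looks best -- far fewer fits
# than the k * len(candidates) an exhaustive evaluation would need.
for i in scores:
    run_fold(i)
budget = 2 * len(candidates)
for _ in range(budget):
    best = max((i for i in scores if len(scores[i]) < k),
               key=lambda i: np.mean(scores[i]))
    run_fold(best)

winner = max(scores, key=lambda i: np.mean(scores[i]))
print("selected hyper-parameters:", candidates[winner])
```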
“…During each iteration, k − 1 folds are used to train the candidate model, and the performance of the model is measured with the remaining fold. This process is repeated until each fold has been used as a validation set, so a total of k retraining and validation processes are performed for each candidate model [43][44][45][46]. Figure 3 shows the k = 10 fold cross-validation process used in the study.…”
Section: K-Fold Cross Validation
Mentioning confidence: 99%
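
For contrast with the greedy variant, here is a short sketch of the exhaustive k = 10 procedure this statement describes, which costs k retrainings per candidate and therefore k times the number of candidates in total fits; the models and data are stand-ins.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=500, random_state=0)  # placeholder data

# Stand-in candidate models; any hyper-parameter grid works the same way.
candidates = [DecisionTreeClassifier(max_depth=d, random_state=0) for d in (2, 4, 8)]

k = 10
for model in candidates:
    # Each candidate is retrained and validated k times, once per fold.
    scores = cross_val_score(model, X, y, cv=k)
    print(f"max_depth={model.max_depth}: mean accuracy {scores.mean():.3f}")
```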
“…Therefore, it is critical to obtain the hyper-parameter combination with the best generalization performance. In this study, we use the K-Fold cross-validation method (Soper, 2021) to evaluate the model performance. However, the cross-validation results of ML models are sensitive to the choice of hyper-parameters.…”
Section: Hyper-Parameters Optimization for Regression Models
Mentioning confidence: 99%
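
As an illustration of the approach this statement describes, the sketch below scores every hyper-parameter combination with k-fold cross validation and keeps the best one. The regressor and grid are hypothetical; the citing paper's models and ranges are not given in this excerpt.

```python
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import GridSearchCV

X, y = make_regression(n_samples=400, n_features=10, noise=10.0, random_state=0)

# Hypothetical grid; substitute the combinations under consideration.
param_grid = {"n_estimators": [100, 300], "max_depth": [None, 5, 10]}

# Every hyper-parameter combination is evaluated by k-fold cross validation,
# and the combination with the best mean validation score is selected.
search = GridSearchCV(RandomForestRegressor(random_state=0),
                      param_grid, cv=5, scoring="r2")
search.fit(X, y)
print("best combination:", search.best_params_,
      f"(CV R^2 = {search.best_score_:.3f})")
```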