2023
DOI: 10.1016/j.iswa.2023.200248

Babysitting hyperparameter optimization and 10-fold-cross-validation to enhance the performance of ML methods in predicting wind speed and energy generation

Cited by 11 publications (6 citation statements)
References 34 publications
“…It gave each data point the opportunity to serve as test data. K in this context refers to the number of folds into which the data are divided between training and test sets; this study uses K = 10 [29], [30].…”
Section: Discussion
confidence: 99%
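The K-fold procedure the citing study describes can be sketched in a few lines: the data are split into K folds, and each fold serves as the test set exactly once while the remaining K-1 folds form the training set. This is a minimal illustration with K = 10; the function and variable names are illustrative, not from the cited paper.

```python
def kfold_indices(n_samples, k=10):
    """Yield (train_idx, test_idx) pairs for K-fold cross-validation.

    Every sample appears in exactly one test fold, so after all k
    iterations the whole dataset has been used both for training and
    for testing, as described in the excerpt above.
    """
    indices = list(range(n_samples))
    # Distribute any remainder over the first folds so sizes differ by at most 1.
    fold_sizes = [n_samples // k + (1 if i < n_samples % k else 0) for i in range(k)]
    start = 0
    for size in fold_sizes:
        test_idx = indices[start:start + size]
        train_idx = indices[:start] + indices[start + size:]
        yield train_idx, test_idx
        start += size

# Sanity check: across the 10 folds, each sample is tested exactly once.
n = 25
seen = []
for train_idx, test_idx in kfold_indices(n, k=10):
    assert not set(train_idx) & set(test_idx)  # train/test are disjoint
    seen.extend(test_idx)
assert sorted(seen) == list(range(n))
```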
“…In other words, all of the data is used to train the ML models after 10 iterations, lowering the bias. What's more, each iteration's model weights for the convolutional layers are continuously updated, which also improves the effectiveness of training (Malakouti, 2023).…”
Section: Semantic Segmentation
confidence: 99%
“…The hyperparameters were selected, tuned, and combined with the prediction accuracy and generalization of the above machine learning models in mind. To enhance the predictive ability of the machine learning algorithms, a 10-fold cross-validation method was used to evaluate the models [45]. A flow chart of the model training and evaluation process is shown in Figure 3, and the characteristics and specific implementation settings of each algorithm are introduced as follows.…”
Section: Modeling
confidence: 99%
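The workflow in this excerpt, comparing candidate hyperparameter values by their average error over 10 cross-validation folds and keeping the best one, can be sketched as follows. The model (a one-parameter ridge fit in closed form) and the alpha grid are illustrative assumptions, not the actual models or settings used in the cited study.

```python
import random

def ridge_1d_fit(xs, ys, alpha):
    """Closed-form 1-D ridge regression: minimize sum((y - w*x)^2) + alpha*w^2."""
    return sum(x * y for x, y in zip(xs, ys)) / (sum(x * x for x in xs) + alpha)

def cv_score(xs, ys, alpha, k=10):
    """Mean squared test error of the model averaged over k CV folds."""
    n = len(xs)
    fold = n // k
    total = 0.0
    for i in range(k):
        test = range(i * fold, (i + 1) * fold)     # held-out fold
        train = [j for j in range(n) if j not in test]
        w = ridge_1d_fit([xs[j] for j in train], [ys[j] for j in train], alpha)
        total += sum((ys[j] - w * xs[j]) ** 2 for j in test) / fold
    return total / k

# Synthetic data: y ~ 2x plus small noise (illustrative, not the paper's data).
random.seed(0)
xs = [random.uniform(-1, 1) for _ in range(100)]
ys = [2.0 * x + random.gauss(0, 0.1) for x in xs]

# Grid search: keep the alpha with the lowest 10-fold CV error.
best_alpha = min([0.01, 0.1, 1.0, 10.0], key=lambda a: cv_score(xs, ys, a))
```

Because each candidate is scored on data it was not fit to, the selected hyperparameter generalizes better than one chosen by training error alone, which is the rationale the excerpt gives for combining tuning with 10-fold cross-validation.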