2023
DOI: 10.1016/j.clet.2023.100664
The usage of 10-fold cross-validation and grid search to enhance ML methods performance in solar farm power generation prediction

Cited by 32 publications (7 citation statements)
References 27 publications
“…In this study, Bayesian optimization was chosen due to its effectiveness in handling non-linear and complex search spaces. Unlike traditional grid search or cross-validation methods [72], Bayesian optimization uses probabilistic models to predict the performance of different hyperparameter configurations, guiding the search toward promising regions [73]. This is particularly beneficial in high-dimensional spaces, where an exhaustive search becomes computationally expensive.…”
Section: Hyperparameter Optimization (mentioning; confidence: 99%)
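The contrast drawn in this citation statement can be illustrated with a toy sketch. The loss surface and the hyperparameter names below are hypothetical; the "guided" loop is a crude stand-in for Bayesian optimization (a real implementation fits a probabilistic surrogate, e.g. a Gaussian process, rather than perturbing the incumbent), but it shows why a fixed exploration budget can beat an exhaustive grid whose cost grows exponentially with the number of hyperparameters.

```python
import random
from itertools import product

def val_loss(lr, depth):
    # Hypothetical validation-loss surface standing in for model training
    return (lr - 0.1) ** 2 + (depth - 5) ** 2 / 100

# Exhaustive grid search: cost is the product of the grid sizes,
# which grows exponentially with the number of hyperparameters.
lrs = [0.001, 0.01, 0.1, 0.5]
depths = [2, 5, 8, 11]
grid = list(product(lrs, depths))          # 4 x 4 = 16 evaluations
grid_best = min(grid, key=lambda p: val_loss(*p))

# Crude stand-in for Bayesian optimization: propose candidates near the
# best point so far, "guiding the search toward promising regions"
# under a fixed, smaller evaluation budget.
random.seed(0)
cur, cur_val = (0.5, 11), val_loss(0.5, 11)
for _ in range(8):                         # 8 evaluations instead of 16
    cand = (abs(cur[0] + random.gauss(0, 0.05)),
            cur[1] + random.gauss(0, 1.5))
    if val_loss(*cand) < cur_val:
        cur, cur_val = cand, val_loss(*cand)
```

The guided loop never evaluates most of the grid, which is exactly the saving the statement attributes to Bayesian optimization in high-dimensional spaces.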
“…Hyperparameter tuning is conducted using randomized search cross-validation (RandomizedSearchCV) [38]. We sample 50 parameter settings (n_iter = 50) to ensure a balanced exploration of hyperparameter configurations.…”
Section: Model Training and Evaluation (mentioning; confidence: 99%)
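The randomized-search protocol this statement describes (sample a fixed number of configurations, score each by cross-validation, keep the best) can be sketched in plain Python. The data, the closed-form 1-D ridge model, and the log-uniform range for `alpha` are all toy assumptions chosen for illustration; in practice scikit-learn's `RandomizedSearchCV` wraps the same idea, and the paper's 10-fold protocol is mirrored in `cv_mse`.

```python
import random

random.seed(0)

# Toy data: y = 2x + noise
xs = [i / 10 for i in range(100)]
ys = [2 * x + random.gauss(0, 0.1) for x in xs]

def fit_ridge(x, y, alpha):
    # Closed-form 1-D ridge fit (no intercept): w = sum(x*y) / (sum(x^2) + alpha)
    return sum(a * b for a, b in zip(x, y)) / (sum(a * a for a in x) + alpha)

def cv_mse(alpha, k=10):
    # k-fold cross-validation score, mirroring the 10-fold protocol
    n = len(xs)
    fold = n // k
    errs = []
    for i in range(k):
        lo, hi = i * fold, (i + 1) * fold
        x_tr, y_tr = xs[:lo] + xs[hi:], ys[:lo] + ys[hi:]
        x_te, y_te = xs[lo:hi], ys[lo:hi]
        w = fit_ridge(x_tr, y_tr, alpha)
        errs.append(sum((w * a - b) ** 2 for a, b in zip(x_te, y_te)) / len(x_te))
    return sum(errs) / k

# Randomized search: sample 50 configurations (n_iter = 50) log-uniformly
# and keep the one with the lowest cross-validated error.
samples = [10 ** random.uniform(-4, 2) for _ in range(50)]
best_alpha = min(samples, key=cv_mse)
```

Sampling 50 settings balances exploration against cost: unlike a grid, the budget stays fixed no matter how many hyperparameters are added.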
“…Techniques and tools like grid search, random search, Bayesian optimization, evolutionary algorithms, and others can automatically explore the hyperparameter space to find optimal configurations. Tools that visualize changes (such as Matplotlib, Seaborn, Plotly, Weights & Biases, TensorBoard, Mlflow, Scikit-learn) [39], in the learning process and the impact of hyperparameters can help understand how specific settings affect the model [40].…”
Section: Hyperparameter Tuning (mentioning; confidence: 99%)
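The visualization workflow this statement alludes to usually starts from a table of trial records. A minimal sketch, using a hypothetical loss surface: collect one record per configuration, then the same records can be scattered with Matplotlib (e.g. `lr` vs. `loss`) or logged to MLflow / Weights & Biases to see how each setting affects the model.

```python
from itertools import product

def val_loss(lr, reg):
    # Hypothetical validation-loss surface for illustration
    return (lr - 0.1) ** 2 + (reg - 0.01) ** 2

# One record per trial, in a shape ready for plotting or experiment logging
trials = []
for lr, reg in product([0.01, 0.1, 1.0], [0.001, 0.01, 0.1]):
    trials.append({"lr": lr, "reg": reg, "loss": val_loss(lr, reg)})

# Sorting by loss exposes which settings matter most before any plot is drawn
trials.sort(key=lambda t: t["loss"])
best = trials[0]
```

Keeping trials as plain records decouples the search tool from the visualization tool, which is why the same data can feed any of the libraries listed above.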