2022
DOI: 10.46604/aiti.2022.9227

Effects of Data Standardization on Hyperparameter Optimization with the Grid Search Algorithm Based on Deep Learning: A Case Study of Electric Load Forecasting

Abstract: This study investigates data standardization methods based on the grid search (GS) algorithm for energy load forecasting, including zero-mean, min-max, max, decimal, sigmoid, softmax, median, and robust, to determine the hyperparameters of deep learning (DL) models. The considered DL models are the convolutional neural network (CNN) and long short-term memory network (LSTMN). The procedure comprises (i) setting the configuration for CNN and LSTMN, (ii) establishing the hyperparameter values of CNN and LSTMN…
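The abstract pairs data standardization with exhaustive grid search over model hyperparameters. A minimal sketch of that workflow is shown below; the scaler formulas follow common definitions of the named methods (the paper's exact variants may differ), and the hyperparameter grid (`lr`, `units`) and the scoring function are illustrative assumptions, not the authors' configuration.

```python
import itertools
import statistics

# A few of the standardization methods named in the abstract,
# implemented as plain functions (common textbook formulas; the
# paper's exact handling of zero ranges may differ).
def zero_mean(xs):
    """Z-score standardization: subtract mean, divide by std dev."""
    mu = statistics.mean(xs)
    sd = statistics.pstdev(xs) or 1.0  # guard against a constant series
    return [(x - mu) / sd for x in xs]

def min_max(xs):
    """Rescale to the [0, 1] interval."""
    lo, hi = min(xs), max(xs)
    rng = (hi - lo) or 1.0
    return [(x - lo) / rng for x in xs]

def max_scale(xs):
    """Divide by the largest absolute value."""
    m = max(abs(x) for x in xs) or 1.0
    return [x / m for x in xs]

SCALERS = {"zero-mean": zero_mean, "min-max": min_max, "max": max_scale}

# Hypothetical hyperparameter grid for a forecasting model.
GRID = {"lr": [1e-3, 1e-2], "units": [32, 64]}

def grid_search(evaluate, series):
    """Exhaustively score every (scaler, hyperparameter) combination
    and return the best (score, scaler_name, params) triple."""
    best = None
    for name, scaler in SCALERS.items():
        scaled = scaler(series)
        for combo in itertools.product(*GRID.values()):
            params = dict(zip(GRID.keys(), combo))
            score = evaluate(scaled, params)  # lower is better, e.g. MAPE
            if best is None or score < best[0]:
                best = (score, name, params)
    return best
```

In practice `evaluate` would train the CNN or LSTMN on the scaled series and return a validation error; here any callable taking `(scaled_series, params)` works, which makes the search loop easy to test in isolation.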

Cited by 4 publications (2 citation statements). References 23 publications (21 reference statements).
“…For instance, the popular swarm-based particle swarm optimization (PSO) is weakened in the local explanation and easily falls into the local optimum with a low convergence rate, resulting in a low precision or even failure [59][60][61]. The grid search method is one of the most common and also a direct way of efficiently tuning the model hyperparameters and has been successfully developed for tuning the hyperparameters of the deep learning models [62][63][64]. Therefore, the grid search approach is developed to tune the optimal hyperparameters of the models in this work.…”
Section: Hyperparameters Tuning Based on the Grid Search Approach
confidence: 99%
“…More and more scholars have found that meta-heuristic algorithms, due to their simplicity, efficiency, and wide applicability, show significant advantages in solving combinatorial optimization problems such as the MTSP [17][18] and the JSP [19][20], which provide the feasibility and effectiveness of the efficient solution of such complex problems. In terms of algorithm design, scholars have proposed hybrid meta-heuristic algorithms combining techniques such as the dragonfly algorithm (DA), firefly algorithm (FA), and genetic algorithm (GA) [21][22]. Compared with the traditional meta-heuristic algorithms, the hybrid meta-heuristic algorithms have strong adaptive and global search ability and can find more appropriate solutions in the search space of complex problems.…”
Section: Introduction
confidence: 99%