2015
DOI: 10.1016/j.asoc.2015.06.012
GA-PARSIMONY: A GA-SVR approach with feature selection and parameter optimization to obtain parsimonious solutions for predicting temperature settings in a continuous annealing furnace

Cited by 47 publications (6 citation statements)
References 34 publications
“…In order to select the best structure of the forecasting model, an optimization methodology was used based on the genetic algorithm (GA) with advanced generalization capabilities. This methodology is GA-PARSIMONY [71], which allows the selection of parsimonious models. The main difference of this methodology with respect to conventional GAs is a rearrangement of the ranking of the individuals based on their complexities, so that individuals with lower complexity (in this case, models with a less complex structure) are promoted to the best positions of each generation.…”
Section: Prediction Results For the Photovoltaic Generationmentioning
confidence: 99%
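The complexity-based re-ranking quoted above can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes each individual exposes a fitness (lower is better) and a complexity score, and uses a tolerance `tol` to decide when two fitnesses are close enough that the less complex individual should rank first. All names and the tolerance scheme are illustrative.

```python
def parsimony_rerank(population, tol=0.01):
    """Rank individuals by fitness (lower is better), then bubble
    less complex individuals ahead of near-equal-fitness neighbors."""
    ranked = sorted(population, key=lambda ind: ind["fitness"])
    swapped = True
    while swapped:
        swapped = False
        for i in range(len(ranked) - 1):
            a, b = ranked[i], ranked[i + 1]
            # If fitnesses are within tolerance, promote the simpler model.
            if (abs(a["fitness"] - b["fitness"]) <= tol
                    and b["complexity"] < a["complexity"]):
                ranked[i], ranked[i + 1] = b, a
                swapped = True
    return ranked


# Toy population: m2 is nearly as accurate as m1 but much simpler,
# so the re-ranking promotes it to the top.
pop = [
    {"name": "m1", "fitness": 0.100, "complexity": 12},
    {"name": "m2", "fitness": 0.105, "complexity": 5},
    {"name": "m3", "fitness": 0.300, "complexity": 3},
]
best = parsimony_rerank(pop)[0]
```

Here `m2` ends up first: its fitness is within `tol` of `m1`'s, and it is less complex; `m3` stays last because its fitness gap exceeds the tolerance.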
“…In recent years, many methods have been proposed for optimizing the model parameters of the SVR algorithm (Sanz‑Garcia et al, 2015). Typically, gradient descent or grid-search algorithms have been used, but these traditional optimization methods train too slowly and easily converge to a local optimum (Moshkbar‑Bakhshayesh, 2019).…”
Section: Discussionmentioning
confidence: 99%
“…First, a GP (with UCB as an acquisition function) is used to obtain the best HPO setting (according to the RMSE), considering the full set of features. Next, a variant of GA (GA-PARSIMONY, Sanz-García et al 2015) is used to select the best features of the problem, given the hyperparameters obtained in the first step. In this way, the final model has high accuracy and lower complexity (i.e., fewer features), and optimization time is significantly reduced.…”
Section: Hybrid Hpo Algorithmsmentioning
confidence: 99%
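The two-stage hybrid described in the last statement (hyperparameter optimization on the full feature set, then GA-based feature selection with the hyperparameters fixed) can be sketched with toy stand-ins. Everything here is illustrative: a random search stands in for the GP-UCB optimizer, the RMSE is a synthetic objective, and all names are assumptions, not the paper's code.

```python
import random

random.seed(0)
N_FEATURES = 6

def rmse(hp, mask):
    """Toy objective: penalize distance of hp from an 'ideal' value,
    missing relevant features (indices < 3), and kept noise features."""
    relevant = sum(mask[:3])
    noise = sum(mask[3:])
    return abs(hp - 0.5) + (3 - relevant) * 0.5 + noise * 0.1

# Stage 1: tune the hyperparameter with all features active
# (random search standing in for GP with a UCB acquisition function).
full_mask = [1] * N_FEATURES
best_hp = min((random.random() for _ in range(50)),
              key=lambda hp: rmse(hp, full_mask))

# Stage 2: tiny GA over binary feature masks, hyperparameter fixed.
pop = [[random.randint(0, 1) for _ in range(N_FEATURES)]
       for _ in range(20)]
for _ in range(30):
    pop.sort(key=lambda m: rmse(best_hp, m))
    elite = pop[:10]                      # elitist selection
    children = []
    for _ in range(10):
        a, b = random.sample(elite, 2)
        cut = random.randrange(1, N_FEATURES)
        child = a[:cut] + b[cut:]         # one-point crossover
        child[random.randrange(N_FEATURES)] ^= 1  # bit-flip mutation
        children.append(child)
    pop = elite + children
best_mask = min(pop, key=lambda m: rmse(best_hp, m))
```

Splitting the search this way is the point made in the quoted text: the expensive hyperparameter optimization runs once on the full feature set, and the GA then only explores the much cheaper feature-mask space, yielding a model with fewer features at comparable accuracy.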