2012 IEEE 14th International Conference on Communication Technology
DOI: 10.1109/icct.2012.6511415

An improved grid search algorithm of SVR parameters optimization

Abstract: Proper selection of the parameters of a Support Vector Regression (SVR) model, namely the kernel parameter g, the penalty factor c, and the insensitive coefficient p, can optimize SVR's performance. The most commonly used approach is grid search. However, when the data set is large, the search takes a very long time. Thus, we propose an improved grid algorithm that reduces the search time by reducing the number of cross-validation tests. Firstly, the penalty factor c can be calculated by an empirical formula. Then the best ker…
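
As a rough illustration of the strategy sketched in the abstract, the snippet below fixes the penalty factor C from an empirical formula computed on the training targets and then grid-searches only the remaining two parameters with cross-validation, so far fewer cross-validation runs are needed than for a full three-dimensional grid. The specific formula (the Cherkassky-Ma rule C = max(|mean(y) + 3*std(y)|, |mean(y) - 3*std(y)|)), the scikit-learn implementation, and the grid ranges are assumptions for illustration; the paper's exact formula is not shown in the truncated abstract.

```python
# Hedged sketch: shrink the SVR grid search by fixing C empirically and
# cross-validating only the kernel width (gamma, i.e. g) and epsilon (p).
import numpy as np
from sklearn.svm import SVR
from sklearn.model_selection import GridSearchCV
from sklearn.datasets import make_regression

X, y = make_regression(n_samples=200, n_features=5, noise=0.1, random_state=0)

# Empirical penalty factor C from the training targets (assumed formula).
y_mean, y_std = y.mean(), y.std()
C_emp = max(abs(y_mean + 3 * y_std), abs(y_mean - 3 * y_std))

# Grid over the remaining two parameters only, instead of a full 3-D grid.
param_grid = {
    "gamma": np.logspace(-4, 1, 6),
    "epsilon": np.logspace(-3, 0, 4),
}
search = GridSearchCV(SVR(kernel="rbf", C=C_emp), param_grid, cv=5)
search.fit(X, y)
print(search.best_params_, search.best_score_)
```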

Cited by 40 publications (17 citation statements)
References 6 publications

“…In our approach, a grid search with fivefold cross-validation is employed for tuning parameters, iterating over many possible parameter combinations to maximize the classification accuracy (ACC). Grid search is one of the most widely used techniques and allows us to have a transparent parameter selection (Huang et al, 2012).…”
Section: Discussion
confidence: 99%
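
A minimal sketch of the setup described in the citation above, assuming scikit-learn: a fivefold cross-validated grid search over an SVM classifier, with parameter combinations scored by classification accuracy (ACC). The dataset and grid values are illustrative assumptions, not taken from the citing paper.

```python
# Fivefold cross-validated grid search scored by classification accuracy.
from sklearn.svm import SVC
from sklearn.model_selection import GridSearchCV
from sklearn.datasets import load_breast_cancer

X, y = load_breast_cancer(return_X_y=True)

param_grid = {"C": [0.1, 1, 10, 100], "gamma": [1e-3, 1e-2, 1e-1, 1]}
search = GridSearchCV(SVC(kernel="rbf"), param_grid, cv=5, scoring="accuracy")
search.fit(X, y)
print("best ACC:", search.best_score_, "params:", search.best_params_)
```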
“…To determine the optimal hyperparameters for a machine learning algorithm, one commonly used method is grid search, which involves training the model with different combinations of hyperparameters and selecting the combination that results in the best performance on a validation dataset [42][43][44][45]. Another method is random search, which involves randomly sampling hyperparameter combinations and selecting the combination that results in the best performance on a validation dataset.…”
Section: 4. Grid Search Methods
confidence: 99%
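
The sketch below contrasts the two strategies mentioned in the citation above, exhaustive grid search versus random sampling of the hyperparameter space, on the same estimator and a comparable evaluation budget. The estimator (an SVC), the ranges, and the budget are illustrative assumptions.

```python
# Grid search vs. random search with the same number of candidate settings.
from scipy.stats import loguniform
from sklearn.svm import SVC
from sklearn.model_selection import GridSearchCV, RandomizedSearchCV
from sklearn.datasets import load_iris

X, y = load_iris(return_X_y=True)

grid = GridSearchCV(SVC(), {"C": [0.1, 1, 10], "gamma": [1e-3, 1e-2, 1e-1]}, cv=5)
grid.fit(X, y)

rand = RandomizedSearchCV(
    SVC(),
    {"C": loguniform(1e-2, 1e2), "gamma": loguniform(1e-4, 1e0)},
    n_iter=9,  # same budget as the 3x3 grid above
    cv=5,
    random_state=0,
)
rand.fit(X, y)
print(grid.best_params_, rand.best_params_)
```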
“…Grid search is an automatic hyperparameter optimization algorithm, which is widely used in hyperparameter optimization for deep learning [55][56][57]. In grid search algorithm, each hyperparameter is first discretized to generate a discrete search space.…”
Section: Research on Models' Hyperparameter Optimization
confidence: 99%
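
As a small sketch of the discretization step described above, the snippet below builds the discrete search space as the Cartesian product of a few discretized hyperparameters, here using scikit-learn's ParameterGrid. The parameter names and value ranges are illustrative assumptions.

```python
# Each hyperparameter is discretized into a finite set of candidate values;
# the search space is the Cartesian product of those sets.
import numpy as np
from sklearn.model_selection import ParameterGrid

search_space = {
    "learning_rate": np.logspace(-4, -1, 4),  # 4 discrete values
    "batch_size": [32, 64, 128],              # 3 discrete values
    "dropout": [0.1, 0.3, 0.5],               # 3 discrete values
}
grid = list(ParameterGrid(search_space))      # 4 * 3 * 3 = 36 combinations
print(len(grid), grid[0])
```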