2022 · DOI: 10.14569/ijacsa.2022.0130666

Application of Optimized SVM in Sample Classification

Xuemei Yao

Abstract: Support vector machines (SVMs) have unique advantages in solving problems involving small samples, nonlinearity, and high dimensionality. They rest on a relatively complete theory and have been widely applied across many fields. However, the classification accuracy and generalization ability of an SVM are determined by its selected parameters, for which there is no solid theoretical guidance. To address this parameter optimization problem, we applied random selection, genetic algorithms (GA), particle swarm optimization (PSO), and K-fold cross-validation …
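
Since the abstract names PSO among the candidate optimizers, a minimal sketch may help: a hand-rolled particle swarm search over an SVM's penalty coefficient C and RBF kernel parameter gamma, scored by K-fold cross-validated accuracy. The swarm size, inertia/acceleration weights, log-scale search ranges, and the iris dataset are illustrative assumptions, not details taken from the paper.

```python
# Hedged sketch: PSO over (log10 C, log10 gamma) for an RBF-kernel SVM,
# fitness = mean 5-fold cross-validated accuracy. All constants assumed.
import numpy as np
from sklearn.datasets import load_iris
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)
rng = np.random.default_rng(0)

def fitness(log_c, log_g):
    """Mean 5-fold CV accuracy for SVC(C=10**log_c, gamma=10**log_g)."""
    clf = SVC(C=10.0 ** log_c, gamma=10.0 ** log_g)
    return cross_val_score(clf, X, y, cv=5).mean()

# Search in log10 space: C in [1e-2, 1e3], gamma in [1e-4, 1e1] (assumed).
low, high = np.array([-2.0, -4.0]), np.array([3.0, 1.0])
n_particles, n_iters = 12, 20
pos = rng.uniform(low, high, size=(n_particles, 2))
vel = np.zeros_like(pos)
pbest, pbest_fit = pos.copy(), np.array([fitness(*p) for p in pos])
gbest = pbest[pbest_fit.argmax()].copy()

w, c1, c2 = 0.7, 1.5, 1.5  # inertia and acceleration weights (assumed)
for _ in range(n_iters):
    r1, r2 = rng.random((n_particles, 2)), rng.random((n_particles, 2))
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, low, high)
    fit = np.array([fitness(*p) for p in pos])
    improved = fit > pbest_fit
    pbest[improved], pbest_fit[improved] = pos[improved], fit[improved]
    gbest = pbest[pbest_fit.argmax()].copy()

print(f"best C={10**gbest[0]:.3g}, gamma={10**gbest[1]:.3g}, "
      f"CV accuracy={pbest_fit.max():.3f}")
```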

Cited by 5 publications (4 citation statements) · References 11 publications

“…For example, SVM exhibits significant advantages in addressing small sample problems. It maintains robust classification performance even with limited sample sizes by constructing an optimal hyperplane to separate different data categories (Yao, 2022). RF enhances prediction accuracy and robustness by integrating multiple decision trees (Momade et al., 2020), while XGBoost supports various objective functions and evaluation metrics, enabling outstanding performance in diverse prediction tasks (Sagi & Rokach, 2021).…”
Section: Methods (mentioning)
confidence: 99%
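
As a concrete illustration of the "optimal hyperplane" idea in the statement above, here is a minimal sketch: a linear SVM fit on a deliberately tiny two-class sample, with the hyperplane coefficients and margin width read back from the model. The toy data are an assumption for illustration, not from the cited studies.

```python
# Minimal sketch: linear SVM on only 8 training points.
import numpy as np
from sklearn.svm import SVC

# Two 2-D classes, small sample (assumed toy data).
X = np.array([[1.0, 1.2], [1.5, 0.8], [2.0, 1.1], [1.2, 1.9],
              [4.0, 4.2], [4.5, 3.8], [5.0, 4.1], [4.2, 4.9]])
y = np.array([0, 0, 0, 0, 1, 1, 1, 1])

clf = SVC(kernel="linear", C=1.0).fit(X, y)

# The separating hyperplane is w·x + b = 0; margin width is 2/||w||.
w, b = clf.coef_[0], clf.intercept_[0]
print(f"hyperplane: {w[0]:.2f}*x1 + {w[1]:.2f}*x2 + {b:.2f} = 0")
print(f"margin width: {2.0 / np.linalg.norm(w):.2f}")
print("prediction for [3, 3]:", clf.predict([[3.0, 3.0]]))
```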
“…During the training and validation processes of our SVR model, penalty coefficient c and kernel function parameter g were determined using the K-fold cross-validation (K-CV) method to identify the optimal combination [57][58][59]. The K-value was equivalent to the number of groups in the partitioned training set.…”
Section: Support Vector Regression (SVR) (mentioning)
confidence: 99%
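
A hedged sketch of the K-CV procedure described above: the penalty coefficient C and RBF kernel parameter gamma of an SVR model are chosen by exhaustive search over a small grid under K-fold cross-validation. The grid values, K = 5, the synthetic sine-curve data, and the MSE scoring are assumptions; the cited work may use different ranges and K.

```python
# Hedged sketch: K-fold CV grid search over SVR's C and gamma.
import numpy as np
from sklearn.model_selection import GridSearchCV, KFold
from sklearn.svm import SVR

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(120, 1))           # assumed toy inputs
y = np.sin(X).ravel() + 0.1 * rng.standard_normal(120)  # noisy targets

# Candidate (C, gamma) combinations; grid values are assumptions.
param_grid = {"C": [0.1, 1, 10, 100], "gamma": [0.01, 0.1, 1, 10]}
search = GridSearchCV(
    SVR(kernel="rbf"),
    param_grid,
    cv=KFold(n_splits=5, shuffle=True, random_state=0),  # K = 5 assumed
    scoring="neg_mean_squared_error",
)
search.fit(X, y)
print("best (C, gamma):", search.best_params_)
print("best CV MSE:", -search.best_score_)
```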
“…On the other hand, whenever the data set is larger, it requires a lot of computational time [16]. … 5.2 Naive Bayes (NB) [17]: The NB, a widely used method, mathematically adheres to Bayes' theorem. Eq. 14 shows the probability calculation using Bayes' theorem.…”
Section: Support Vector Machine (SVM) [15] (mentioning)
confidence: 99%
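
Since the quoted eq. 14 is not reproduced here, a minimal Gaussian Naive Bayes sketch shows the same Bayes-rule computation, P(class | x) ∝ P(x | class) · P(class), on assumed one-dimensional toy data.

```python
# Minimal sketch: Gaussian Naive Bayes posterior via Bayes' theorem,
# P(class | x) proportional to P(x | class) * P(class). Toy data assumed.
import numpy as np
from sklearn.naive_bayes import GaussianNB

X = np.array([[1.0], [1.2], [0.8], [3.9], [4.1], [4.3]])
y = np.array([0, 0, 0, 1, 1, 1])

nb = GaussianNB().fit(X, y)
print("posterior P(class | x=2.0):", nb.predict_proba([[2.0]])[0])
print("predicted class:", nb.predict([[2.0]])[0])
```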