2017 International Conference on Networks & Advances in Computational Technologies (NetACT)
DOI: 10.1109/netact.2017.8076805

An efficient feature selection using artificial fish swarm optimization and SVM classifier

Cited by 16 publications (4 citation statements)
References 12 publications

“…The experimental results showed that the proposed method achieves good performance in finding the best subset of features. Nalluri et al. developed an AFSA algorithm with SVM that finds the most valuable subset of features in a dataset [123]. They evaluated their approach on datasets having binary- and multi-labelled classes.…”
Section: Artificial Fish Swarm Algorithm (AFSA) (mentioning)
confidence: 99%
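
The statement above describes a wrapper method: each "fish" is a binary feature mask, and a cross-validated SVM scores how good that subset is. Below is a minimal Python/scikit-learn sketch of that idea. The dataset, the single random bit-flip "prey"-style move, and the parameters (n_fish, n_iter, visual) are illustrative assumptions, not the operators from the cited paper.

```python
# Minimal sketch of swarm-style wrapper feature selection with an SVM
# fitness function, in the spirit of the AFSA + SVM approach described
# above. The move rule (flip a few bits, keep the result only if it
# scores better) is a simplification of AFSA's behaviors, not the
# cited paper's exact operators.
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X, y = load_breast_cancer(return_X_y=True)  # stand-in dataset
n_features = X.shape[1]

def fitness(mask):
    """Cross-validated SVM accuracy on the selected feature subset."""
    if not mask.any():
        return 0.0
    clf = SVC(kernel="rbf", gamma="scale")
    return cross_val_score(clf, X[:, mask], y, cv=3).mean()

# Initialize a school of random binary feature masks ("fish").
n_fish, n_iter, visual = 10, 20, 3   # visual = max bits flipped per move
school = rng.random((n_fish, n_features)) < 0.5
scores = np.array([fitness(f) for f in school])

for _ in range(n_iter):
    for i in range(n_fish):
        # Prey-like step: probe a random neighbor within "visual" range.
        trial = school[i].copy()
        flip = rng.choice(n_features, size=rng.integers(1, visual + 1),
                          replace=False)
        trial[flip] = ~trial[flip]
        s = fitness(trial)
        if s > scores[i]:          # move only if the neighbor is better
            school[i], scores[i] = trial, s

best = scores.argmax()
print(f"best accuracy {scores[best]:.3f} with {school[best].sum()} features")
```

A fuller AFSA implementation would also add the swarm and follow behaviors, which pull each fish toward the local center of its neighbors and toward the best nearby fish, respectively; the sketch keeps only the greedy prey-style move.
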
“…Different machine learning approaches represent different classification algorithms, each with its own advantages. The KNN algorithm has high classification accuracy and is insensitive to abnormal samples [41,42]; SVM can handle small sample sizes and offers global optimality and strong generalization ability [43]; RF averages many decision trees during classification to minimize overfitting and produce good regression results, thus improving classification accuracy [44]; GP is widely used in time-series prediction tasks because it can effectively exploit correlations between features [45]; XGBoost allows hyperparameter tuning to maximize model performance and prevent overfitting, while using gradient-descent optimization to integrate decision trees sequentially and minimize model error [46,47].…”
Section: Machine Learning Models (mentioning)
confidence: 99%
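
Read as an experiment, the comparison above amounts to cross-validating each classifier family on the same data. The sketch below does exactly that on a stand-in dataset; it assumes GP here means a Gaussian process classifier, uses default hyperparameters throughout, and needs the separate xgboost package (pip install xgboost).

```python
# Quick side-by-side of the classifier families named above, using
# cross-validated accuracy. Dataset and hyperparameters are purely
# illustrative placeholders.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.gaussian_process import GaussianProcessClassifier
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from xgboost import XGBClassifier

X, y = load_breast_cancer(return_X_y=True)
models = {
    "KNN": KNeighborsClassifier(),
    "SVM": make_pipeline(StandardScaler(), SVC()),  # SVMs want scaled inputs
    "RF": RandomForestClassifier(n_estimators=200, random_state=0),
    "GP": make_pipeline(StandardScaler(), GaussianProcessClassifier()),
    "XGBoost": XGBClassifier(eval_metric="logloss"),
}
for name, model in models.items():
    acc = cross_val_score(model, X, y, cv=5).mean()
    print(f"{name:8s} {acc:.3f}")
```
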
“…Several swarm intelligence algorithms have been used for attribute selection [9][10][11][12][13][14][15][16][17]. Unfortunately, no single stable strategy exists that reduces the computational burden, extracts risk factors highly correlated with the data, improves classifier performance, and achieves high accuracy.…”
Section: Bat Algorithm for Feature Selection (mentioning)
confidence: 99%