2015
DOI: 10.1155/2015/806954

Feature Selection Using Particle Swarm Optimization in Intrusion Detection

Abstract: Preventing intrusions in networks is critical, and an intrusion detection system with a potent detection mechanism is highly desirable. Extensive work has been done on intrusion detection systems, yet they remain weak due to a high number of false alarms. One of the leading causes of false alarms is the use of a raw dataset that contains redundancy. To resolve this issue, feature selection is necessary, as it can improve intrusion detection performance. Recently, principal component …

Cited by 53 publications (36 citation statements) | References 19 publications
“…
Method                                       Detection Rate (%)   FR Rate (%)
DAR Ensemble [52]                            78.88                N/A
Naive Bayes-KNN-CF [53]                      82.00                5.43
Feature Selection + SVM [54]                 82.37                15.00
GAR Forest + Symmetrical Uncertainty [55]    85.00                12.20
Bagging J48 [56]                             84.25                2.79
PCA+PSO [57]                                 99                   …

From Table 45, we conclude that using AdaBoost and Bagging with the base machine learning classifiers J48, Random Forest, and REPTree makes predictions more accurate on the KDD99 and NSL-KDD datasets, for both binary and multi-class cases. J48, Random Forest, and REPTree with AdaBoost achieved a 99.90% true positive (TP) rate and a 0.00% false positive (FP) rate.…”
Section: Methods Accuracy, Detection Rate (%), FR Rate (%)
confidence: 99%
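The boosting-and-bagging setup this excerpt describes can be illustrated with a short sketch. This is a minimal illustration, assuming scikit-learn ≥ 1.2 (for the `estimator` keyword); the random data merely stands in for a preprocessed KDD-style split, and `DecisionTreeClassifier` stands in for the J48/REPTree learners named above.

```python
from sklearn.ensemble import AdaBoostClassifier, BaggingClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import confusion_matrix
import numpy as np

# Synthetic stand-in for a preprocessed NSL-KDD split (41 numeric features).
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 41))
y = rng.integers(0, 2, size=1000)            # binary: normal vs. attack
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

base = DecisionTreeClassifier(max_depth=5)   # stands in for J48/REPTree
boosted = AdaBoostClassifier(estimator=base, n_estimators=50).fit(X_tr, y_tr)
bagged = BaggingClassifier(estimator=base, n_estimators=50).fit(X_tr, y_tr)

# TP and FP rates, the metrics quoted in the excerpt.
tn, fp, fn, tp = confusion_matrix(y_te, boosted.predict(X_te)).ravel()
print("AdaBoost TP rate: %.4f  FP rate: %.4f" % (tp / (tp + fn), fp / (fp + tn)))
tn, fp, fn, tp = confusion_matrix(y_te, bagged.predict(X_te)).ravel()
print("Bagging  TP rate: %.4f  FP rate: %.4f" % (tp / (tp + fn), fp / (fp + tn)))
```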
“…Feature selection is a procedure that finds a subset of significant features from the original feature set and removes irrelevant and redundant features from the dataset, enhancing classification performance and reducing memory storage requirements [8]. Feature selection helps in understanding the data, mitigating the curse of dimensionality, lowering computational cost, improving learning accuracy, and identifying which features are relevant to a particular problem [9].…”
Section: Feature Selection
confidence: 99%
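As a concrete illustration of the filter-style selection this excerpt describes, the sketch below ranks features by mutual information with the class label and keeps the top k. The synthetic data, the choice of mutual information as the score, and k = 10 are illustrative assumptions, not the cited papers' configuration.

```python
import numpy as np
from sklearn.feature_selection import SelectKBest, mutual_info_classif

# Synthetic data where only a couple of features carry class signal.
rng = np.random.default_rng(1)
X = rng.normal(size=(500, 41))
y = (X[:, 0] + 0.5 * X[:, 3] > 0).astype(int)

# Keep the k features with the highest mutual information with the label.
selector = SelectKBest(score_func=mutual_info_classif, k=10)  # k is illustrative
X_reduced = selector.fit_transform(X, y)
print("kept feature indices:", np.flatnonzero(selector.get_support()))
print("reduced shape:", X_reduced.shape)
```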
“…According to the authors, the technique can successfully detect outliers and recognize the data as normal or abnormal. Sinwar and Dhaka [16], [17] used a combination of PCA and PSO for feature selection and classification in intrusion detection for WSNs. PCA was used for feature reduction and selection, while PSO was utilized to choose the optimal feature subset from the PCA space.…”
Section: Related Work
confidence: 99%
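The PCA-plus-PSO pipeline summarized above can be sketched as follows: project the data with PCA, then run a small binary PSO whose particles encode component subsets and whose fitness is cross-validated classifier accuracy. The swarm parameters, the KNN evaluator, and the sigmoid transfer function are assumptions for illustration, not the exact configuration of [16], [17].

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score

# Synthetic stand-in data; two raw features carry the class signal.
rng = np.random.default_rng(2)
X = rng.normal(size=(300, 41))
y = (X[:, 0] - X[:, 2] > 0).astype(int)

Z = PCA(n_components=20).fit_transform(X)      # step 1: PCA feature reduction

def fitness(mask):
    """Cross-validated accuracy of a KNN classifier on the masked components."""
    if not mask.any():
        return 0.0
    clf = KNeighborsClassifier(n_neighbors=3)
    return cross_val_score(clf, Z[:, mask], y, cv=3).mean()

# Step 2: binary PSO over component subsets (sigmoid transfer function).
n_particles, dim, iters = 10, Z.shape[1], 20
pos = rng.random((n_particles, dim)) > 0.5     # boolean inclusion masks
vel = rng.normal(scale=0.1, size=(n_particles, dim))
pbest = pos.copy()
pbest_fit = np.array([fitness(p) for p in pos])
gbest = pbest[pbest_fit.argmax()].copy()

for _ in range(iters):
    r1, r2 = rng.random((2, n_particles, dim))
    vel = (0.7 * vel
           + 1.5 * r1 * (pbest - pos.astype(float))
           + 1.5 * r2 * (gbest - pos.astype(float)))
    pos = rng.random((n_particles, dim)) < 1.0 / (1.0 + np.exp(-vel))
    fit = np.array([fitness(p) for p in pos])
    improved = fit > pbest_fit
    pbest[improved], pbest_fit[improved] = pos[improved], fit[improved]
    gbest = pbest[pbest_fit.argmax()].copy()

print("selected PCA components:", np.flatnonzero(gbest))
print("best cross-validated accuracy: %.3f" % pbest_fit.max())
```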