2021
DOI: 10.1109/access.2020.3046246

An Approach for Optimizing Ensemble Intrusion Detection Systems

Cited by 25 publications (12 citation statements). References: 59 publications.

“…This algorithm is used in cyber-physical social networks. Review of classifiers: Naïve Bayes, Random Forest [19], J-48 and bagging [23] are widely used data-mining classifiers. Random Forest categorizes all the samples in the training dataset and produces good results, but it over-fits probability predictions. Naïve Bayes is a simple supervised procedure that returns likelihoods. J48 [27] generates a decision tree for the given dataset; the accuracy obtained with J48 [26] is better than that of the Naïve Bayes classifier.…”
Section: Materials and Methodologies (mentioning)
confidence: 99%
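The classifiers named in this statement map onto standard library implementations; the hedged sketch below trains Naïve Bayes, Random Forest, a decision tree standing in for Weka's J48 (C4.5), and a bagging ensemble on synthetic placeholder data. None of the data, features or settings come from the cited works.

```python
# Hedged sketch: the classifiers mentioned in the citation statement,
# trained on a synthetic placeholder dataset (not the cited papers' data).
from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier, RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

models = {
    "Naive Bayes": GaussianNB(),
    "Random Forest": RandomForestClassifier(n_estimators=100, random_state=0),
    # DecisionTreeClassifier with the entropy criterion only approximates
    # Weka's J48 (C4.5); pruning and split handling differ.
    "J48-like tree": DecisionTreeClassifier(criterion="entropy", random_state=0),
    "Bagging": BaggingClassifier(random_state=0),
}

for name, model in models.items():
    model.fit(X_train, y_train)
    print(f"{name}: accuracy = {model.score(X_test, y_test):.4f}")
```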
“…A clustering algorithm evaluates posts suspected of containing inhumane words. D. Stiawan et al. (2021) propose a new IDS dataset that can be used to identify the best-fitting selected features as critical features. A method for developing an optimally integrated IDS to achieve this goal is developed; it selects features using six parameters, namely Information Gain (IG), Gain Ratio (GR), Symmetric Uncertainty (SU), Relief-F (R-F), One-R (OR) and Chi-Square (CS).…”
Section: Our Contributions Can Be Summarized As Follows (mentioning)
confidence: 99%
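To make the feature-ranking step concrete, here is a minimal sketch, assuming scikit-learn on placeholder data, of two of the six listed measures: Information Gain (approximated by mutual information) and Chi-Square. Gain Ratio, Symmetric Uncertainty, Relief-F and One-R have no direct scikit-learn equivalents and are omitted; nothing here reproduces the cited Weka-based implementation.

```python
# Hedged sketch: ranking features with Information Gain (approximated by
# mutual information) and Chi-Square on placeholder non-negative data.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.feature_selection import chi2, mutual_info_classif
from sklearn.preprocessing import MinMaxScaler

X, y = make_classification(n_samples=1000, n_features=10, random_state=0)
X = MinMaxScaler().fit_transform(X)  # chi2 requires non-negative features

ig_scores = mutual_info_classif(X, y, random_state=0)
chi2_scores, _ = chi2(X, y)

# Rank feature indices from most to least informative under each measure.
ig_rank = np.argsort(ig_scores)[::-1]
chi2_rank = np.argsort(chi2_scores)[::-1]
print("Information-Gain ranking:", ig_rank)
print("Chi-Square ranking:", chi2_rank)
```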
“…The best models were Random Forest, with an accuracy of 99.86% using 22 features, and J48, with an accuracy of 99.87% using 52 features. D. Stiawan et al. [60] introduced an approach for constructing an ensemble IDS using six ranked feature selection techniques, namely Information Gain, Gain Ratio, Symmetrical Uncertainty, Relief-F, One-R and Chi-Square, combined with four classifiers: Bayesian Network, Naïve Bayes, J48 and SOM, and validated using hold-out and K-fold approaches. Experimental results were obtained in Weka using the ITD-UTM dataset.…”
Section: Machine Learning-Based IDS (mentioning)
confidence: 99%
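As a rough illustration of the ensemble-plus-validation workflow described in that statement, the sketch below builds a generic scikit-learn voting ensemble (Gaussian Naïve Bayes and an entropy-based decision tree as stand-ins for the Weka classifiers; Bayesian Network and SOM are omitted) and evaluates it with both a hold-out split and 10-fold cross-validation on placeholder data, not the ITD-UTM dataset.

```python
# Hedged sketch: a voting ensemble (stand-ins for the cited Weka classifiers)
# validated with both a hold-out split and k-fold cross-validation.
from sklearn.datasets import make_classification
from sklearn.ensemble import VotingClassifier
from sklearn.model_selection import cross_val_score, train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=2000, n_features=20, random_state=0)

ensemble = VotingClassifier(
    estimators=[
        ("nb", GaussianNB()),
        ("tree", DecisionTreeClassifier(criterion="entropy", random_state=0)),
    ],
    voting="soft",  # average predicted probabilities across members
)

# Hold-out validation.
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
print("Hold-out accuracy:", ensemble.fit(X_tr, y_tr).score(X_te, y_te))

# 10-fold cross-validation.
scores = cross_val_score(ensemble, X, y, cv=10)
print("10-fold mean accuracy:", scores.mean())
```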