2016
DOI: 10.4018/978-1-4666-8761-5.ch011
AdaBoost Algorithm with Single Weak Classifier in Network Intrusion Detection

Abstract: Machine-learning-based intrusion detection systems have recently been the subject of extensive research because they can perform both misuse detection and anomaly detection. In this paper, we propose an AdaBoost-based algorithm for a network intrusion detection system with a single weak classifier. In this algorithm, classifiers such as Bayes Net, Naïve Bayes, and decision trees are used as weak classifiers. The KDDCup99 dataset is used in these experiments to demonstrate that the boosting algorithm can great…

Cited by 4 publications (3 citation statements)
References 6 publications
“…AdaBoost can be used for both binary and multiclass classification. The algorithm can be formalized in both standard and original forms for both binary and multiclass classification [51,52]. A Decision Tree (DT) classifier is typically used as the base estimator, and the corresponding train and run time complexity are O(T·Q·N) and O(T), respectively, where T is the number of base estimators.…”
Section: Adaptive Boosting (AdaBoost)
confidence: 99%
“…An experimental comparison of the performance of decision trees against Naive Bayes and Bayes Net as base classifiers is presented in [64]. The authors reported that AdaBoost with decision trees as the base classifier achieved the highest classification rate with the lowest computational time.…”
Section: The Learning Algorithm
confidence: 99%
“…Sun et al. compared Discrete, Real, and Gentle AdaBoost by analyzing experimental results in license plate detection, and explained that Gentle AdaBoost achieves better performance than the other two methods [51]. The comparison in [52] focused on weak classifiers for AdaBoost constructed from Bayes Net, Naive Bayes, and decision trees, and showed that decision trees are the best. A review systematically introduced AdaBoost variants proposed from 1999 to 2012.…”
Section: Introduction
confidence: 99%