2018
DOI: 10.1016/j.dcan.2017.09.009
Semi-supervised multi-layered clustering model for intrusion detection

Cited by 53 publications (18 citation statements); references 21 publications.
“…The study concluded that the random tree classifier was the winner under various performance metrics. Al-Jarrah et al. [80] proposed a semi-supervised multi-layered clustering (SMLC) model for IDSs. The performance of SMLC was compared with that of supervised ensemble ML models and a well-known semi-supervised model (i.e., tri-training).…”
Section: Mapping Selected Studies By Ensemble Methods
mentioning confidence: 99%
“…Thus, such learning can be useful where large amounts of labelled data are unavailable; an example is a photo archive in which only some images are labelled, and the remaining unlabelled data are used to enhance the accuracy of an IDS [78,79]. In another study, two semi-supervised techniques, Spectral Graph Transducers for classification and the Gaussian Fields method, were employed to detect unknown attacks, while a semi-supervised clustering approach, Metric Pairwise Constrained K-Means (MPCK-Means), was employed to improve detection performance [80].…”
Section: Semi-supervised Learning
mentioning confidence: 99%
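The pairwise-constraint idea behind MPCK-Means can be sketched in miniature. This is a hedged illustration, not the algorithm from the cited study: full MPCK-Means handles both must-link and cannot-link constraints and also learns a distance metric, whereas the toy below only enforces cannot-link constraints during a single assignment pass over 1-D points, with all names (`constrained_assign`, `other`) invented for this sketch.

```python
def other(i, pair):
    # Return the partner of point i in a constraint pair, or None.
    a, b = pair
    if a == i:
        return b
    if b == i:
        return a
    return None

def constrained_assign(points, centroids, cannot_link):
    """Assign each 1-D point to its nearest centroid whose cluster does not
    already contain a cannot-linked partner (single greedy pass)."""
    labels = {}
    for i, p in enumerate(points):
        # Try centroids from nearest to farthest.
        order = sorted(range(len(centroids)), key=lambda c: abs(p - centroids[c]))
        partners = [other(i, pair) for pair in cannot_link if i in pair]
        for c in order:
            # Skip a cluster if it holds a cannot-linked partner.
            if all(labels.get(j) != c for j in partners):
                labels[i] = c
                break
    return [labels[i] for i in range(len(points))]

points = [1.0, 1.1, 1.2, 9.0]
# Unconstrained, 1.1 joins the nearest cluster 0; a cannot-link
# constraint with point 0 forces it into cluster 1 instead.
print(constrained_assign(points, [1.0, 9.0], cannot_link=[]))        # → [0, 0, 0, 1]
print(constrained_assign(points, [1.0, 9.0], cannot_link=[(0, 1)]))  # → [0, 1, 0, 1]
```

The constraint acts as side information supplied by a small labelled subset, which is how semi-supervised clustering injects supervision without fully labelled data.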
“…The anomalies are the data instances whose characteristics deviate significantly from the built model. Al-Jarrah et al. [97] proposed a semi-supervised multi-layered clustering (SMLC) model for the detection of network attacks. SMLC assumes that the resulting clusters of the K-Means algorithm depend on the chosen number of clusters.…”
Section: Anomaly-based Intrusion Detection Techniques
mentioning confidence: 99%