2021
DOI: 10.11591/ijai.v10.i1.pp110-120

Toward a deep learning-based intrusion detection system for IoT against botnet attacks

Abstract: The massive volume of network traffic between connected devices in the internet of things (IoT) poses a major challenge to traditional intrusion detection systems (IDS) seeking to uncover probable security breaches. Security attacks, however, tend to be unpredictable, and there are numerous difficulties in building an adaptable and powerful IDS for IoT that avoids false alerts and ensures high recognition precision against attacks, espe…

Cited by 59 publications (42 citation statements)
References 21 publications
“…Chauhan and Atulkar [39] suggested the Light Gradient Boosting Machine (LGBM) model because it outperformed the RF, Extra Tree (ET), Gradient Boost (GB) and XGBoost models. In [40], the Convolutional Neural Network (CNN) model outperformed the RNN, LSTM and GRU models. Huong et al. [41, 42] proposed a low-complexity edge-cloud DNN model, which achieved better performance than the kNN, DT, RF and SVM models.…”
Section: Review of Related Work
Confidence: 99%
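The quoted comparisons contrast gradient-boosting against tree-ensemble baselines. A minimal sketch of such a comparison, assuming a synthetic stand-in dataset and illustrative hyperparameters; none of this reflects the exact data or settings used in the cited studies:

```python
# Illustrative LGBM-vs-RF comparison on hypothetical IDS-style data.
from lightgbm import LGBMClassifier
from sklearn.ensemble import RandomForestClassifier
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Synthetic stand-in for network-traffic features (assumed, not from the papers).
X, y = make_classification(n_samples=5000, n_features=30, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

for name, model in [
    ("LGBM", LGBMClassifier(n_estimators=200, random_state=42)),
    ("RF", RandomForestClassifier(n_estimators=200, random_state=42)),
]:
    model.fit(X_train, y_train)
    print(name, accuracy_score(y_test, model.predict(X_test)))
```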
“…Where k represents the subset of the original dataset (K) that optimizes f1 and f2 (the objectives). The second part involves evaluating the selected feature subsets based on accuracy (an established performance evaluation metric), as provided in (3). Accuracy is calculated by dividing the number of correctly classified instances by the total number of instances.…”
Section: f1 = |k|
Confidence: 99%
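A reconstruction of the quoted formulas in LaTeX, assuming the conventional reading: f1 counts the selected features (to be minimized) and equation (3) is the standard accuracy ratio. The exact form of f2 is not quoted here, so it is left out.

```latex
% Feature-subset size objective, as quoted: f1 counts the selected features
f_1 = |k|, \qquad k \subseteq K

% Standard accuracy definition implied by the quoted description (assumed form of (3))
\mathrm{Accuracy}
  = \frac{\text{correctly classified instances}}{\text{all instances}}
  = \frac{TP + TN}{TP + TN + FP + FN}
```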
“…Deep learning models necessitate large amounts of data and training time [17], [18]. Transfer learning is a technique that benefits from weights already trained on big datasets over a long period of time and transfers this knowledge [19] to the targeted model.…”
Section: Transfer Learning
Confidence: 99%
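A minimal sketch of the transfer-learning workflow the quote describes, assuming a Keras backbone pre-trained on ImageNet; the backbone choice, input shape, and two-class head are illustrative assumptions, not the cited papers' configuration:

```python
# Transfer learning sketch: reuse pre-trained weights, retrain a small task head.
import tensorflow as tf

# Load a backbone with weights pre-trained on a large dataset (ImageNet),
# dropping its original classifier layer.
base = tf.keras.applications.MobileNetV2(
    weights="imagenet", include_top=False, input_shape=(224, 224, 3)
)
base.trainable = False  # freeze the transferred knowledge during initial training

# Attach a new head for the (hypothetical) target task.
model = tf.keras.Sequential([
    base,
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(2, activation="softmax"),  # e.g. benign vs. botnet
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
# model.fit(train_ds, epochs=...)  # train only the new head on target data
```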