2021
DOI: 10.1155/2021/9947059
A GAN and Feature Selection-Based Oversampling Technique for Intrusion Detection

Abstract: In recent years, there have been numerous cyber security issues that have caused considerable damage to society. The development of efficient and reliable Intrusion Detection Systems (IDSs) is an effective countermeasure against the growing cyber threats. In modern high-bandwidth, large-scale network environments, traditional IDSs suffer from high rates of missed and false alarms. Researchers have introduced machine learning techniques into intrusion detection with good results. However, due to the scarci…

Cited by 27 publications (24 citation statements)
References 38 publications (43 reference statements)
“…Table 1 shows the comparative results for the proposed ICF-GAN approach and two approaches from the literature, RF-NIDS [53] and GAN-FS [54]. We can see that the proposed approach outperforms both RF-NIDS [53] and GAN-FS [54].…”
Section: Performance Evaluation and Discussion (mentioning)
confidence: 99%
“…The Generative Adversarial Network and Feature Selection (GAN-FS) method is an oversampling methodology aimed at eliminating data imbalance [54]. GAN-FS is a hybrid that combines a Wasserstein GAN with Gradient Penalty (WGAN-GP), which counters training instability, with Analysis of Variance (ANOVA) feature selection to rebalance low-dimensional datasets before a Random Forest (RF) classifier [54]. In Table 1, the low Wasserstein distance of the model indicates that the training and test sets become balanced as generated samples are incrementally added [54].…”
Section: Performance Evaluation and Discussion (mentioning)
confidence: 99%
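The citation above attributes GAN-FS's training stability to the WGAN-GP objective. As a minimal sketch of that mechanism (not the authors' code), the snippet below shows a WGAN-GP critic loss with a gradient penalty in PyTorch; the `critic`, `real`, and `fake` tensors and the penalty weight `LAMBDA` are illustrative assumptions.

```python
# Hedged sketch of the WGAN-GP critic loss the citation refers to.
# Network definitions and hyper-parameters are illustrative assumptions.
import torch

LAMBDA = 10.0  # assumed gradient-penalty weight, as in the original WGAN-GP paper


def gradient_penalty(critic, real, fake):
    """Penalise the critic's gradient norm on points interpolated
    between real and generated (fake) tabular samples."""
    alpha = torch.rand(real.size(0), 1, device=real.device)
    interpolated = (alpha * real + (1 - alpha) * fake).requires_grad_(True)
    scores = critic(interpolated)
    grads = torch.autograd.grad(
        outputs=scores, inputs=interpolated,
        grad_outputs=torch.ones_like(scores),
        create_graph=True, retain_graph=True,
    )[0]
    return ((grads.norm(2, dim=1) - 1) ** 2).mean()


def critic_loss(critic, real, fake):
    # Wasserstein estimate: large while the critic separates real from fake,
    # small as the generated minority-class samples approach the real ones.
    wasserstein = critic(real).mean() - critic(fake).mean()
    return -wasserstein + LAMBDA * gradient_penalty(critic, real, fake)
```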
“…However, only an ordinary GAN is used for sample generation, without accounting for the GAN's training instability; this introduces hidden risks into the sample-generation process, and no other datasets or models were used to further validate its feasibility, which limits how convincing it is. Liu et al. [6] proposed the GAN-FS method to address feature redundancy. The model selects dataset features based on feature variance, which largely eliminates the impact of redundant and useless data on detection performance, improves detection accuracy and speed, and uses a GAN to generate samples, which increases the number of samples and enhances the training effect.…”
Section: Introduction (mentioning)
confidence: 99%
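The statement above summarises GAN-FS as variance-based (ANOVA) feature selection followed by GAN oversampling and a downstream detector. The following is a hedged sketch of that rebalancing pipeline, assuming a tabular intrusion-detection dataset (X, y) and an already-trained generator exposed through a hypothetical `sample_minority(n)` helper; the helper name and `k=20` are illustrative, not from the paper.

```python
# Hedged sketch of ANOVA feature selection plus GAN-based oversampling,
# followed by a random-forest detector; names and defaults are assumptions.
import numpy as np
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.ensemble import RandomForestClassifier


def gan_fs_rebalance(X, y, minority_label, sample_minority, k=20):
    # 1. ANOVA F-test scores each feature's discriminative power;
    #    redundant / uninformative columns are dropped.
    selector = SelectKBest(score_func=f_classif, k=k)
    X_sel = selector.fit_transform(X, y)

    # 2. Generate synthetic minority-class samples until the classes balance;
    #    sample_minority is assumed to return rows in the original feature space.
    n_needed = int(np.sum(y != minority_label) - np.sum(y == minority_label))
    X_syn = sample_minority(n_needed)[:, selector.get_support()]
    X_bal = np.vstack([X_sel, X_syn])
    y_bal = np.concatenate([y, np.full(n_needed, minority_label)])

    # 3. Train the downstream random-forest detector on the balanced set.
    clf = RandomForestClassifier(n_estimators=100).fit(X_bal, y_bal)
    return clf, selector
```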