The 2003 Congress on Evolutionary Computation, 2003. CEC '03.
DOI: 10.1109/cec.2003.1299873
Weighted feature extraction using a genetic algorithm for intrusion detection

Cited by 25 publications (21 citation statements)
References 6 publications
“…Liao and Vemuri (2002) [10] proposed a k-nearest neighbor method for feature selection on the KDD99 dataset. Middlemiss and Dick (2003) [9] used KNN in conjunction with a genetic algorithm to classify normal and abnormal behavior. The 41 attributes of the KDD99 dataset were ranked by [14] using support vector machines, and SVMs were again combined with neural networks in [15] to rank the features.…”
Section: Related Work
confidence: 99%
“…Recently, PSO has commonly been used to search for the optimal solution [23], [25]. The best position of each particle and the best position of the group are estimated by the fitness function Fit( ).…”
Section: B. The Proposed PWKNN
confidence: 99%
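The statement above describes the two quantities PSO tracks: each particle's personal best and the swarm's global best, both scored by a fitness function. A minimal sketch of that update loop follows; the function name, the inertia/acceleration constants, and the search range are illustrative assumptions, not values from the cited paper.

```python
import random

def pso_minimize(fitness, dim, n_particles=10, iters=50, seed=0):
    """Minimal PSO sketch: track each particle's best position (pbest)
    and the swarm's best position (gbest), both judged by `fitness`."""
    rng = random.Random(seed)
    w, c1, c2 = 0.7, 1.5, 1.5  # inertia and acceleration weights (assumed values)
    pos = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_f = [fitness(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_f[i])
    gbest, gbest_f = pbest[g][:], pbest_f[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                # velocity pulled toward the particle's best and the group's best
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            f = fitness(pos[i])
            if f < pbest_f[i]:      # update the particle's personal best
                pbest[i], pbest_f[i] = pos[i][:], f
                if f < gbest_f:     # update the swarm's global best
                    gbest, gbest_f = pos[i][:], f
    return gbest, gbest_f
```

On a simple convex test function such as the sphere, this converges close to the origin within a few dozen iterations.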
“…Typical KNN is a fast, training-free classification algorithm that computes the similarity of each known vector and uses the nearest k samples to classify unknown vectors. In typical KNN every feature has equal importance, which means each feature affects the classification result equally [22], [23]. For example, an unknown vector lies in a feature space containing three types of vectors, Types A, B and C, in Fig. 1.…”
Section: Introduction
confidence: 99%
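The contrast drawn above, equal-importance features in typical KNN versus per-feature weights in a weighted variant, can be sketched as follows. This is a generic weighted-distance KNN, not the cited PWKNN implementation; the function name and data are illustrative.

```python
import math
from collections import Counter

def weighted_knn_predict(train, labels, x, weights, k=3):
    """KNN sketch where each feature dimension carries a weight, so features
    no longer contribute equally to the distance (unlike typical KNN,
    which is the special case weights = [1, 1, ..., 1])."""
    def dist(a, b):
        # weighted Euclidean distance: w_d scales feature d's contribution
        return math.sqrt(sum(w * (ai - bi) ** 2
                             for w, ai, bi in zip(weights, a, b)))
    nearest = sorted(range(len(train)), key=lambda i: dist(train[i], x))[:k]
    # majority vote among the k nearest labelled samples
    return Counter(labels[i] for i in nearest).most_common(1)[0][0]

# Toy data: feature 1 separates the classes, feature 2 is noise.
train = [(0, 9), (0, 8), (5, 0), (5, 1)]
labels = ["A", "A", "B", "B"]
```

With equal weights the noisy second feature can flip the vote; down-weighting it restores the prediction driven by the informative feature.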
“…In past studies, some anomaly-based NIDSs focused on feature weighting and selection, such as Mukkamala and Sung (2002), Sung and Mukkamala (2003), Lee et al. (2006), Abbes et al. (2004), Stein et al. (2005), Hofman et al. (2004), Middlemiss and Dick (2003), and Liao and Vemuri (2002). Mukkamala and Sung (2002) applied the Support Vector Machine (SVM) technique to rank the 41 features provided by KDD CUP99 (The UCI KDD Archive).…”
Section: Introduction
confidence: 99%
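The feature-weighting line of work cited above, including the indexed paper's use of a genetic algorithm, can be illustrated with a toy GA that evolves one real-valued weight per feature against an externally supplied fitness (e.g. the accuracy of a weighted classifier). The selection, crossover, and mutation operators here are generic assumptions, not the operators of any cited paper.

```python
import random

def ga_feature_weights(eval_fitness, n_features, pop_size=20, gens=30, seed=0):
    """Toy GA sketch: evolve a weight in [0, 1] per feature, maximizing
    `eval_fitness` (in an IDS setting this could be detection accuracy)."""
    rng = random.Random(seed)
    pop = [[rng.random() for _ in range(n_features)] for _ in range(pop_size)]
    for _ in range(gens):
        scored = sorted(pop, key=eval_fitness, reverse=True)
        parents = scored[: pop_size // 2]          # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, n_features)     # one-point crossover
            child = a[:cut] + b[cut:]
            i = rng.randrange(n_features)          # point mutation, clipped to [0, 1]
            child[i] = min(1.0, max(0.0, child[i] + rng.gauss(0, 0.1)))
            children.append(child)
        pop = parents + children
    return max(pop, key=eval_fitness)
```

A quick sanity check is to make the fitness reward closeness to a known target weight vector and confirm the population converges toward it.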