2003
DOI: 10.1016/s0925-2312(03)00433-8

A comparison of PCA, KPCA and ICA for dimensionality reduction in support vector machine

Cited by 448 publications (245 citation statements)
References 22 publications

“…We performed other experiments, not reported in this paper, without PCA, and observed an average decrease of 2% in accuracy. Our architectural choice is also based on the results published by [43], which shows that there is an improvement when PCA is applied before SVM, and by [44], which shows that PCA before SVM can be used to speed up training time with no major impact on accuracy.…”
Section: E. Classification (mentioning)
confidence: 99%
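The statement above describes placing PCA in front of an SVM classifier to improve accuracy and reduce training time. A minimal sketch of that pipeline follows, assuming scikit-learn; the digits dataset, the 20-component choice, and the RBF settings are illustrative assumptions, not the cited authors' setup.

```python
# Hedged sketch: PCA as a preprocessing step before an SVM, compared against
# an SVM on the raw features. Dataset and hyperparameters are assumptions.
from sklearn.datasets import load_digits
from sklearn.decomposition import PCA
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import cross_val_score

X, y = load_digits(return_X_y=True)

# Baseline: SVM on the raw 64-dimensional pixel features.
acc_raw = cross_val_score(SVC(kernel="rbf", gamma="scale"), X, y, cv=5).mean()

# PCA down to 20 components (an assumed value), then the same SVM.
svm_pca = make_pipeline(PCA(n_components=20), SVC(kernel="rbf", gamma="scale"))
acc_pca = cross_val_score(svm_pca, X, y, cv=5).mean()

print(f"SVM on raw features : {acc_raw:.3f}")
print(f"SVM after PCA (20-d): {acc_pca:.3f}")
```

Fewer input dimensions make each kernel evaluation cheaper, which is where the training-time benefit mentioned in [44] would come from.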
“…In [6], a GA was used for feature and parameter selection in an intrusion detection model. A Support Vector Machine (SVM) was applied as the intrusion analysis engine.…”
Section: Related Work (mentioning)
confidence: 99%
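The statement above sketches a GA wrapped around an SVM to select a feature subset. The following is a minimal sketch of that idea only, assuming scikit-learn and NumPy; the breast-cancer dataset, population size, and GA operators are simplified assumptions and not the implementation from [6].

```python
# Hedged sketch of GA-style feature selection wrapping an SVM: individuals are
# binary masks over features, fitness is cross-validated SVM accuracy.
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X, y = load_breast_cancer(return_X_y=True)
n_features = X.shape[1]

def fitness(mask):
    # Cross-validated SVM accuracy on the selected feature subset.
    if mask.sum() == 0:
        return 0.0
    return cross_val_score(SVC(kernel="rbf", gamma="scale"),
                           X[:, mask.astype(bool)], y, cv=3).mean()

# Random binary population: each individual encodes a feature subset.
pop = rng.integers(0, 2, size=(20, n_features))
for generation in range(10):
    scores = np.array([fitness(ind) for ind in pop])
    parents = pop[np.argsort(scores)[::-1][:10]]   # truncation selection
    children = []
    for _ in range(10):
        a, b = parents[rng.integers(10, size=2)]
        cut = rng.integers(1, n_features)
        child = np.concatenate([a[:cut], b[cut:]])  # one-point crossover
        flip = rng.random(n_features) < 0.05        # bit-flip mutation
        child[flip] = 1 - child[flip]
        children.append(child)
    pop = np.vstack([parents, children])

best = pop[np.argmax([fitness(ind) for ind in pop])]
print("selected features:", np.flatnonzero(best))
```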
“…The purpose of PCA is to reduce the dimension of the data while retaining as much as possible of the variation present in the original dataset. It provides a way of identifying patterns in data and expressing the data in such a way as to highlight their similarities and differences [6,11]. Here, however, PCA was used to transform the input vectors into the new search space.…”
Section: Gpcs K (mentioning)
confidence: 99%
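The variance-retention view of PCA in this statement corresponds to keeping enough components to preserve a chosen fraction of the variance and then projecting the inputs into that space. A small sketch, assuming scikit-learn; the iris dataset and the 95% threshold are illustrative choices.

```python
# Hedged sketch: retain ~95% of the variance, then project the inputs into
# the reduced space. Dataset and threshold are assumptions.
from sklearn.datasets import load_iris
from sklearn.decomposition import PCA

X, _ = load_iris(return_X_y=True)

pca = PCA(n_components=0.95)      # keep enough components for >= 95% variance
X_new = pca.fit_transform(X)      # inputs expressed in the new search space

print("original dimension:", X.shape[1])
print("reduced dimension: ", X_new.shape[1])
print("variance retained: ", pca.explained_variance_ratio_.sum())
```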
“…ICA is an unsupervised method in which no prior knowledge about the class labels is used. It is more advantageous than PCA [4] because PCA seeks the projection with maximum variance, whereas ICA [5] seeks the projection with maximum independence. For this reason, ICA is used in this paper to maximize independence, which increases classification accuracy.…”
Section: Independent Component Analysis (mentioning)
confidence: 99%
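The contrast drawn above, PCA seeking the maximum-variance projection versus ICA seeking maximally independent components, can be made concrete by feeding both projections to the same classifier. A hedged sketch follows, assuming scikit-learn; the digits dataset, the 15-component choice, and the RBF SVM are assumptions rather than any cited configuration.

```python
# Hedged sketch: the same SVM trained on a PCA projection (max variance)
# and on a FastICA projection (max independence). All settings are assumed.
from sklearn.datasets import load_digits
from sklearn.decomposition import PCA, FastICA
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import cross_val_score

X, y = load_digits(return_X_y=True)

reducers = [("PCA", PCA(n_components=15)),
            ("ICA", FastICA(n_components=15, max_iter=500, random_state=0))]

for name, reducer in reducers:
    clf = make_pipeline(reducer, SVC(kernel="rbf", gamma="scale"))
    acc = cross_val_score(clf, X, y, cv=5).mean()
    print(f"{name} + SVM accuracy: {acc:.3f}")
```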
“…The hybrid dimensionality reduction method [1] combines these two approaches and uses both criteria. In supervised learning, LDA [2] and the Support Vector Machine [3] are two well-known dimensionality reduction techniques, whereas in unsupervised learning there are PCA [4] and Independent Component Analysis [5]; finally, there is a combined supervised and unsupervised approach known as Support Vector Machine with Independent Component Analysis.…”
Section: Introduction (mentioning)
confidence: 99%
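As a small illustration of the supervised/unsupervised split described in this introduction, the sketch below contrasts LDA, which needs the class labels to find its projection, with PCA and ICA, which do not. The wine dataset and the two-dimensional targets are assumptions for demonstration only.

```python
# Hedged sketch: supervised vs. unsupervised projections to 2 dimensions.
from sklearn.datasets import load_wine
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.decomposition import PCA, FastICA

X, y = load_wine(return_X_y=True)

Z_lda = LinearDiscriminantAnalysis(n_components=2).fit_transform(X, y)  # supervised: uses y
Z_pca = PCA(n_components=2).fit_transform(X)                            # unsupervised
Z_ica = FastICA(n_components=2, max_iter=1000, random_state=0).fit_transform(X)  # unsupervised

print(Z_lda.shape, Z_pca.shape, Z_ica.shape)
```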