2011 5th International Conference on Application of Information and Communication Technologies (AICT)
DOI: 10.1109/icaict.2011.6110912

Recognition of different datasets using PCA, LDA, and various classifiers

Cited by 17 publications (11 citation statements)
References 9 publications
“…It also gives us the chance to select the useful components by separating them from the useless ones using the component variances. This selection is an important step because retaining the low-ranked components would degrade the learning curves and the classification process [12]. As a result, the study ended up with approximately thirteen new principal components.…”
Section: E. Principal Component Analysis (PCA)
Citation type: mentioning
confidence: 99%
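The selection step quoted above can be illustrated with a short sketch. This is a minimal NumPy example, not the cited study's code: components are ranked by their variance (eigenvalue), and the low-ranked ones are discarded. The 90% retained-variance threshold and the synthetic data are illustrative assumptions.

import numpy as np

def select_principal_components(X, retained_variance=0.90):
    # Center the data, then eigendecompose the feature covariance matrix.
    Xc = X - X.mean(axis=0)
    eigvals, eigvecs = np.linalg.eigh(np.cov(Xc, rowvar=False))
    order = np.argsort(eigvals)[::-1]            # rank components by variance
    eigvals, eigvecs = eigvals[order], eigvecs[:, order]
    # Keep the smallest number of components that reaches the variance target.
    cumulative = np.cumsum(eigvals) / eigvals.sum()
    k = int(np.searchsorted(cumulative, retained_variance)) + 1
    return Xc @ eigvecs[:, :k]                   # project onto the top-k axes

X = np.random.default_rng(0).normal(size=(150, 40))
print(select_principal_components(X).shape)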
“…The study had to use the PCA to identify underlying hidden fusions and use the orthogonal transformation to transform the MF correlated features into a new set of linearly uncorrelated attributes. By implementing the PCA, the study was able to exploit and reduce the MF dimensions to a new set of principal components that could be used in representation, evaluation, and, more importantly, classification [10], [12].…”
Section: E. Principal Component Analysis (PCA)
Citation type: mentioning
confidence: 99%
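A minimal sketch of the orthogonal transformation described above, assuming scikit-learn's PCA. The choice of n_components=13 echoes the "approximately thirteen" components mentioned earlier, and the synthetic data stand in for the citing study's MF features.

import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 50))

pca = PCA(n_components=13)          # illustrative, per the passage above
Z = pca.fit_transform(X)            # orthogonal projection onto principal axes

# The transformed attributes are linearly uncorrelated: the covariance
# matrix of Z is diagonal up to numerical precision.
off_diagonal = np.cov(Z, rowvar=False) - np.diag(np.var(Z, axis=0, ddof=1))
print(Z.shape, np.allclose(off_diagonal, 0, atol=1e-10))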
“…On the hyperplane, the distances among vectors of the same class are minimal, while the distance between different class centers is maximal. In this way, LDA offers more class separability and draws a decision region between the given classes [48].…”
Section: Classifier
Citation type: mentioning
confidence: 99%
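As a hedged illustration of this separability argument (not the setup of [48]), scikit-learn's LinearDiscriminantAnalysis can be fit directly as a classifier; the dataset below is synthetic.

from sklearn.datasets import make_classification
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=300, n_features=10, n_informative=5,
                           n_classes=3, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

lda = LinearDiscriminantAnalysis()
lda.fit(X_train, y_train)         # learns class means and a shared covariance
print(lda.score(X_test, y_test))  # accuracy of the linear decision regions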
“…It finds the projection axes that place the data samples of dissimilar classes far from each other and bring the data samples of the same class close together. Hence, LDA generates a linear combination of the data samples that achieves the largest differences between the classes [23]. At the same time, the SVM determines decision boundaries according to decision planes that separate the feature sets of the different classes.…”
Section: Performance Evaluation Comparative Study
Citation type: mentioning
confidence: 99%
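A minimal sketch of combining the two ideas the passage contrasts, assuming scikit-learn: LDA projects the samples onto its discriminant axes, and an SVM then places separating decision planes in that projected space. The pipeline, kernel choice, and synthetic data are illustrative assumptions, not the compared systems from the study.

from sklearn.datasets import make_classification
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC

X, y = make_classification(n_samples=400, n_features=20, n_informative=8,
                           n_classes=4, random_state=1)

# LDA projects onto at most n_classes - 1 discriminant axes; the SVM then
# draws its separating decision planes in that lower-dimensional space.
clf = make_pipeline(LinearDiscriminantAnalysis(n_components=3),
                    SVC(kernel="linear"))
print(cross_val_score(clf, X, y, cv=5).mean())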