Principal Component Analysis based Feature Vector Extraction
2015
DOI: 10.17485/ijst/2015/v8i35/77760

Cited by 10 publications (3 citation statements)
References 8 publications
“…A simple, non-parametric technique called PCA can be used to extract relevant information from large datasets and reduce the number of dimensions without sacrificing information [22]. It calculates the distance between two objects using the Euclidean distance concept.…”
Section: Principal Component Analysis (PCA), mentioning
confidence: 99%
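To make the quoted description concrete, here is a minimal NumPy sketch of PCA as an eigendecomposition of the covariance matrix on synthetic, correlated data; the data, the choice of two retained components, and the distance check are assumptions for illustration, not taken from the cited paper. The final lines show that the Euclidean distance between two objects is roughly preserved after projection, since the discarded components carry little variance.

```python
import numpy as np

# Hypothetical toy data: 200 samples with 5 correlated features.
rng = np.random.default_rng(0)
base = rng.normal(size=(200, 2))
X = base @ rng.normal(size=(2, 5)) + 0.05 * rng.normal(size=(200, 5))

# Centre the data and eigendecompose the covariance matrix.
Xc = X - X.mean(axis=0)
cov = np.cov(Xc, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)      # eigenvalues in ascending order
order = np.argsort(eigvals)[::-1]           # re-sort descending by variance
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

# Keep the top-2 principal components, which capture most of the variance.
Z = Xc @ eigvecs[:, :2]

# Euclidean distances between two objects change little in the reduced space.
d_orig = np.linalg.norm(Xc[0] - Xc[1])
d_red = np.linalg.norm(Z[0] - Z[1])
print(f"original-space distance: {d_orig:.3f}, reduced-space distance: {d_red:.3f}")
```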
“…Principal component analysis (Hidayat et al, 2011; Lhazmir et al, 2017; Murali, 2015) is a feature extraction method used to reduce the number of features and dimensions and so improve the computational efficiency of the model. The data set usually comprises several correlated variables, which suggests the presence of redundant variation.…”
Section: Feature Extraction - Principal Components Analysis (PCA), mentioning
confidence: 99%
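A short sketch of PCA as a feature extractor over redundant, correlated variables, assuming scikit-learn's PCA and a synthetic data set; both the data and the 95% explained-variance threshold are illustrative assumptions, not the cited authors' setup. Ten observed features generated from three underlying signals collapse to about three extracted components.

```python
import numpy as np
from sklearn.decomposition import PCA

# Hypothetical data set with redundant variables: 3 underlying signals
# expanded into 10 observed, highly correlated features.
rng = np.random.default_rng(1)
signals = rng.normal(size=(500, 3))
X = signals @ rng.normal(size=(3, 10)) + 0.01 * rng.normal(size=(500, 10))

# Keep just enough components to explain 95% of the variance.
pca = PCA(n_components=0.95)
X_reduced = pca.fit_transform(X)

print("original features:", X.shape[1])
print("extracted components:", X_reduced.shape[1])
print("explained variance ratio:", np.round(pca.explained_variance_ratio_, 3))
```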
“…The principal reason for using PCA is to reduce the dimensionality of the space and hence its complexity. The maximum number of principal components is the number of variables in the original space [18]. The linear transformation maps the original n-dimensional space into an m-dimensional feature subspace.…”
Section: Principal Component Analysis, mentioning
confidence: 99%
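The linear transformation and the component-count upper bound described above can be illustrated with scikit-learn's PCA; the dimensions n = 6 and m = 2 and the random data are assumptions made only for this sketch. Projection is just a matrix product of the centred data with the m x n component matrix, and retaining all n components reproduces the original space exactly.

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(2)
n, m = 6, 2                       # n original variables, m-dimensional subspace
X = rng.normal(size=(100, n))

pca = PCA(n_components=m).fit(X)

# The fitted model is a linear map: W has shape (m, n), one row per component,
# so projecting is a matrix product with the centred data.
W = pca.components_
Z = (X - pca.mean_) @ W.T
assert np.allclose(Z, pca.transform(X))

# Keeping every component (m = n) reconstructs the original space exactly,
# i.e. the number of variables is an upper bound on the component count.
full = PCA(n_components=n).fit(X)
X_back = full.inverse_transform(full.transform(X))
assert np.allclose(X_back, X)
print("projection shape:", Z.shape)   # (100, 2)
```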