2014
DOI: 10.5120/15932-5157
Target Class Supervised Feature Subsetting

Abstract: Dimensionality reduction can have contradicting effects: the advantage of minimizing the number of features comes coupled with the disadvantage of information loss, leading to incorrect classification or clustering. This is a problem when one tries to extract all classes present in a high-dimensional population. In real life, however, one is often not interested in all classes present in a high-dimensional space, but focuses on one, two, or a few classes at any g…
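The truncated abstract does not show the paper's actual subsetting criterion, but the idea of supervising feature selection by a single target class can be sketched with a simple one-vs-rest separation score. The Fisher-style ratio below is a hypothetical illustration of that idea, not the paper's method.

```python
import numpy as np

def target_class_feature_scores(X, y, target):
    """Score each feature by how well it separates the chosen target
    class from all remaining samples (a simple Fisher-style ratio;
    hypothetical, not the paper's actual criterion)."""
    mask = (y == target)
    mu_t, mu_r = X[mask].mean(axis=0), X[~mask].mean(axis=0)
    var_t, var_r = X[mask].var(axis=0), X[~mask].var(axis=0)
    return (mu_t - mu_r) ** 2 / (var_t + var_r + 1e-12)

# Toy data: feature 0 separates class 1 from the rest; feature 1 is noise.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 2))
y = np.array([1] * 50 + [0] * 50)
X[:50, 0] += 5.0  # shift the target class along feature 0 only

scores = target_class_feature_scores(X, y, target=1)
# Keep only the top-k features for the one class of interest,
# rather than features that discriminate all classes at once.
top = np.argsort(scores)[::-1][:1]
```

Because only the target class's separability enters the score, features irrelevant to the other classes can be dropped without the information loss the abstract warns about for the full multi-class case.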

Cited by 2 publications (1 citation statement)
References 28 publications
“…Extrapolation of features in a dataset extends the data while varying the variance in the feature space to be reduced. A similar effort has been made to illustrate dimensionality reduction in hyperspectral data [50]. Many random combinations of features have been explored to understand how the proposed incremental dimensionality reduction works.…”
Section: Simulation of Big Feature Space through Exploitation of Iris
Confidence: 99%
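The citing work describes extrapolating features to simulate a big feature space from a small dataset such as Iris. The cited paper's exact scheme is not specified here; one common way to do this, sketched below under that assumption, is to append random linear combinations of the existing columns.

```python
import numpy as np

def extrapolate_features(X, n_new, seed=None):
    """Grow the feature space by appending random linear combinations
    of the existing columns (one plausible way to simulate a big
    feature space; the cited work's exact scheme is not given here)."""
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    W = rng.normal(size=(d, n_new))  # random mixing weights per new column
    return np.hstack([X, X @ W])     # original columns kept, new ones appended

# Stand-in array with Iris-like shape on the column axis (4 features).
X = np.arange(20.0).reshape(5, 4)
X_big = extrapolate_features(X, n_new=16, seed=0)  # 4 original + 16 synthesized
```

Each synthesized column is correlated with the originals by construction, which is what makes the extrapolated space a useful stress test for a dimensionality-reduction method: the added variation is real but redundant, so an effective reducer should recover something close to the original feature count.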