2016 IEEE 6th International Conference on Advanced Computing (IACC)
DOI: 10.1109/iacc.2016.16
Comparative Study of Feature Subset Selection Methods for Dimensionality Reduction on Scientific Data

Cited by 34 publications (21 citation statements) · References 8 publications
“…Feature selection aims to select a feature subset from the original set of features based on the features' relevance and redundancy. Originally, evaluation methods in feature selection were divided into four kinds: filter, wrapper, embedded [10,14,18], and hybrid [20,29]. Recently, another type of evaluation method has been developed, i.e., ensemble feature selection [30,31].…”
Section: Feature Selection Methods
confidence: 99%
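The excerpt above frames feature selection as scoring each feature's relevance to the target. A minimal filter-style sketch of that idea, using plain Pearson correlation as the relevance score (the helper names are illustrative, not from the cited papers):

```python
# Filter-style feature selection sketch (hypothetical helpers, not the
# paper's method): rank features by |Pearson correlation| with the target
# and keep the top k. Pure-Python, no external dependencies.

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy) if sx and sy else 0.0

def filter_select(X, y, k):
    """X: list of sample rows, y: labels. Returns indices of top-k features."""
    n_features = len(X[0])
    cols = [[row[j] for row in X] for j in range(n_features)]
    scores = [abs(pearson(col, y)) for col in cols]
    # Filter methods rank features independently of any learner.
    return sorted(range(n_features), key=lambda j: scores[j], reverse=True)[:k]

X = [[1, 0, 5], [2, 0, 3], [3, 1, 1], [4, 1, 2]]
y = [0, 0, 1, 1]
print(filter_select(X, y, 2))  # → [1, 0]
```

A redundancy term (e.g., penalizing correlation among already-selected features, as in mRMR) would be layered on top of this relevance ranking.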
“…Feature selection is utilized to reduce the impact of dimensionality on the dataset by finding the subset of features that efficiently defines the data [18,19]. It selects the features important and relevant to the mining task from the input data and removes redundant and irrelevant features [20,21].…”
Section: A Feature Selection
confidence: 99%
“…As a result of a sequential feature selection (SFS) [14], the most useful solution proved to be combining global feature systems: First Order Statistics (FOS), Contour First Order Statistics (CFOS) and Local Range Filter (LRF). The types of the FOS features selected by SFS are mean, standard deviation, kurtosis and skewness.…”
Section: Results
confidence: 99%
“…My previous paper [9] gives a comparative study of traditional feature subset selection techniques, such as Sequential Forward Selection (SFS) and Sequential Floating Forward Selection (SFFS), against RSFS. SFS finds the next best feature relative to the existing feature set, as in forward selection (top-down approach) [10], but it suffers from the nesting problem: once a feature is retained, it cannot be discarded. Sequential Backward Selection (SBS), which removes the worst feature as in backward selection (bottom-up approach), suffers from higher computational cost than SFS as well as the nesting problem. SFFS avoids the nesting of features by performing both (SFS and SBS) recursively [11].…”
Section: Random Subset Feature Selection Comparison With Traditional
confidence: 99%
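The greedy SFS procedure described in this excerpt can be sketched as follows. The `score_fn` callback is a hypothetical stand-in for whatever subset-quality measure a wrapper would use (e.g., cross-validated accuracy); it is not part of the cited paper:

```python
# Sequential Forward Selection (SFS) sketch: greedily add the single
# feature that most improves a caller-supplied subset score.
# `score_fn` is an assumed callback, not an API from the cited work.

def sfs(n_features, k, score_fn):
    selected = []
    remaining = set(range(n_features))
    while len(selected) < k and remaining:
        # Nesting problem: once a feature enters `selected`, plain SFS
        # never reconsiders it. SFFS adds a conditional backward
        # (SBS-style) step after each addition to escape this.
        best = max(remaining, key=lambda f: score_fn(selected + [f]))
        selected.append(best)
        remaining.remove(best)
    return selected

# Toy additive score: each feature contributes a fixed weight.
weights = [0.1, 0.5, 0.3]
print(sfs(3, 2, lambda subset: sum(weights[f] for f in subset)))  # → [1, 2]
```

With a truly additive score the greedy choice is optimal; the nesting problem only bites when features interact, which is exactly the case SFFS's floating backward step is meant to handle.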