2015
DOI: 10.1155/2015/198363
A Review of Feature Selection and Feature Extraction Methods Applied on Microarray Data

Abstract: We summarise various ways of performing dimensionality reduction on high-dimensional microarray data. Many different feature selection and feature extraction methods exist and are widely used. All these methods aim to remove redundant and irrelevant features so that classification of new instances will be more accurate. A popular source of data is microarrays, a biological platform for gathering gene expressions. Analysing microarrays can be difficult due to the size of the data they provide. In add…

Cited by 770 publications (521 citation statements)
References 85 publications
“…FE involves the construction of a new (smaller) feature set derived from the full dataset (Hira & Gillies, 2015). In this study, this dataset was constructed using the principal component analysis (PCA) tool within ArcMap v10.3.…”
Section: Methods
confidence: 99%
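The feature extraction (FE) step the citing study describes, deriving a smaller feature set from the full dataset via PCA, can be sketched as follows. This is an illustrative sketch using scikit-learn on synthetic data, not the study's actual ArcMap workflow; the array sizes are arbitrary.

```python
# Illustrative PCA feature extraction: derive a smaller feature set
# from a high-dimensional dataset (synthetic data, not from the paper).
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 50))   # 100 samples, 50 original features

pca = PCA(n_components=5)        # construct a new (smaller) derived feature set
X_reduced = pca.fit_transform(X)

print(X_reduced.shape)           # (100, 5): 5 extracted components per sample
```

Note that PCA produces linear combinations of the original features rather than a subset of them, which is the defining difference between feature extraction and feature selection.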
“…Moreover, using extracted features to predict the class of new cases sometimes leads to poor performance, which means that the method performs well on training data but fails to classify the testing data well. Hira and Gillies (2015) comprehensively discuss the weaknesses and advantages of feature selection and extraction. This situation leads to the suggestion that the full features should be used in some cases.…”
Section: Introduction
confidence: 99%
“…Feature selection methods can be structured into three clusters: filter-based, wrapper-based and embedded [193]. All have some major drawbacks and we therefore propose a hybrid feature selection (HFS) algorithm.…”
Section: Feature Selection
confidence: 99%
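Of the three clusters named above, the filter-based approach is the simplest to show concretely: features are ranked by a statistical score computed independently of any classifier. The sketch below uses scikit-learn's univariate ANOVA F-test filter on synthetic data; it is an illustrative example, not the hybrid (HFS) algorithm the citing paper proposes.

```python
# Illustrative filter-based feature selection: rank features by a
# univariate statistic (ANOVA F-test) and keep the top k.
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectKBest, f_classif

# Synthetic data: 200 samples, 30 features, only 5 truly informative.
X, y = make_classification(
    n_samples=200, n_features=30, n_informative=5, random_state=0
)

selector = SelectKBest(score_func=f_classif, k=5)  # filter: no classifier involved
X_sel = selector.fit_transform(X, y)

print(X_sel.shape)  # (200, 5): the 5 highest-scoring original features
```

By contrast, a wrapper would score candidate feature subsets by training a classifier on each, and an embedded method would read feature importances out of a model that selects features while it trains (e.g. L1-regularised models).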
“…Deterministic wrappers such as sequential selection and its variants, e.g. sequential forward selection (SFS), add or remove features sequentially. They are computationally intensive ('greedy search') and carry a risk of overfitting [193]. On the other hand, randomised methods such as simulated annealing, genetic algorithms and differential evolution incorporate randomness into their search procedure to escape local minima but still tend to suffer from overfitting [193].…”
Section: Background Problems
confidence: 99%
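The greedy forward search described in the quote, adding one feature at a time based on cross-validated classifier performance, can be sketched with scikit-learn's `SequentialFeatureSelector`. This is one concrete implementation chosen for illustration; the estimator, data, and parameters below are assumptions, not taken from the cited work.

```python
# Illustrative sequential forward selection (SFS): a deterministic,
# greedy wrapper that adds the single best feature at each step,
# scored by cross-validated classifier accuracy.
from sklearn.datasets import make_classification
from sklearn.feature_selection import SequentialFeatureSelector
from sklearn.linear_model import LogisticRegression

X, y = make_classification(
    n_samples=120, n_features=10, n_informative=4, random_state=0
)

sfs = SequentialFeatureSelector(
    LogisticRegression(max_iter=1000),
    n_features_to_select=3,
    direction="forward",  # greedy: grow the subset one feature at a time
    cv=3,
)
sfs.fit(X, y)

print(sfs.get_support().sum())  # 3 features kept
```

The computational cost noted in the quote follows directly from this structure: each step retrains the wrapped classifier once per remaining candidate feature, so the number of model fits grows roughly as (features × steps × CV folds).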