DOI: 10.1007/978-3-540-35488-8_1
An Introduction to Feature Extraction

Abstract: This chapter introduces the reader to the various aspects of feature extraction covered in this book. Section 1 reviews definitions and notations and proposes a unified view of the feature extraction problem. Section 2 is an overview of the methods and results presented in the book, emphasizing novel contributions. Section 3 provides the reader with an entry point in the field of feature extraction by showing small revealing examples and describing simple but effective algorithms. Finally, Section 4 introduces…

Cited by 4,913 publications (6,864 citation statements)
References 15 publications (8 reference statements)
“…general linear model with a binomial cost and logit link function) (Stauffer, 2008), the two classes, 1) interactive and 2) computer control, were predicted based on the forward sequentially selected features (Guyon and Elisseeff, 2003). The classifier thus selected a subset of features from the data matrix that best predicted the class they belonged to, by sequentially selecting features until there was no improvement in prediction.…”
Section: Multivariate Classification Analysis
confidence: 99%
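The forward sequential selection procedure described in this citation statement can be sketched as follows. This is a generic illustration of greedy forward selection, not code from the citing study: the function names, the least-squares scoring criterion, and the synthetic data are all illustrative assumptions.

```python
# Sketch of forward sequential feature selection: greedily add the feature
# that most improves a score, and stop when no candidate improves it.
import numpy as np

def forward_select(X, y, score_fn, tol=0.0):
    """Greedily grow a feature subset until no feature improves
    score_fn(X[:, subset], y) by more than tol."""
    remaining = list(range(X.shape[1]))
    selected = []
    best_score = -np.inf
    while remaining:
        cand_score, cand = max(
            (score_fn(X[:, selected + [j]], y), j) for j in remaining
        )
        if cand_score <= best_score + tol:
            break  # no improvement in prediction: stop
        selected.append(cand)
        remaining.remove(cand)
        best_score = cand_score
    return selected

# Toy scorer (illustrative): negative mean squared error of a
# least-squares fit with an intercept column.
def lsq_score(Xs, y):
    A = np.c_[np.ones(len(Xs)), Xs]
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    return -np.mean((y - A @ beta) ** 2)

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
y = 2 * X[:, 1] - X[:, 3] + 0.1 * rng.normal(size=200)
print(forward_select(X, y, lsq_score))  # informative features 1 and 3 come first
```

In practice the stopping tolerance and the scorer (here training error; cross-validated error is the safer choice) determine how many features survive.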
“…This approach may miss an opportunity to identify combinations of predictors that together have good predictive power, when each individual predictor has little or no predictive power by itself [263]. This is particularly the case if non-linear interactions exist among multiple predictors.…”
Section: Using Machine Learning Methods
confidence: 99%
“…The vast number of feature subsets necessitates applying heuristic search techniques, with various accuracy/computation tradeoffs (Guyon and Elisseeff, 2003). Filtering methods apply knowledge of the class labels to evaluate the discrimination power either of individual genes (univariate) or collections of genes (multivariate), based on criteria such as signal-to-noise ratio, correlation measures, and mutual information, before classifier training.…”
Section: Predictive Classification
confidence: 99%
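A univariate filter of the kind this statement describes can be sketched as below. The signal-to-noise ratio used here is the common two-class form (difference of class means over the sum of class standard deviations); the data and variable names are illustrative, not from the citing paper.

```python
# Sketch of a univariate filter: score each feature's class-discrimination
# power before any classifier is trained, then rank the features.
import numpy as np

def signal_to_noise(X, y):
    """Per-feature |mu1 - mu0| / (sigma1 + sigma0) for binary labels y."""
    X0, X1 = X[y == 0], X[y == 1]
    num = np.abs(X1.mean(axis=0) - X0.mean(axis=0))
    den = X1.std(axis=0) + X0.std(axis=0)
    return num / den

rng = np.random.default_rng(1)
y = rng.integers(0, 2, size=300)
X = rng.normal(size=(300, 10))
X[:, 4] += 1.5 * y  # make feature 4 discriminative

ranking = np.argsort(signal_to_noise(X, y))[::-1]
print(ranking[:3])  # feature 4 should rank first
```

Correlation or mutual-information scores slot into the same pattern; a multivariate filter would instead score collections of features jointly.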
“…These methods can improve predictive power by capturing higher-order (and complex, nonlinear) joint feature effects. Perhaps the simplest example is the 'noisy XOR problem', for which two individual features and their linear combinations have no discrimination power, but a simple nonlinear combination is perfectly discriminating (Duda et al., 2001; Guyon and Elisseeff, 2003; Figure 2). …”
Section: Predictive Classification
confidence: 99%
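The noisy XOR point in this statement can be demonstrated numerically. This is a generic sketch of the setup, not the exact construction of Duda et al. (2001): points sit near the four corners of a square, the label is the XOR of the coordinate signs, each coordinate alone carries no class information, yet the nonlinear feature x1*x2 separates the classes almost perfectly.

```python
# Noisy XOR: marginals are uninformative, the product feature is not.
import numpy as np

rng = np.random.default_rng(2)
n = 1000
S = rng.choice([-1.0, 1.0], size=(n, 2))     # clean corner signs
y = (S[:, 0] * S[:, 1] > 0).astype(int)      # XOR-style labels
X = S + 0.2 * rng.normal(size=(n, 2))        # noisy observed features

# Each marginal mean is (nearly) identical across the two classes...
for j in (0, 1):
    print(X[y == 0, j].mean(), X[y == 1, j].mean())

# ...but thresholding the nonlinear combination x1*x2 at zero
# recovers the labels almost perfectly.
accuracy = ((X[:, 0] * X[:, 1] > 0).astype(int) == y).mean()
print(accuracy)
```

Any univariate filter scores both coordinates near zero here, which is exactly the "missed combination of predictors" failure mode the citing authors warn about.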