2001
DOI: 10.1109/72.914517

An introduction to kernel-based learning algorithms

Abstract: This paper provides an introduction to support vector machines (SVMs), kernel Fisher discriminant analysis, and kernel principal component analysis (PCA), as examples of successful kernel-based learning methods. We first give a short background on Vapnik-Chervonenkis (VC) theory and kernel feature spaces and then proceed to kernel-based learning in supervised and unsupervised scenarios, including practical and algorithmic considerations. We illustrate the usefulness of kernel algorithms by finally …
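The abstract names kernel PCA among the surveyed methods. As an illustrative sketch only (not code from the paper), kernel PCA can be carried out with NumPy by building a kernel matrix, centring it in feature space, and projecting onto the leading nonlinear components; the RBF kernel, gamma, and n_components below are assumptions for the example.

# A minimal kernel PCA sketch with NumPy, using an RBF kernel.
# gamma and n_components are illustrative values, not from the paper.
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    """k(x, y) = exp(-gamma * ||x - y||^2), computed for all pairs."""
    sq_dists = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq_dists)

def kernel_pca(X, n_components=2, gamma=1.0):
    n = X.shape[0]
    K = rbf_kernel(X, X, gamma)
    # Centre the kernel matrix in feature space.
    one_n = np.full((n, n), 1.0 / n)
    Kc = K - one_n @ K - K @ one_n + one_n @ K @ one_n
    # Eigendecomposition; np.linalg.eigh returns eigenvalues in ascending order.
    eigvals, eigvecs = np.linalg.eigh(Kc)
    idx = np.argsort(eigvals)[::-1][:n_components]
    eigvals, eigvecs = eigvals[idx], eigvecs[:, idx]
    # Projections of the training points onto the nonlinear principal components.
    return eigvecs * np.sqrt(np.clip(eigvals, 0.0, None))

X = np.random.RandomState(0).randn(100, 5)
Z = kernel_pca(X, n_components=2, gamma=0.5)
print(Z.shape)  # (100, 2)

The same centring and eigendecomposition steps apply unchanged for any other positive-definite kernel.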

Cited by 3,029 publications (1,202 citation statements)
References 89 publications
“…Nonlinear methods to compute Fisher discriminants have been developed in recent years using kernel-based approaches (Müller et al. 2001). Similarly, one might ask whether using nonlinear dynamical EEG measures instead of our linear ones might have improved the results of our linear or an alternative nonlinear discrimination method.…”
Section: Discussion
confidence: 99%
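The statement above refers to kernel-based Fisher discriminants. As an illustrative sketch only (assuming an RBF kernel and a small ridge regulariser mu, neither taken from the cited work), the two-class kernel Fisher discriminant can be computed in its dual form as follows:

# A minimal two-class kernel Fisher discriminant sketch with NumPy.
# Kernel choice, gamma and mu are assumptions for the example.
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    sq = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

def kernel_fisher(X, y, gamma=1.0, mu=1e-3):
    """Return dual coefficients alpha so that f(x) = sum_i alpha_i k(x_i, x)."""
    K = rbf_kernel(X, X, gamma)                  # n x n kernel matrix
    n = len(y)
    idx0, idx1 = np.where(y == 0)[0], np.where(y == 1)[0]
    m0 = K[:, idx0].mean(axis=1)                 # dual class means in feature space
    m1 = K[:, idx1].mean(axis=1)
    # Within-class scatter in the dual representation.
    N = np.zeros((n, n))
    for idx in (idx0, idx1):
        Kc = K[:, idx]
        l = len(idx)
        N += Kc @ (np.eye(l) - np.full((l, l), 1.0 / l)) @ Kc.T
    # The discriminant direction is proportional to N^{-1} (m1 - m0);
    # mu * I regularises the typically ill-conditioned scatter matrix.
    return np.linalg.solve(N + mu * np.eye(n), m1 - m0)

rng = np.random.RandomState(0)
X = np.vstack([rng.randn(30, 2), rng.randn(30, 2) + 2.0])
y = np.array([0] * 30 + [1] * 30)
alpha = kernel_fisher(X, y, gamma=0.5)
scores = rbf_kernel(X, X, gamma=0.5) @ alpha     # projections f(x_i) of the training points

A new point x is then scored as f(x) = sum_i alpha_i k(x_i, x), and a threshold on f separates the two classes.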
“…Typically cross-validation techniques are applied here, where an appropriate range of the hyperparameters is scanned (e.g. ref 7). A proper model selection is the key to a successful application of modern learning techniques; however, at the same time it is typically computationally very expensive.…”
Section: Methods
confidence: 99%
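The statement above describes scanning a range of hyperparameters with cross-validation. A minimal sketch, assuming scikit-learn and an RBF-kernel SVM with illustrative grid values for C and gamma (none of which come from the cited works):

# Hyperparameter selection by cross-validated grid search (scikit-learn assumed).
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import GridSearchCV

rng = np.random.RandomState(0)
X = rng.randn(200, 5)
y = (X[:, 0] * X[:, 1] > 0).astype(int)      # a toy nonlinear target

# Scan a logarithmic range of the RBF-SVM hyperparameters C and gamma.
param_grid = {"C": np.logspace(-2, 2, 5), "gamma": np.logspace(-3, 1, 5)}
search = GridSearchCV(SVC(kernel="rbf"), param_grid, cv=5)
search.fit(X, y)

print(search.best_params_, search.best_score_)

As the quoted passage notes, the cost grows with the grid size times the number of folds, which is why model selection is often the computationally expensive part.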
“…The most frequently used kernel functions are the polynomial kernel, the sigmoid kernel, and the radial basis kernel functions (RBF) [31][32][33]. Since the RBF is usually used in classification using support vector machines [34] and was found by Huang et al [21] to exhibit the best performance, it is used in this work.…”
Section: Classification by SVM
confidence: 99%
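For reference, the three kernel functions named in the statement above can be written down directly; a sketch with NumPy, using common parameterisations (gamma, degree, and coef0 are assumed names, not taken from the cited works):

# The polynomial, sigmoid, and RBF kernel functions, computed with NumPy.
import numpy as np

def polynomial_kernel(X, Y, degree=3, gamma=1.0, coef0=1.0):
    """k(x, y) = (gamma * <x, y> + coef0) ** degree"""
    return (gamma * X @ Y.T + coef0) ** degree

def sigmoid_kernel(X, Y, gamma=1.0, coef0=0.0):
    """k(x, y) = tanh(gamma * <x, y> + coef0)"""
    return np.tanh(gamma * X @ Y.T + coef0)

def rbf_kernel(X, Y, gamma=1.0):
    """k(x, y) = exp(-gamma * ||x - y||^2)"""
    sq = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

X = np.random.RandomState(0).randn(4, 3)
print(rbf_kernel(X, X).shape)  # (4, 4) Gram matrix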