2002
DOI: 10.1109/34.982904

Two variations on Fisher's linear discriminant for pattern recognition

Abstract: Discriminants are often used in pattern recognition to separate clusters of points in some multidimensional "feature" space. This paper provides two fast and simple techniques for improving on the classification performance provided by Fisher's linear discriminant for two classes. Both of these methods are also extended to nonlinear decision surfaces through the use of Mercer kernels.

Cited by 73 publications (26 citation statements)
References 6 publications
“…Recently, enhanced methods have been proposed to improve learning performance when the feature dimension is high relative to the training-set size and to reach nonlinear solutions [15][16][17][18]. Linear discriminant techniques aim at finding a discriminant vector w (also called the Fisher vector) which maximizes the Fisher criterion: …”
Section: Proposed Ridgelet Featuresmentioning
confidence: 99%
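The Fisher criterion mentioned in the statement above can be sketched numerically. A minimal NumPy sketch on synthetic two-class data (the data and variable names are illustrative assumptions, not taken from the paper): the discriminant vector w is proportional to S_w^{-1}(m1 - m2), and the criterion is the ratio of between-class to within-class scatter along w.

```python
import numpy as np

# Synthetic two-class data (illustrative only, not from the paper)
rng = np.random.default_rng(0)
X1 = rng.normal(loc=[0.0, 0.0], scale=1.0, size=(50, 2))
X2 = rng.normal(loc=[3.0, 2.0], scale=1.0, size=(50, 2))

m1, m2 = X1.mean(axis=0), X2.mean(axis=0)

# Within-class scatter S_w: sum of the (unnormalized) class scatter matrices
Sw = (X1 - m1).T @ (X1 - m1) + (X2 - m2).T @ (X2 - m2)

# Fisher vector: w proportional to S_w^{-1} (m1 - m2)
w = np.linalg.solve(Sw, m1 - m2)

# Fisher criterion J(w) = (w^T S_b w) / (w^T S_w w),
# with S_b = (m1 - m2)(m1 - m2)^T the between-class scatter
diff = (m1 - m2).reshape(-1, 1)
Sb = diff @ diff.T
J = float(w @ Sb @ w) / float(w @ Sw @ w)
```

Projecting samples onto w (i.e. computing `X @ w`) then reduces the two-class problem to thresholding a scalar, which is the setting the paper's two variations improve on.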
“…Generally speaking, SVM performs better than KFD. However, Chakrabarti et al [88] and Cooke [89] have proved the equivalence between SVM and FLD when classification is restricted to the support vectors. Typically, two parts are necessary in a tracking-by-detection approach: a) the generation and labeling of samples; b) updating the classifier.…”
Section: Kernel Fisher Discriminantmentioning
confidence: 99%
“…Subspace features have been widely used in tracking for many years [89], [90], [91], [92], [93] due to their low-dimensional representation of the tracking target's appearance. To meet the requirements of online and real-time applications, incremental subspace learning has been proposed recently [94].…”
Section: Kernel Subspace Learningmentioning
confidence: 99%
“…The most commonly used method is Fisher's linear discriminant [7] applied to the data points in the node. This discriminant minimizes the ratio of the intra-class distance to the inter-class distance.…”
Section: Computation Of Discrimination Indexmentioning
confidence: 99%
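The intra-class/inter-class ratio used as a discrimination index in the statement above can be illustrated with a short sketch. Assuming (hypothetically) that the node's samples have already been projected onto a single discriminant direction, the index compares within-class spread to between-class separation; smaller values indicate a cleaner split:

```python
import numpy as np

# Illustrative projected values for a two-class node split
# (synthetic data, not from the cited paper)
rng = np.random.default_rng(1)
A = rng.normal(-1.0, 0.5, size=100)   # class-A samples after projection
B = rng.normal(+1.0, 0.5, size=100)   # class-B samples after projection

# Intra-class distance: spread within each class (sum of variances)
intra = A.var() + B.var()

# Inter-class distance: squared separation of the class means
inter = (A.mean() - B.mean()) ** 2

# Discrimination index: the ratio the statement above says is minimized
index = intra / inter
```

Fisher's discriminant chooses the projection direction that minimizes exactly this ratio (equivalently, maximizes its reciprocal), which is why it serves directly as a discrimination index for a candidate split.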