1995
DOI: 10.1214/aos/1176324456

Penalized Discriminant Analysis

Cited by 715 publications (479 citation statements)
References 16 publications
“…Curve discrimination Different alternatives have been proposed for the curve discrimination problem, although mainly for univariate functions. Two of the first methods were a regularized version of LDA called penalized discriminant analysis (PDA) proposed by Hastie et al (1995), and a generalized linear regression approach proposed by Marx and Eilers (1999). More recently, some non-parametric alternatives have been proposed, such as the kernel one by Ferraty and Vieu (2003), the k-NN one by Burba et al (2009) or the local linear one by Barrientos-Marin et al (2010).…”
Section: Comparison Of The Three Approaches (mentioning)
confidence: 99%
“…We, instead, adopt a direct, many-fold cross-validation approach, ignoring smoothing at the regression stage, by using the misclassification rate as the figure of merit during cross-validation. This is particularly convenient in the current formulation because, as shown in [24], the Mahalanobis distance from class centroids (adjusted for the class prior distribution) can be simply and quickly computed in canonical variates by the following expression:…”
Section: Sparse Multinomial Kernel Discriminant Analysis (SMKDA) (mentioning)
confidence: 99%
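The excerpt above notes that class-prior-adjusted Mahalanobis distances to class centroids become cheap to evaluate in canonical variates. A minimal sketch of that idea, assuming a standard LDA setup (the function names and the identity-whitening route are illustrative choices, not the cited paper's exact expression, which is elided above):

```python
import numpy as np

def fit_lda_canonical(X, y):
    """Fit class means, priors, and a whitening map for the pooled
    within-class covariance. After whitening, plain Euclidean distance
    equals Mahalanobis distance, which is the convenience the text
    refers to. Illustrative sketch, not the cited formulation."""
    classes = np.unique(y)
    n = X.shape[0]
    means = np.array([X[y == c].mean(axis=0) for c in classes])
    priors = np.array([(y == c).mean() for c in classes])
    # pooled within-class covariance (unbiased pooling over classes)
    Sw = sum(np.cov(X[y == c].T, bias=False) * (np.sum(y == c) - 1)
             for c in classes) / (n - len(classes))
    # symmetric inverse square root of Sw gives the whitening map
    evals, evecs = np.linalg.eigh(Sw)
    W = evecs @ np.diag(evals ** -0.5) @ evecs.T
    return means, priors, W, classes

def classify(X, means, priors, W, classes):
    """Assign each row to the class minimizing squared Mahalanobis
    distance to its centroid, adjusted by the log prior (standard LDA
    classification rule)."""
    Z = X @ W          # samples in whitened (canonical-like) coordinates
    M = means @ W      # class centroids in the same coordinates
    d2 = ((Z[:, None, :] - M[None, :, :]) ** 2).sum(axis=2)
    scores = -0.5 * d2 + np.log(priors)
    return classes[np.argmax(scores, axis=1)]
```

The whitening step is what makes the rule fast: distances reduce to ordinary squared differences plus a per-class constant.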
“…Our intention was then to apply a regressor selection technique such as OLS to the least-squares stage. Subsequently, a more convenient formulation [21] based on penalized optimal scoring was shown via canonical correlation analysis to be able to provide exactly the canonical variates [24], i.e. identical direction and scaling to LDA.…”
Section: Introduction (mentioning)
confidence: 99%
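The excerpt above refers to recovering LDA's canonical variates through a penalized optimal-scoring regression. A minimal sketch of that route, under assumptions not taken from the cited text (a ridge penalty with identity penalty matrix, centered predictors, and hypothetical function names):

```python
import numpy as np

def penalized_optimal_scoring(X, y, lam=1e-2):
    """Sketch of discriminant directions via penalized optimal scoring:
    regress the class-indicator matrix on X with a ridge penalty, then
    solve a small K x K eigenproblem for the optimal scores. The
    resulting directions span the canonical variates up to scaling."""
    n, p = X.shape
    classes, yi = np.unique(y, return_inverse=True)
    K = len(classes)
    Y = np.zeros((n, K))
    Y[np.arange(n), yi] = 1.0          # indicator response matrix
    Xc = X - X.mean(axis=0)
    # penalized least-squares fit of the indicators (Omega = I assumed)
    B = np.linalg.solve(Xc.T @ Xc + lam * np.eye(p), Xc.T @ Y)
    Yhat = Xc @ B
    # K x K eigenproblem: (Y'Y)^{-1} Y'Yhat; with centered X the
    # constant score gets eigenvalue 0, so keep the top K-1 vectors
    M = np.diag(1.0 / Y.sum(axis=0)) @ (Y.T @ Yhat)
    evals, Theta = np.linalg.eig(M)
    order = np.argsort(np.real(evals))[::-1]
    Theta = np.real(Theta[:, order[:K - 1]])
    return B @ Theta                   # discriminant directions, p x (K-1)
```

The appeal of this formulation, as the excerpt notes, is that the heavy lifting is an ordinary penalized regression; only a K x K problem remains to extract the discriminant directions.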
“…In Sec. 4, the classification behavior of the combination of three feature extraction methods (no feature extraction, FDA and quadratic FDA), together with a classifier based on Linear Discriminant Analysis (LDA), will be compared. Six experiments will be performed using six standard data sets encountered in the pattern recognition literature.…”
Citation type: mentioning
confidence: 99%
“…LDA is directly applied to the data. Next, FDA is used as a feature extraction method, and finally QFDA is used as a feature extraction method. From Table 1 we can deduce that feature extraction positively affects classifier behavior, and QFDA introduces an improvement of 4% in the classification rate compared with the cases with no feature extraction or when using FDA.…”
Citation type: mentioning
confidence: 99%