Deep Least Squares Fisher Discriminant Analysis
2020
DOI: 10.1109/tnnls.2019.2906302

Abstract: This is an author-produced version of the published paper, available at https://repositorio.uam.es.

Cited by 27 publications (16 citation statements)
References 38 publications (31 reference statements)
“…Second, in this study we focus only on binary classification; it is also necessary to further explore algorithms for imbalanced multiclass classification tasks. To achieve this goal, we will refer to the definition of the total AUC for multiclass classification in [58] and modify the relevant objective functions accordingly. Third, it is worthwhile to extend direct AUC optimization to other algorithms in the family of kernel ridge regression to further improve classification performance.…”
Section: Discussion
confidence: 99%
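
The statement above proposes extending direct AUC optimization from binary to multiclass problems. The "total AUC" definition of [58] is not reproduced here; a common multiclass extension, assumed for illustration only, is the average of pairwise one-vs-one AUCs (Hand & Till, 2001), sketched below.

```python
# Hedged sketch: average of pairwise one-vs-one AUCs (Hand & Till, 2001).
# Whether this matches the "total AUC" of [58] is an assumption; that
# reference is not quoted in the excerpt above.
import numpy as np
from itertools import combinations
from sklearn.metrics import roc_auc_score

def average_pairwise_auc(y_true, scores):
    """y_true: (n,) integer labels in {0..c-1}; scores: (n, c) class scores."""
    aucs = []
    for i, j in combinations(np.unique(y_true), 2):
        mask = np.isin(y_true, (i, j))
        # Discriminate class i from class j by the difference of their scores.
        s = scores[mask, i] - scores[mask, j]
        aucs.append(roc_auc_score(y_true[mask] == i, s))
    return float(np.mean(aucs))
```

For probability outputs, scikit-learn's `roc_auc_score(y_true, y_prob, multi_class='ovo')` performs the same one-vs-one averaging.
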
“…Deep discriminant analysis metric learning methods use the idea of Fisher discriminant analysis (Fisher, 1936; Ghojogh et al., 2019b) in deep learning to learn an embedding space that separates classes. Some of these methods are deep probabilistic discriminant analysis (Li et al., 2019), discriminant analysis with virtual samples (Kim & Song, 2021), Fisher Siamese losses (Ghojogh et al., 2020f), and deep Fisher discriminant analysis (Díaz-Vico et al., 2017; Díaz-Vico & Dorronsoro, 2019). The Fisher Siamese losses were already introduced in Section 5.3.13.…”
Section: Deep Discriminant Analysis Metric Learning
confidence: 99%
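
As a rough illustration of the idea these methods share (shaping a network's embedding space with within- and between-class scatter), a minimal Fisher-style embedding loss is sketched below in PyTorch. This is a generic sketch, not the loss of any particular method cited above; the function name and the within/between ratio are assumptions.

```python
# Generic Fisher-style embedding loss: penalize within-class scatter and
# reward between-class scatter of a batch of embeddings. Illustrative only.
import torch

def fisher_embedding_loss(Z, y, eps=1e-8):
    """Z: (n, d) embeddings, y: (n,) integer class labels."""
    mu = Z.mean(dim=0)                        # overall batch mean
    within, between = Z.new_zeros(()), Z.new_zeros(())
    for c in y.unique():
        Zc = Z[y == c]
        mu_c = Zc.mean(dim=0)                 # class mean
        within = within + ((Zc - mu_c) ** 2).sum()
        between = between + Zc.shape[0] * ((mu_c - mu) ** 2).sum()
    # Minimizing within/between encourages class-separated embeddings.
    return within / (between + eps)
```
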
“…$\|\cdot\|_F$ is the Frobenius norm, $X \in \mathbb{R}^{n \times d}$ is the row-wise stack of data points, and $Y := H E \Pi^{-1/2} \in \mathbb{R}^{n \times c}$, where $H := I - (1/n)\mathbf{1}\mathbf{1}^\top \in \mathbb{R}^{n \times n}$ is the centering matrix, $E \in \{0, 1\}^{n \times c}$ is the matrix of one-hot-encoded labels stacked row-wise, and $\Pi \in \mathbb{R}^{c \times c}$ is the diagonal matrix whose $(l, l)$-th element is the cardinality of the $l$-th class. Deep Fisher discriminant analysis (Díaz-Vico et al., 2017; Díaz-Vico & Dorronsoro, 2019) implements Eq. (183) by a nonlinear neural network with loss function:…”
Section: Deep Fisher Discriminant Analysis
confidence: 99%
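
The definitions quoted above are enough to sketch the target construction. The following is a minimal sketch, assuming integer labels in {0..c-1} with every class present in the sample, and treating the network outputs f(X) as an (n, c) matrix; details of the cited model not quoted here are not reproduced.

```python
# Minimal sketch of the quoted target construction Y = H E Pi^{-1/2} and
# the squared-Frobenius fitting loss. The network f itself is an assumption.
import numpy as np

def fda_targets(labels, n_classes):
    """Return the (n, c) target matrix Y = H E Pi^{-1/2}."""
    n = labels.shape[0]
    E = np.eye(n_classes)[labels]                        # one-hot labels E
    H = np.eye(n) - np.ones((n, n)) / n                  # centering matrix H
    Pi_inv_sqrt = np.diag(1.0 / np.sqrt(E.sum(axis=0)))  # Pi^{-1/2}, class sizes
    return H @ E @ Pi_inv_sqrt

def deep_fda_loss(f_X, Y):
    """Squared Frobenius norm ||Y - f(X)||_F^2, with f(X) of shape (n, c)."""
    return np.sum((Y - f_X) ** 2)
```
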
“…Suppose $\mu^{(r)} \in \mathbb{R}^d$, $c$, $n$, and $U \in \mathbb{R}^{d \times d}$ denote the mean of the $r$-th class, the number of classes, the total sample size, and the projection matrix in FDA, respectively. Although some methods solve FDA as a least-squares problem [5, 6], the regular FDA [2] maximizes the Fisher criterion [7]:…”
Section: Fisher and Kernel Discriminant Analysis
confidence: 99%
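
The excerpt truncates before the criterion itself. For reference, a standard trace-ratio form of the Fisher criterion, written in the excerpt's notation, is given below; whether [7] states this exact variant (rather than, e.g., the ratio-trace form) is an assumption.

```latex
% Trace-ratio form of the Fisher criterion. S_B, S_W, n_r, and C_r are
% introduced here for illustration; they are not defined in the excerpt.
\max_{U}\; J(U) \;=\; \frac{\operatorname{tr}\left(U^\top S_B\, U\right)}
                           {\operatorname{tr}\left(U^\top S_W\, U\right)},
\qquad
S_B := \sum_{r=1}^{c} n_r \bigl(\mu^{(r)} - \mu\bigr)\bigl(\mu^{(r)} - \mu\bigr)^{\top},
\qquad
S_W := \sum_{r=1}^{c} \sum_{i \in \mathcal{C}_r} \bigl(x_i - \mu^{(r)}\bigr)\bigl(x_i - \mu^{(r)}\bigr)^{\top}
```

where $\mathcal{C}_r$ is the index set of the $r$-th class, $n_r = |\mathcal{C}_r|$, and $\mu$ is the overall sample mean.
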