2017
DOI: 10.1007/978-3-319-59147-6_43

Deep Fisher Discriminant Analysis

Cited by 6 publications (8 citation statements)
References 7 publications
“…Deep discriminant analysis metric learning methods use the idea of Fisher discriminant analysis (Fisher, 1936; Ghojogh et al., 2019b) in deep learning, for learning an embedding space which separates classes. Some of these methods are deep probabilistic discriminant analysis (Li et al., 2019), discriminant analysis with virtual samples (Kim & Song, 2021), Fisher Siamese losses (Ghojogh et al., 2020f), and deep Fisher discriminant analysis (Díaz-Vico et al., 2017; Díaz-Vico & Dorronsoro, 2019). The Fisher Siamese losses were already introduced in Section 5.3.13.…”
Section: Deep Discriminant Analysis Metric Learning
confidence: 99%
“…‖·‖_F is the Frobenius norm, X ∈ R^{n×d} is the row-wise stack of data points, Y := H E Π^{−1/2} ∈ R^{n×c}, where H := I − (1/n)11^⊤ ∈ R^{n×n} is the centering matrix, E ∈ {0, 1}^{n×c} is the one-hot-encoded labels stacked row-wise, and Π ∈ R^{c×c} is the diagonal matrix whose (l, l)-th element is the cardinality of the l-th class. Deep Fisher discriminant analysis (Díaz-Vico et al., 2017; Díaz-Vico & Dorronsoro, 2019) implements Eq. (183) by a nonlinear neural network with loss function:…”
Section: Deep Fisher Discriminant Analysis
confidence: 99%
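The regression targets Y = H E Π^{−1/2} quoted above can be sketched in NumPy. The helper name below is illustrative, and the closed-form least-squares fit is only the linear special case that a deep network generalizes; it is not the cited papers' implementation:

```python
import numpy as np

def fda_targets(labels, n_classes):
    """Build deep-FDA regression targets Y = H E Pi^{-1/2} (assumed helper).

    labels: (n,) integer class labels; returns Y of shape (n, n_classes).
    """
    n = len(labels)
    E = np.zeros((n, n_classes))            # one-hot labels, row-wise
    E[np.arange(n), labels] = 1.0
    H = np.eye(n) - np.ones((n, n)) / n     # centering matrix
    Pi = np.diag(E.sum(axis=0))             # class cardinalities on diagonal
    return H @ E @ np.linalg.inv(np.sqrt(Pi))

# Linear special case: minimize ||Y - X W||_F^2 in closed form; the deep
# version replaces the linear map X W with a nonlinear network f(X).
labels = np.array([0, 0, 1, 1, 2, 2])
X = np.random.default_rng(0).normal(size=(6, 4))
Y = fda_targets(labels, 3)
W, *_ = np.linalg.lstsq(X, Y, rcond=None)
```

Because H centers each column of E, every column of Y sums to zero, which is what makes the least-squares fit discriminative rather than a plain one-hot regression.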
“…Hence, the FDA directions can be obtained by the generalized eigenvalue problem (S_T, S_W) [47]. Note that some articles, such as [21], [22], [19], solve the generalized eigenvalue problem (S_B, S_T) by considering another version of the Fisher criterion, which is tr(U^⊤ S_B U)/tr(U^⊤ S_T U). This criterion is obtained if we consider minimization of the inverse of Eq.…”
Section: PCA, FDA and SPCA
confidence: 99%
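The generalized eigenvalue problem (S_B, S_T) mentioned in the quote can be sketched as follows; the helper name and the choice of `scipy.linalg.eigh` are illustrative assumptions, not taken from the cited works:

```python
import numpy as np
from scipy.linalg import eigh

def fda_directions(X, labels):
    """Solve the generalized eigenvalue problem (S_B, S_T) for FDA.

    X: (n, d) data, labels: (n,) integer classes. Columns of U satisfy
    S_B u = lam * S_T u; the top ones maximize tr(U^T S_B U)/tr(U^T S_T U).
    """
    mu = X.mean(axis=0)
    Xc = X - mu
    St = Xc.T @ Xc                          # total scatter
    Sb = np.zeros_like(St)                  # between scatter
    for c in np.unique(labels):
        Xk = X[labels == c]
        d = (Xk.mean(axis=0) - mu)[:, None]
        Sb += len(Xk) * (d @ d.T)
    return eigh(Sb, St)                     # eigenvalues in ascending order

rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 1, (20, 3)), rng.normal(3, 1, (20, 3))])
y = np.array([0] * 20 + [1] * 20)
lam, U = fda_directions(X, y)
```

Since S_T = S_B + S_W with both terms positive semi-definite, the generalized eigenvalues lie in [0, 1], approaching 1 for well-separated classes.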
“…Hence, the SPCA directions are the eigenvectors of X H K_y H X^⊤. Note that this term, restricted to a linear kernel for K_y, is used as the between scatter in [22], [19], [20], hinting at a connection between SPCA and FDA if we compare the objectives in Eqs. (19) and (28).…”
Section: PCA, FDA and SPCA
confidence: 99%
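The eigendecomposition of X H K_y H X^⊤ described in the quote can be sketched as below. The helper name, the column-wise convention for X, and the one-hot label encoding are illustrative assumptions, not details fixed by the quoted text:

```python
import numpy as np

def spca_directions(X, Y):
    """Supervised PCA directions: eigenvectors of X H K_y H X^T.

    Assumes X is (d, n) with data points stacked column-wise and
    K_y = Y Y^T is the linear kernel over the label encoding Y (n, c).
    Returns eigenvalues and eigenvectors sorted by descending eigenvalue.
    """
    n = X.shape[1]
    H = np.eye(n) - np.ones((n, n)) / n     # centering matrix
    Ky = Y @ Y.T                            # linear label kernel
    M = X @ H @ Ky @ H @ X.T                # symmetric (d, d) matrix
    lam, U = np.linalg.eigh(M)              # ascending eigenvalues
    return lam[::-1], U[:, ::-1]            # reorder to descending

# Two classes separated along the first coordinate; the top SPCA
# direction should approximately recover that axis.
rng = np.random.default_rng(2)
A = rng.normal(0.0, 0.1, (3, 15)); A[0] += 5.0
B = rng.normal(0.0, 0.1, (3, 15))
X = np.hstack([A, B])                       # (3, 30), column-wise points
Y = np.zeros((30, 2)); Y[:15, 0] = 1.0; Y[15:, 1] = 1.0
lam, U = spca_directions(X, Y)
```

With a one-hot Y, the linear kernel K_y makes X H K_y H X^⊤ behave like a between-scatter term, which is the SPCA/FDA connection the quote points at.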