2020
DOI: 10.1007/978-981-15-8697-2_30
A Performance Evaluation of Loss Functions for Deep Face Recognition

Cited by 39 publications (23 citation statements) · References 15 publications
“…The fact that almost all losses perform equally well shows that, contrary to what we thought, metric learning models perform no better than cross entropy, in contrast to other findings (Srivastava et al, 2019) on face verification. One possible explanation is that the AMI dataset may not contain enough examples or classes for these models to exploit.…”
Section: Results (contrasting)
confidence: 96%
“…Other studies have experimented with these methods in different domains with similar characteristics, like speaker verification (Bredin, 2017;Chung et al, 2018;Yadav and Rai, 2018), and even as an enhancement of BERT's sentence representations (Reimers and Gurevych, 2019) for semantic textual similarity. A recent study (Srivastava et al, 2019) has also focused on comparing these methods on face verification, showing that angular margin losses achieve superior performance.…”
Section: Introduction (mentioning)
confidence: 99%
“…Cross entropy loss is also known as softmax loss and effectively used in face detection or face recognition task. It can be defined as [20]…”
Section: Cross Entropy Loss (mentioning)
confidence: 99%
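The statement above refers to the standard softmax cross-entropy loss; the exact formulation of [20] is elided in the excerpt. As a hedged illustration, a minimal numerically stable sketch of the usual definition (the function name and shapes are illustrative, not taken from the cited work):

```python
import numpy as np

def softmax_cross_entropy(logits, label):
    """Softmax (cross-entropy) loss for a single sample.

    logits: raw class scores z_j; label: index of the true class.
    Computes L = -log( exp(z_y) / sum_j exp(z_j) ).
    """
    z = logits - np.max(logits)                 # shift for numerical stability
    log_probs = z - np.log(np.sum(np.exp(z)))   # log-softmax
    return -log_probs[label]                    # negative log-likelihood of true class
```

With uniform logits over three classes, the loss reduces to log 3, matching the expected value for a maximally uncertain prediction.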
“…These efforts have been focused on two lines of study, on one side, the redesign of the identification loss function, and on the other side, the verification loss functions. The former is composed of Cross-Entropy (CE) loss with softmax output units [4,5,6] combined with a complementary loss such as Ring loss (RL) [7] or Center loss [8], and other studies have been focused on its variants such as Angular Softmax loss (A-Softmax) [9] or Additive Angular Margin loss (ArcFace) [10]. While the latter is based on metric learning approaches as triplet neural network [11,12], contrastive loss [13], partial AUC loss (pAUC) [14] or NeuralPLDA [15].…”
Section: Introduction (mentioning)
confidence: 99%
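Among the identification-loss variants named above, Additive Angular Margin loss (ArcFace) modifies the target-class logit before the softmax. A minimal sketch of that idea, assuming L2-normalized embeddings and class weights; the scale `s` and margin `m` values are illustrative defaults, not prescribed by the excerpt:

```python
import numpy as np

def arcface_logits(cos_theta, target, s=64.0, m=0.5):
    """ArcFace-style additive angular margin (sketch).

    cos_theta: cosine similarities between the normalized embedding and
    each normalized class weight vector; target: true-class index.
    The margin m is added to the *angle* of the target class only, then
    all logits are rescaled by s before the usual softmax cross-entropy.
    """
    theta = np.arccos(np.clip(cos_theta, -1.0, 1.0))  # recover angles
    theta[target] += m                                # penalize the true class
    return s * np.cos(theta)                          # rescaled margined logits
```

Adding the margin strictly lowers the target-class logit relative to plain cosine scoring, which forces a larger angular separation between classes during training.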