2020 28th European Signal Processing Conference (EUSIPCO), 2021
DOI: 10.23919/eusipco47968.2020.9287722

Demographic Bias: A Challenge for Fingervein Recognition Systems?

Cited by 11 publications (18 citation statements)
References 18 publications
“…A considerable amount of work has been done to investigate (demographic) bias and fairness in face recognition systems, e.g., [17]-[19], and potentially sensitive face-related tasks, such as age estimation [20], face image quality assessment [21], privacy protection [22], and face-morph detection [23], to name a few examples. Similar studies were also presented for fingerprint [24], [25], finger vein [26], and palm print [27] recognition systems, among others. While much of this work aimed at identifying the presence of bias in various (learning-based) biometric systems and algorithms (e.g., [17], [20], [24], [26]), a small number of works also tried to investigate the causes of the observed performance differentials for different data groups, e.g., [19], [28].…”
Section: Background and Related Work, A. Bias and Fairness in Biometrics (supporting)
confidence: 65%
“…Similar studies were also presented for fingerprint [24], [25], finger vein [26], and palm print [27] recognition systems, among others. While much of this work aimed at identifying the presence of bias in various (learning-based) biometric systems and algorithms (e.g., [17], [20], [24], [26]), a small number of works also tried to investigate the causes of the observed performance differentials for different data groups, e.g., [19], [28]. The insights and observations made by these studies provided a critical understanding of the bias-related behavior of existing biometric algorithms and contributed towards various bias mitigation measures, e.g., [29]-[31].…”
Section: Background and Related Work, A. Bias and Fairness in Biometrics (supporting)
confidence: 65%
“…In recent years, the biometrics research field has also evaluated the bias and fairness of these systems and developed novel methods for bias mitigation. Most of the studies that we have analysed do not take into account the most relevant state-of-the-art definitions of fairness in machine learning, and some of them argued that their experimental evaluation suggests a lack of bias in score distributions (Drozdowski et al., 2021). Moreover, recent studies of fairness in machine learning have proved that it is generally impossible to satisfy several fairness criteria simultaneously (Garg et al., 2020; Chouldechova, 2017; Zhao and Gordon, 2019; Kleinberg, 2018).…”
Section: A Theoretical Approach Towards Fairness in Biometrics (mentioning)
confidence: 99%
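The impossibility result cited above (Chouldechova, 2017) can be illustrated numerically. The sketch below uses her algebraic relation between false-positive rate (FPR), prevalence p, positive predictive value (PPV), and false-negative rate (FNR); the specific prevalence and error-rate values are illustrative assumptions, not data from any cited study.

```python
# Numeric illustration of Chouldechova's (2017) impossibility result:
# if two groups have different base rates p, a classifier with equal
# PPV and equal FNR for both groups must have different FPRs.

def fpr_from(p, ppv, fnr):
    """FPR implied by the identity FPR = p/(1-p) * (1-PPV)/PPV * (1-FNR)."""
    return (p / (1 - p)) * ((1 - ppv) / ppv) * (1 - fnr)

# Same PPV and FNR for both groups, but different prevalence:
fpr_a = fpr_from(p=0.3, ppv=0.8, fnr=0.1)  # higher-prevalence group
fpr_b = fpr_from(p=0.1, ppv=0.8, fnr=0.1)  # lower-prevalence group

# The implied FPRs necessarily differ, so calibration-style and
# error-rate-style fairness criteria cannot all hold at once.
print(round(fpr_a, 4), round(fpr_b, 4))
```

Because the identity pins FPR down once p, PPV, and FNR are fixed, equalizing PPV and FNR across groups with unequal base rates forces unequal FPRs.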
“…finger, palm, or human eye veins (Uhl et al., 2020). These four systems are used in Drozdowski et al. (2021) to assess demographic bias: differences in score distribution statistics (mean and standard deviation) of genuine and impostor attempts are evaluated. The conclusion reached is that statistically significant biases in the score distributions do not exist, and the authors propose to evaluate this framework in the future with more individuals, given that the number of subjects in each of the databases is very small.…”
Section: The Impossibility of Unbiased Biometric Systems (mentioning)
confidence: 99%
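The score-statistics comparison described above can be sketched as follows. This is a minimal illustration only: the comparison scores, group labels, and tolerance threshold are synthetic assumptions, not data or thresholds from the cited study.

```python
# Sketch of a demographic-bias check based on score distribution
# statistics: compare mean and standard deviation of genuine and
# impostor comparison scores across demographic groups.
import random
import statistics

random.seed(0)

# Synthetic comparison scores per group (higher = more similar).
scores = {
    "group_a": {"genuine": [random.gauss(0.80, 0.05) for _ in range(200)],
                "impostor": [random.gauss(0.30, 0.05) for _ in range(200)]},
    "group_b": {"genuine": [random.gauss(0.80, 0.05) for _ in range(200)],
                "impostor": [random.gauss(0.30, 0.05) for _ in range(200)]},
}

# Per-group (mean, standard deviation) for genuine and impostor scores.
stats = {}
for group, dists in scores.items():
    stats[group] = {kind: (statistics.mean(vals), statistics.stdev(vals))
                    for kind, vals in dists.items()}
    for kind, (m, s) in stats[group].items():
        print(f"{group} {kind}: mean={m:.3f} std={s:.3f}")

# Flag bias if the genuine-score means of the two groups differ by
# more than an (arbitrarily chosen) tolerance.
tol = 0.05
gap = abs(stats["group_a"]["genuine"][0] - stats["group_b"]["genuine"][0])
biased = gap > tol
print("genuine-mean gap:", round(gap, 3), "biased:", biased)
```

Here both groups are drawn from the same distribution, so the check reports no bias; with real data, small subject counts (as the quoted statement notes) make such mean/std comparisons statistically weak.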