2021
DOI: 10.48550/arxiv.2112.06522
Preprint
Anatomizing Bias in Facial Analysis

Cited by 1 publication (2 citation statements). References 0 publications.
“…The 2019 Face Recognition Vendor Test documents lower female accuracy rates across a broad range of algorithms and datasets [4]. Similarly, lower accuracy rates for females have been obtained for various in-house deep learning-based face recognition systems [5,4,30]. The cause-and-effect analysis suggests gendered hairstyles resulting in facial occlusion, make-up, and inherently lower variability among female faces than among male faces to be the factors contributing to lower performance for females [4,5].…”
Section: Gendered Differences In Facial Analytics
confidence: 97%
“…This draws attention to fairness and bias in AI-based facial analytics, where unintended consequences of biased systems call for a thorough examination of both datasets and models [18,5,17,8,4]. Most of the published research in this domain reports low performance for women and dark-skinned people in facial attribute classification systems, such as gender and age prediction [8,17,30,24], and in face recognition [5,4]. As biased datasets produce biased models, many efforts have focused on developing gender- and race-balanced datasets for various facial-analysis applications.…”
Section: Celeb-DF Distribution
confidence: 99%
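The disaggregated evaluation that both citation statements describe, measuring accuracy separately per demographic group rather than in aggregate, can be sketched as follows. This is a minimal illustrative audit, not code from the cited paper; the labels, predictions, and group annotations are hypothetical placeholders.

```python
def per_group_accuracy(y_true, y_pred, groups):
    """Return {group: accuracy}, computing accuracy separately per group."""
    stats = {}
    for g in set(groups):
        idx = [i for i, grp in enumerate(groups) if grp == g]
        correct = sum(1 for i in idx if y_true[i] == y_pred[i])
        stats[g] = correct / len(idx)
    return stats


# Toy audit of a hypothetical face-analysis classifier on a
# gender-annotated evaluation set (illustrative data only).
y_true = [1, 0, 1, 1, 0, 1, 0, 0]
y_pred = [1, 0, 1, 0, 0, 1, 1, 0]
groups = ["male", "male", "male", "female",
          "female", "female", "female", "male"]

acc = per_group_accuracy(y_true, y_pred, groups)
gap = abs(acc["male"] - acc["female"])  # accuracy disparity between groups
```

An aggregate accuracy score would hide this disparity entirely, which is why fairness audits of the kind referenced above report per-group metrics and their gap.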