2019
DOI: 10.1145/3359246
How Computers See Gender

Abstract: Investigations of facial analysis (FA) technologies, such as facial detection and facial recognition, have been central to discussions about Artificial Intelligence's (AI) impact on human beings. Research on automatic gender recognition, the classification of gender by FA technologies, has raised potential concerns around issues of racial and gender bias. In this study, we augment past work with empirical data by conducting a systematic analysis of how gender classification and gender labeling in computer vision…

Cited by 131 publications (30 citation statements)
References 70 publications
“…This is relevant for state management and policy, i.e., to pinpoint places where intervention or allocation of resources is needed. However, the tendency of classification practices towards the erasure of residual categories [16] can cause tension and even be harmful for individuals who remain unseen or misclassified by data-driven systems [19,71].…”
Section: Discussion
confidence: 99%
“…These systems are often expected to minimize human intervention in decision-making and thus be neutral and value-free [23,51,73]. However, previous research has shown that they may contain biases that lead to discrimination and exclusion in several domains such as credit [37], the job market [70], facial recognition systems [19,45,71], algorithmic filtering [4,62], and even advertisement [1]. Critical academic work has furthermore discussed the politics involved in data-driven systems [27,30,56] and highlighted the importance of investigating the capitalistic logics woven into them [20,26,81].…”
Section: Introduction
confidence: 99%
“…Our language-based analyses rely on names to assign gender to individuals and are not well suited to study such forms of diversity. Additionally, potential harms from assigning specific labels to individuals need to be considered (Scheuerman et al., 2019; Blodgett et al., 2020). Nevertheless, studies of diversity in the news are valuable.…”
Section: Discussion
confidence: 99%
“…A large segment of the discipline of "affective computing," involving tracking and analyzing a variety of bio-signals like heart rate, is grounded in Basic Emotion Theory [22,83,84,86]. Models powering facial analysis technologies are designed around Ekman's thesis regarding the legibility of basic emotions through facial expression [35], and analyze large databases of human faces [18], using various AI techniques to estimate emotional expression (along with other characteristics like age and gender) [97]. These systems are increasingly used commercially in areas such as human resource management, advertising [69] and education [11].…”
Section: Motivational Theories of Emotion and AI
confidence: 99%