2021
DOI: 10.1145/3432931

To "See" is to Stereotype

Abstract: Machine-learned computer vision algorithms for tagging images are increasingly used by developers and researchers, having become popularized as easy-to-use "cognitive services." Yet these tools struggle with gender recognition, particularly when processing images of women, people of color and non-binary individuals. Socio-technical researchers have cited data bias as a key problem; training datasets often over-represent images of people and contexts that convey social stereotypes. The social psychology literat…

Cited by 23 publications (11 citation statements)
References 63 publications

Citation statements (ordered by relevance):
“…However, one thing remains constant: individuals and communities that are at the margins of society are disproportionately impacted. Some examples include object detection, 44 search engine results, 45 recidivism, 46 gender recognition, 47 gender classification, 48,49 and medicine. 1 The findings of Wilson et al., 44 for instance, demonstrate that object detection systems designed to detect pedestrians display higher error rates when identifying dark-skinned pedestrians, while light-skinned pedestrians are identified with higher precision.…”
Section: Knowing That Centers Human Relations
confidence: 99%
“…Early work on the 2016 U.S. Presidential Election mentioned LGBTQ+ rights as a contentious issue, following the advancement of marriage equality and the repeal of both the U.S. Defense of Marriage Act 16 and the U.S. Military's Don't Ask, Don't Tell policy 17 during the Obama administration between 2008 and 2016 [132,216]. This is part of a broader trend we observe of growing attention paid to online political discourse, with LGBTQ+ rights being one of the topics of debate [39,42,111,224].…”
Section: Queerness As Political And/or "Bad"
confidence: 99%
“…It provided states the ability to refuse to recognize same-sex marriages performed in other states. 17 Instituted in 1996, this policy prohibited U.S. Military personnel from discriminating against or harassing closeted or not publicly out homosexual or bisexual service members. It also barred openly gay, lesbian, or bisexual people from military service.…”
Section: Queerness As Political And/or "Bad"
confidence: 99%
“…Hiring tools tend to disproportionately disadvantage women (Ajunwa et al., 2016). Additionally, the notion of gender that ML systems depend on is a fundamentally essentialist one that operationalizes gender in a trans-exclusive way, resulting in disproportionate harm to trans people (Barlas et al., 2021; Hamidi et al., 2018; Keyes, 2018). Machine classification and prediction, thus, negatively impact individuals and groups at the margins the most.…”
Section: Imposed Determinability In Unequal Measures
confidence: 99%