2020
DOI: 10.1007/s13347-020-00415-6
The Whiteness of AI

Abstract: This paper focuses on the fact that AI is predominantly portrayed as white—in colour, ethnicity, or both. We first illustrate the prevalent Whiteness of real and imagined intelligent machines in four categories: humanoid robots, chatbots and virtual assistants, stock images of AI, and portrayals of AI in film and television. We then offer three interpretations of the Whiteness of AI, drawing on critical race theory, particularly the idea of the White racial frame. First, we examine the extent to which this Whi…

Cited by 142 publications (73 citation statements)
References 36 publications
“…Accurate age estimation has applications to a range of settings, from facilitating medical diagnosis to early resolution of criminal cases. Although AI algorithms for face recognition (FR) and estimation of age are being developed (e.g., Escalante-B and Wiskott 2020; Zhu et al 2018), most of these algorithms are biased against non-Caucasian races (Cave and Dihal 2020), consequently affecting the classification of emotional facial expressions. In this regard, Rhue (2018) found that FR algorithms attribute more negative emotions to black men’s faces than white men’s faces.…”
Section: Discussion (mentioning)
confidence: 99%
“…Although AI algorithms for face recognition (FR) and estimation of age are being developed (e.g., Escalante-B. and Wiskott, 2020; Zhu et al, 2018), most of these algorithms are biased against non-Caucasian races (Cave & Dihal, 2020), consequently affecting the classification of emotional facial expressions. In this regard, Rhue (2018) found that FR algorithms attribute more negative emotions to black men's faces than white men's faces.…”
Section: Discussion (mentioning)
confidence: 99%
“…We plan to unpack all the above with an eye towards a collective re-imagining of the computational sciences. To do this, we implore computational scientists to be aware of their fields' histories (Cave & Dihal 2020; Roberts, Bareket-Shavit, Dollins, Goldie, & Mortenson 2020; Saini 2019; Syed 2020; Winston 2020) and we propose that through such an awakening we can begin to forge a decolonised future. We also hope our article encourages researchers to consciously avoid repeating previous mistakes, some of which are crimes against humanity, like eugenics (Saini 2019).…”
mentioning
confidence: 99%
“…Ultimately, our goal is to make inroads upon radically decolonised computational sciences (cf. Birhane 2019; Cave & Dihal 2020).…”
mentioning
confidence: 99%