2023
DOI: 10.24908/ss.v21i2.16015
Human-First, Please: Assessing Citizen Views and Industrial Ambition for Emotional AI in Recommender Systems

Abstract: This paper qualitatively explores the views of diverse members of the British public on applications of biometric emotional AI technologies patented by two globally dominant consumer-facing recommender systems, Amazon and Spotify. Examining Amazon and Spotify patents for biometric profiling of users’ emotions, disposition, and behaviour to offer them tailored services, ads, and products from their wider platforms, this paper points to industrial ambition regarding emotional AI. Little is known about ordinary p…

Cited by 3 publications (2 citation statements)
References 36 publications (62 reference statements)
“…Emotion profiling through biometrics includes techniques such as facial coding of expressions, voice analytics, eye-tracking, and wearables that sense skin responses, muscle activity, heart activity, respiration, and brain activity. Recent years have seen sector-specific trials of such techniques in security, education and workplaces across the world (McStay, 2018, 2020b, 2023; ARTICLE 19, 2021; Mantello et al., 2021; Urquhart et al., 2022); and biometric (voice-based) emotional AI in voice assistants is also envisaged in the patents of Amazon (the world’s largest online marketplace) to offer users highly tailored services, ads and products from the wider platform (Bakir et al., 2023a). Beyond trials and patents, use of biometrics to gauge emotions is also being rolled out in consumer-facing sectors such as in cars to improve cabin experience and safety (McStay and Urquhart, 2022b); in wearables to help users manage their mental health and day (McStay, 2018); and in robotic toys to adapt and respond to users’ emotions, and to display the toy’s “moods” (McStay and Rosner, 2021).…”
Section: Undermining Autonomy
Confidence: 99%
“…For instance, sociological literature (largely US-focused) highlights how racism (among other things) in data-driven discrimination shapes people’s experiences of data-driven systems (Fisher, 2009; Madden et al., 2017; Benjamin, 2019). Regarding the group that self-identifies as disabled, we reasoned that emotion profiling systems could have much to offer disabled people who may, for instance, be more reliant on technology to enable communication and employment, but that emotion profiling technologies can also risk prescribing value-laden benchmarks of what constitutes “normality” (Bakir et al., 2023a). We posited that separate focus groups based on ethnicity and disability might more readily surface unique insights on such factors that could be missed in more general focus groups.…”
Section: Stage 1: Focus Groups Using Design Fiction and Contravision
Confidence: 99%