“…However, it remains to be seen what users — who are by now accustomed to the idea that these entities are designed as female — will choose (for their still, after all, female-named assistant). As people are likely to assign gender to objectively non-gendered voices (Sutton, 2020), and voice assistants that are designed as or perceived to be female attract abusive behaviour (Cercas Curry and Rieser, 2019, 2018), designers may consider attempting to redress the gender imbalance by designing assistants with servile roles to be male-presenting by default. While there have been examples, such as the BBC's Beeb (Walker, 2019), this remains an under-explored approach.…”
Section: Discussion
“…Nevertheless, people tend to personify nonhuman entities, including technological devices and virtual agents (Epley et al., 2007; Etzrodt and Engesser, 2021; Guthrie, 1995; Reeves and Nass, 1996). While some argue that this problem can be solved simply by using a 'genderless' voice (Meet Q), research shows that people will nonetheless assign binary genders to ambiguous voices (Sutton, 2020). Thus, a genderless voice is redundant if other elements of an assistant's design cause it to be gendered.…”
Technology companies have produced varied responses to concerns about the effects of the design of their conversational AI systems. Some have claimed that their voice assistants are in fact not gendered or human-like, despite design features suggesting the contrary. We compare these claims to user perceptions by analysing the pronouns users employ when referring to AI assistants. We also examine systems' responses and the extent to which they generate output which is gendered and anthropomorphic. We find that, while some companies appear to be addressing the ethical concerns raised, in some cases their claims do not seem to hold true. In particular, our results show that system outputs are ambiguous as to the humanness of the systems, and that users tend to personify and gender them as a result.
“…On request, they set alarms, check weather forecasts, announce the time, or send e-mails. They take over simple household and secretarial tasks, which are, in traditional conceptions, female domains (Morley 2020; Purtill 2021; Sutton 2020). According to Fessler (2017), women are thus once again cast as servants, now in digital form.…”
Section: Aufgabenspektrum (Task Spectrum)
“…The problem lies deeper. A non-binary voice such as Q, for example, does not automatically rule out sexism (Sutton 2020). Gender stereotypes that can trigger undesirable behaviour are promoted not only through voices; they are also found in the personalities and behaviours created for the voice assistants (Section 3).…”
The article sets out the social and ethical problems of voice assistants designed as female. Female voice assistants that carry out their assigned tasks obediently, even submissively, and remain polite even when insulted can evoke and reinforce gender-stereotypical expectations of women in society. Given the strong worldwide growth in voice assistants, this is an aspect that urgently needs to be addressed. The human, and specifically female, presentation of the technology rests largely on deliberate decisions by the design and development teams involved: they program gender-stereotypical responses and opt for female voices and names. As a cutting-edge technology, these simplistic, helpful female voice assistants can not only reflect gender stereotypes but also play them straight back to their users, giving them the potential to nourish long-outdated role models in reality. The article discusses this problem by analysing current literature and media coverage, and concludes by deriving possible fields of action for technology companies, research, legislation, and civil society. The way forward lies in raising the awareness of everyone involved in development, as well as of users; only then can alternatives to today's standard voice assistants emerge.
“…What constitutes being machine-like, or what defines machineness, is unclear. Some argue that conversational systems should be designed to eschew the gender stereotypes we perceive with other people [6, 28]. This includes considerations about voice (if using speech interfaces) and language.…”
Section: Mimicry, Human-likeness and Machine Identity
Chatbots are popular machine partners for task-oriented and social interactions. Human-human computer-mediated communication research has explored how people express their gender and sexuality in online social interactions, but little is known about whether and in what way chatbots do the same. We conducted semi-structured interviews with 5 text-based conversational agents to explore this topic. Through these interviews, we identified 6 common themes around the expression of gender and sexual identity: identity description, identity formation, peer acceptance, positive reflection, uncomfortable feelings, and off-topic responses. Chatbots express gender and sexuality explicitly and through the relation of experiences and emotions, mimicking the human language on which they are trained. It is nevertheless evident that chatbots differ from human dialogue partners, as they lack the flexibility and understanding enabled by lived human experience. While chatbots are proficient in using language to express identity, they also display a lack of authentic experiences of gender and sexuality.
CCS CONCEPTS: • Human-centered computing → Natural language interfaces; • Social and professional topics → Sexual orientation; Gender.