Proceedings of the 2nd Conference on Conversational User Interfaces 2020
DOI: 10.1145/3405755.3406123
Gender Ambiguous, not Genderless

Cited by 36 publications (23 citation statements); references 22 publications.
“…However, it remains to be seen what the users, who are by now accustomed to the idea that these entities are designed as female, will choose (for their still, after all, female-named assistant). As people are likely to assign gender to objectively non-gendered voices (Sutton, 2020), and voice assistants that are designed as or perceived to be female attract abusive behaviour (Cercas Curry and Rieser, 2018, 2019), designers may consider attempting to redress the gender imbalance by designing assistants with servile roles to be male-presenting by default. While there have been examples, such as the BBC's Beeb (Walker, 2019), this remains an under-explored approach.…”
Section: Discussion (mentioning)
confidence: 99%
“…Nevertheless, people tend to personify nonhuman entities, including technological devices and virtual agents (Epley et al., 2007; Etzrodt and Engesser, 2021; Guthrie, 1995; Reeves and Nass, 1996). While some argue that this problem can be solved simply by using a 'genderless' voice (Meet Q), research shows that people will assign binary genders to ambiguous voices anyway (Sutton, 2020). 4 Thus, a genderless voice is redundant if other elements of an assistant's design cause it to be gendered.…”
Section: Bias Statement (mentioning)
confidence: 99%
“…On request, they set the alarm, check weather forecasts, announce the time, or send e-mails. They take on simple household and secretarial tasks, traditionally regarded as female domains (Morley 2020; Purtill 2021; Sutton 2020). According to Fessler (2017), women are thus once again cast as servants, now in digital form.…”
Section: Aufgabenspektrum [range of tasks] (unclassified)
“…The problem lies deeper. A non-binary voice such as Q, for example, does not automatically rule out sexism (Sutton 2020). Gender stereotypes that can trigger undesirable behaviour are promoted not only through voices; they are also found in the personalities and behaviours created for the voice assistants (Section 3).…”
Section: Neue Skripte und Konzepte [new scripts and concepts] (unclassified)
“…What constitutes being machinelike or what defines machineness is unclear. Some argue that conversational systems should be designed to eschew gender stereotypes we perceive with other people [6,28]. This includes considerations about voice (if using speech interfaces) and language.…”
Section: Mimicry, Human-likeness and Machine Identity (mentioning)
confidence: 99%