Modern conversational agents such as Alexa and Google Assistant represent significant progress in speech recognition, natural language processing, and speech synthesis. But as these agents have grown more realistic, concerns have been raised over how their social nature might unconsciously shape our interactions with them. Through a survey of 500 voice assistant users, we explore whether users' relationships with their voice assistants can be quantified using the same metrics as social, interpersonal relationships, and whether this correlates with how much they trust their devices and the extent to which they anthropomorphise them. Using Knapp's staircase model of human relationships, we find not only that human-device interactions can be modelled in this way, but also that relationship development with voice assistants correlates with increased trust and anthropomorphism.
CCS Concepts: • Human-centered computing → Empirical studies in interaction design; Natural language interfaces.
The work reported here stems from our deep belief that improved human engineering can add significantly to the acceptance and use of computer technology. In particular, this report describes an experiment to test the hypothesis that certain features of natural language provide a useful guide for the human engineering of interactive command languages. The goal was to establish that a syntax employing familiar, descriptive, everyday words and well-formed English phrases contributes to a language that can be easily and effectively used. Users with varying degrees of interactive computing experience used two versions of an interactive text editor: one with an English-based command syntax in the sense described above, the other with a more notational syntax. Performance differences strongly favored the English-based editor.
Connected devices in the home represent a potentially grave new privacy threat due to their unfettered access to the most personal spaces in people's lives. Prior work has shown that despite concerns about such devices, people often lack sufficient awareness, understanding, or means of taking effective action. To explore the potential for new tools that support such needs directly, we developed Aretha, a privacy assistant technology probe that combines a network disaggregator, personal tutor, and firewall to empower end-users with both the knowledge and the mechanisms to control disclosures from their homes. We deployed Aretha in three households over six weeks, with the aim of understanding how this combination of capabilities might enable users to gain awareness of data disclosures by their devices, form educated privacy preferences, and block unwanted data flows. The probe, with its novel affordances and its limitations, prompted users to co-adapt, finding new control mechanisms and suggesting new approaches to address the challenge of regaining privacy in the connected home.