Proceedings of the SIGCHI Conference on Human Factors in Computing Systems 2013
DOI: 10.1145/2470654.2466246
Benevolent deception in human computer interaction

Abstract: Though it has been asserted that "good design is honest,"[42] deception exists throughout human-computer interaction research and practice. Because of the stigma associated with deception (in many cases rightfully so), the research community has focused its energy on eradicating malicious deception, and ignored instances in which deception is positively employed. In this paper we present the notion of benevolent deception, deception aimed at benefitting the user as well as the developer. We frame our discussion u…

Cited by 55 publications (37 citation statements) · References 27 publications · Citing publications span 2014–2024
“…They proposed to broaden the human-computer interaction agenda 'to consider the currently unfamiliar idea that the active deception of one user by another can be a valid approach to interaction design' (p. 576). Furthermore, like Rowe (2007) and Adar et al (2013) they also point to what may be considered an ethical use of deception: computer systems that create deceptions in order to maintain security. Hence Adar et al suggest that there may be 'benevolent deception, deception aimed at benefitting the user as well as the developer' (Adar et al 2013, abstract).…”
Section: Computer Science, Robotics and the Art of Magic
confidence: 99%
“…Although there has been success in automatically generating communications that take emotions into account, for example, in a joke generator for children (Binsted et al, 2006), technology has yet to be used to generate communications that deliberately lie about a sender's emotional state. Across HCI, researchers have "tiptoed around" benign deceit as an area of inquiry (Adar, Tan, & Teevan, 2013), and user models have yet to explicitly incorporate deceit as an adaptive dimension. However, when up to one third of our daily social interactions involve deceit (Hancock et al, 2004), the use of deceit as an adaptive strategy is relevant in the quest for increasingly naturalistic computer-generated communications.…”
Section: Implementation and Evaluation
confidence: 99%
“…Many systems already deceive, as described elsewhere (Adar et al, 2013), but they are not presented as doing so. Deliberately designing deceit into systems presents rich research opportunities, but the word deceit should be used cautiously with users due to its pejorative implications.…”
Section: Benign Deceit As an Adaptive Factor
confidence: 99%
“…For example, manipulation of users' mental models of systems in ways that benefit both systems' designers and end-users was documented by Adar et al [1]. Ambiguity, often promoted through deception, gives people space for flexible interpretation [29], and to tell the stories they need to in order to preserve face and reputation [7,10].…”
Section: Introduction
confidence: 99%
“…Here, we are interested in exploring the complex contexts in which deception might take place, to consider not just cases where the system lies to a user [1] or computer-mediated communication where one user lies to others, but situations where systems lie to each other about users; where a user needs to lie to one audience but not another; where tools or systems might protect a person from disclosure to other systems or tools. As Nissenbaum puts it:…”
Section: Introduction
confidence: 99%