2020
DOI: 10.30658/hmc.1.3
Ontological Boundaries between Humans and Computers and the Implications for Human-Machine Communication

Abstract: In human-machine communication, people interact with a communication partner that is of a different ontological nature from themselves. This study examines how people conceptualize ontological differences between humans and computers and the implications of these differences for human-machine communication. Findings based on data from qualitative interviews with 73 U.S. adults regarding disembodied artificial intelligence (AI) technologies (voice-based AI assistants, automated-writing software) show that peopl…

Cited by 95 publications (62 citation statements)
References 22 publications
“…Respondents imagined the consequences of human-robot interaction, and held a relatively open-minded or even positive attitude about a future with robots, though some held that the ontological boundary of robot as tool was necessary for such optimism. These findings show support for claims (e.g., Guzman, 2020) that human-robot ontology […] integration. AI-supported robotic technology presents great promise; however, its advancement challenges the ontological divide, which has implications not only for human-machine interaction but also for self-identity and ultimately human-human relations.…”
Section: Results (supporting)
confidence: 83%
“…Scholars also have shown that people’s conceptualizations of the nature of people and technology are wide-ranging (e.g. Edwards, 2018; Guzman, In Review), thus necessitating further study regarding how people discern between the nature of people and technology and the resulting implications of such ontological interpretations. This call for empirical research is not intended to diminish the importance of philosophical inquiry and cultural critique that has long focused on questions of ontology.…”
Section: An HMC Research Agenda for AI (mentioning)
confidence: 99%
“…This cueing is multimodal, comprising visual (e.g., expressions and gazes; cf. Chesher & Andreallo, 2020), verbal (textual conveyances or vocalizations; Pradhan et al., 2019), and behavioral (e.g., social context awareness cues; Fraune et al., 2020) properties that denote "living/nonliving," "born/made," and capability and agency distinctions (Guzman, 2020).…”
Section: Exploring Ontological-Category Differences in Sexual Communi… (mentioning)
confidence: 99%
“…This study moves toward bridging that knowledge gap by experimentally examining sex-chat participants' experiences as a function of the chat partner's perceived ontological category-the prototypical class of entity to which a given agent is thought to belong. Shared category-membership (i.e., humans among themselves as a group) is understood as a heuristic for self-similar experience and implicit norming, and discrete category-membership (i.e., humans vs. machines) makes salient divergent origins, autonomies, and emotional and intellectual capabilities (Guzman, 2020). Therefore, it is possible that the experience of chatting with a machine partner could be procedurally and effectually different than with a human partner (cf.…”
(mentioning)
confidence: 99%