2011 RO-MAN
DOI: 10.1109/roman.2011.6005285

A friendly gesture: Investigating the effect of multimodal robot behavior in human-robot interaction

Abstract: Gesture is an important feature of social interaction, frequently used by human speakers to illustrate what speech alone cannot provide, e.g. to convey referential, spatial or iconic information. Accordingly, humanoid robots that are intended to engage in natural human-robot interaction should produce speech-accompanying gestures for comprehensible and believable behavior. But how does a robot's non-verbal behavior influence human evaluation of communication quality and the robot itself? To address this resear…

Cited by 78 publications (42 citation statements); citing publications span 2012 to 2023.
References 15 publications.

Citation statements (ordered by relevance):
“…In Condition 3, the incongruent multimodal (speech-gesture) condition, the robot presented to the participant pantomimic gestures conveying the act of opening the cupboard and deictic gestures pointing at designated positions [37], together with the identical set of nine verbal instructions used in Condition 1. These were again accompanied by a total of 21 gestures, of which ten (47.6%) semantically matched the verbal instruction, while the remaining eleven (52.4%) were semantically non-matching, e.g., the robot occasionally said "put it up there" but pointed downwards.…”
Section: Conditions (mentioning, confidence: 99%)
“…Along with speech, non-verbal behavior plays an important role in communication and interaction. Numerous researchers have been working on developing gestures for agents [1][2][3] and robots [4,5]. However, how an agent/robot can most effectively use gestures to interact with people is not well studied.…”
Section: Introduction (mentioning, confidence: 99%)
“…The agent was rated as more natural, warmhearted, agile and committed when presenting self-touching gestures. Salem et al [5] found that a robot is evaluated more positively when non-verbal behaviors such as hand and arm gestures accompany speech. Neff et al [9,10] conducted experiments to understand how the Big Five traits of emotional stability and extraversion correlate with changes in verbal and nonverbal behavior.…”
Section: Introduction (mentioning, confidence: 99%)
“…Our tendency to make these subjective judgments makes it crucial that all aspects of a robot's appearance, from the broad shape of the body to the subtle tilt of the eyes, are refined until they convey the image desired. In addition to appearance, the motion of the robot is critical to human perception; motion parameters such as acceleration and curvature [Saerbeck and Bartneck 2010], music synchrony [Avrunin et al 2011], inclusion of gestures [Salem et al 2011] or expressions [Blow et al 2006], and even unintended cues like motor noise [Hegel et al 2011] all affect a human's perception of the robot. Mori postulated that our acceptance of a robot form increases with its increasing similarity to humans, up to a point at which even slight deviations cause a sense of uncanniness [Mori 1970].…”
Section: Chapter (mentioning, confidence: 99%)