2012
DOI: 10.1145/2133366.2133368

Emotional body language displayed by artificial agents

Abstract: Complex and natural social interaction between artificial agents (computer generated or robotic) and humans necessitates the display of rich emotions in order to be believable, socially relevant and accepted, and to generate the natural emotional responses that humans show in the context of social interaction, such as engagement or empathy. Whereas some robots use faces to display (simplified) emotional expressions, for other robots such as Nao, body language is the best medium available given their inability …


Cited by 61 publications (48 citation statements)
References 32 publications
“…We previously found that adults were able to interpret body poses displayed by the robot as conveying certain emotions, which is particularly relevant for the Nao robot used in ALIZ-E since it does not have facial articulation and as such is very limited in communicating emotion when not moving its body. We also noted that changing the robot's head position affects the expressiveness of the poses (Beck, Cañamero, & Bard, 2010; Beck, Stevens, Bard, & Cañamero, 2012). As with adults, it was found that moving the head upwards increased children's identification of emotions such as pride, happiness, and excitement, whereas moving the head downwards increased the correct identification for other displays (anger and sadness).…”
Section: Bodily Expression of Emotion
confidence: 65%
“…Their results suggested that the perception of emotions is not affected by the appearance of the character. In a similar setup, Beck et al. [5] found that emotions were perceived more strongly on a real actor than on a virtual agent, and that stylized emotions were perceived more strongly than natural ones.…”
Section: Related Work
confidence: 97%
“…Emotional body language for the Nao robot was studied by [5] and [12]. For a non-humanoid sphere-shaped robot (Maru), sound, colors, and vibrations were used to express emotional behaviors [26].…”
Section: Related Work
confidence: 99%
“…Research has shown that human body language can be interpreted accurately without facial or vocal cues (Beck et al., 2012; de Gelder, 2006; Kleinsmith, De Silva, & Bianchi-Berthouze, 2006). Based on Cassell (2000), Vinayagamoorthy et al. (2006), and Beck et al. (2013), robot body language can be classified into three different areas broadly used in the literature: posture, movement, and proxemics.…”
Section: Emotional Expression
confidence: 99%