RO-MAN 2011
DOI: 10.1109/roman.2011.6005263

Creation and Evaluation of Emotion Expression with Body Movement, Sound and Eye Color for Humanoid Robots

Abstract: The ability to display emotions is a key feature in human communication, and also for robots that are expected to interact with humans in social environments. For expressions based on body movement and signals other than facial expressions, such as sound, no common ground has been established so far. Based on psychological research on the human expression of emotions and the perception of emotional stimuli, we created eight different expressional designs for the emotions Anger, Sadness, Fear and Joy, consisting of body m…

Cited by 90 publications (66 citation statements). References 15 publications (14 reference statements).
“…Proxemics refers to the distance between individuals during social interaction. Several humanoid robots can already express recognizable emotions using sounds, body movement and body posture (Beck et al, 2013; Haring, Bee, & Andre, 2011; Itoh et al, 2004; Saerbeck & Bartneck, 2010). Yet, researchers must agree that robots do not actually have real emotions but act as if they have emotions by showing caricatured verbal and nonverbal expressions that humans recognize and interpret as real emotions.…”
Section: Emotional Expressionmentioning
confidence: 99%
“…Emotional body language for the Nao robot was studied by [5] and [12]. For a non-humanoid sphere-shaped robot (Maru), sound, colors, and vibrations were used to express emotional behaviors [26].…”
Section: Related Workmentioning
confidence: 99%
“…However, many challenges need to be addressed in order to meet such a requirement (Baker et al, 2009a; Moore, 2013, 2015), not least how to evolve the complexity of voice-based interfaces from simple structured dialogs to more flexible conversational designs without confusing the user (Bernsen et al, 1998; McTear, 2004; Lopez Cozar Delgado and Araki, 2005; Phillips and Philips, 2006; Moore, 2016b). In particular, seminal work by Nass and Brave (2005) showed how attention needs to be paid to users' expectations [e.g., selecting the "gender" of a system's voice (Crowell et al, 2009)], and this has inspired work on "empathic" vocal robots (Breazeal, 2003; Fellous and Arbib, 2005; Haring et al, 2011; Eyssel et al, 2012; Lim and Okuno, 2014; Crumpton and Bethel, 2016). On the other hand, user interface experts, such as Balentine (2007), have argued that such agents should be clearly machines rather than emulations of human beings, particularly to avoid the "uncanny valley effect" (Mori, 1970), whereby mismatched perceptual cues can lead to feelings of repulsion (Moore, 2012).…”
Section: Spoken Language Systemsmentioning
confidence: 99%