2013
DOI: 10.1007/s12369-013-0193-z
Interpretation of Emotional Body Language Displayed by a Humanoid Robot: A Case Study with Children

Cited by 63 publications
(33 citation statements)
References 22 publications
“…Based on Cassell (2000) and Vinayagamoorthy et al (2006), Beck et al (2013) robot body language can be classified into three different areas broadly used in the literature: posture, movement and proxemics. Body posture are specific positions a body takes at a certain point in time.…”
Section: Emotional Expression
confidence: 99%
“…Proxemics refers to the distance between individuals during social interaction. Several humanoid robots can already express recognizable emotions using sounds, body movement and body posture (Beck et al, 2013;Haring, Bee, & Andre, 2011;Itoh et al, 2004;Saerbeck & Bartneck, 2010). Yet, researchers must agree that robots actually do not have real emotions but act as if they have emotions by showing caricatured verbal and nonverbal expressions recognized and interpreted by humans as real emotions.…”
Section: Emotional Expression
confidence: 99%
“…For safety and feasibility issues the use of some REEM's interactive resources was deliberately constrained during the autonomous operation. Consequently the robot's potentiality for verbal and non-verbal communication was reduced to not-facial/not-verbal behavior [2,24]. Specifically, arms and hands were blocked and stuck to the body for safety issues.…”
Section: The Robot
confidence: 99%
“…For instance, a novel robotics software designed to NAO [8] already implements dispositional or affective features of this robot tailoring its traits, moods, emotions or even attitudes toward human subject. A similar approach is considered in terms of developing child-robot interaction [2]. These authors focus on providing the humanoid NAO robot with the capacity to express emotions by its body postures and head position in order to convey emotions effectively [2].…”
Section: Introduction
confidence: 99%
“…A similar approach is considered in terms of developing child-robot interaction [2]. These authors focus on providing the humanoid NAO robot with the capacity to express emotions by its body postures and head position in order to convey emotions effectively [2]. Another approach to enhance the HRI interaction capacity of NAO combines its communicative behaviour with non-verbal gestures through the movements of hand and head as well as gaze orienting [6].…”
Section: Introduction
confidence: 99%