2012 IEEE 3rd International Conference on Cognitive Infocommunications (CogInfoCom)
DOI: 10.1109/coginfocom.2012.6421936

Integration of gestures and speech in human-robot interaction

Abstract: We present an approach to enhance the interaction abilities of the Nao humanoid robot by extending its communicative behavior with non-verbal gestures (hand and head movements, and gaze following). A set of non-verbal gestures were identified that Nao could use for enhancing its presentation and turn-management capabilities in conversational interactions. We discuss our approach for modeling and synthesizing gestures on the Nao robot. A scheme for system evaluation that compares the values of users' ex…

Cited by 46 publications (26 citation statements)
References 7 publications

Citation statements:
“…Selection of a gesture in a particular communicative context is based on the dialogue situation (give/elicit feedback, inform, greet) and on the task (continue/change/stop the topic). A more detailed description of the robot's gesturing can be found in [14].…”
Section: Gesturing and Presentation of New Info
confidence: 99%
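The selection logic described in this excerpt, with the dialogue situation and the topic task jointly determining a gesture, can be pictured as a simple lookup. The Python sketch below is illustrative only; the table entries and gesture labels are assumptions, not the paper's actual mapping.

```python
# Hypothetical sketch of gesture selection keyed on dialogue situation and
# topic task, as described in the excerpt above. Labels are illustrative.
GESTURE_TABLE = {
    ("inform", "continue"):          "open_hand_supine_present",
    ("inform", "change"):            "open_hand_supine_new_topic",
    ("inform", "stop"):              "open_hand_prone_halt",
    ("elicit_feedback", "continue"): "head_nod_invite",
    ("give_feedback", "continue"):   "head_nod_backchannel",
    ("greet", "change"):             "wave_hand",
}

def select_gesture(dialogue_act, topic_task):
    """Return a gesture label for the current communicative context,
    falling back to a neutral beat gesture when no entry matches."""
    return GESTURE_TABLE.get((dialogue_act, topic_task), "beat_gesture")

print(select_gesture("inform", "change"))  # -> open_hand_supine_new_topic
```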
“…The evaluation is reported in [2] and [14]. Following the evaluation scheme proposed in [8], the users were asked to fill in a questionnaire twice, first to capture their expectations of the system before interacting with it, and then to measure their experience of the system after their interaction with it.…”
Section: Implementation and Evaluation
confidence: 99%
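The expectation-versus-experience comparison mentioned in this excerpt can be sketched as a per-item difference between the two questionnaire passes. The item names and rating scale below are placeholders, not the questionnaire actually used in [2] and [14].

```python
from statistics import mean

# Placeholder questionnaire items; the real instrument differs.
ITEMS = ["naturalness", "understandability", "pleasantness", "responsiveness"]

def expectation_experience_delta(before, after):
    """Per-item shift (experience minus expectation) for one user."""
    return {item: after[item] - before[item] for item in ITEMS}

before = {"naturalness": 4, "understandability": 5, "pleasantness": 4, "responsiveness": 3}
after  = {"naturalness": 3, "understandability": 4, "pleasantness": 5, "responsiveness": 4}

deltas = expectation_experience_delta(before, after)
print(deltas, "mean shift:", mean(deltas.values()))
```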
“…This section describes the motivation for adding gestures to Nao, and their design and synthesis. A more comprehensive description of enhancing Nao with gestures and posture shifts can be found in [4].…”
Section: A. Gestures
confidence: 99%
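For readers unfamiliar with how such gestures are synthesised on the robot, the sketch below shows one plausible way to combine an arm movement, a head movement, and speech with the NAOqi Python SDK. It is not the authors' implementation; the connection details, joint angles, and timings are placeholders.

```python
from naoqi import ALProxy

NAO_IP, NAO_PORT = "nao.local", 9559  # placeholder connection details

motion = ALProxy("ALMotion", NAO_IP, NAO_PORT)
tts = ALProxy("ALTextToSpeech", NAO_IP, NAO_PORT)

def palm_up_present(utterance):
    """Move the right arm toward an open-hand-supine pose and tilt the head
    while the utterance is spoken; angles and timings are illustrative."""
    names  = ["RShoulderPitch", "RElbowRoll", "HeadPitch"]
    angles = [[0.6], [0.4], [0.2]]   # target joint angles in radians
    times  = [[1.0], [1.0], [1.5]]   # seconds from gesture onset
    motion.post.angleInterpolation(names, angles, times, True)  # non-blocking
    tts.say(utterance)                                          # speak while moving

palm_up_present("Here is some new information about the topic.")
```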
“…A set of non-verbal gestures were designed in order to enhance Nao's presentation and turn-management capabilities [7]. These apply Kendon's [8] notion of gesture families.…”
Section: Gestures
confidence: 99%
“…The Open Hand Supine ("palm up") and Open Hand Prone ("palm down") families have their own semantic themes related to giving ideas as well as presenting, explaining, summarizing vs. stopping and halting, respectively [9]. For the presentation capabilities, a set of presentation gestures were identified to mark the topic, the end of a sentence or paragraph, plus beat gestures and head nods to attract attention to hyperlinks (new information), and head nodding as backchannels (see more in [7]). For the turn-management capabilities, the following approach was applied: Nao speaks and observes the human partner; after each information chunk that Nao presents, the human is invited to signal continuation (phrases like 'continue' or 'stop'); Nao asks for explicit feedback depending on the user's turn; the robot may also gesture, stop, etc.…”
Section: Gestures
confidence: 99%
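The turn-management behaviour described in the excerpt above amounts to a present-listen-branch loop. The sketch below substitutes console I/O for Nao's real speech recognition and synthesis; all helper names, chunks, and prompts are hypothetical.

```python
CHUNKS = [
    "First chunk of new information.",
    "Second chunk of new information.",
    "Final chunk of new information.",
]

def present(text):
    print("NAO:", text)                      # stand-in for speech + gesture

def listen():
    return input("USER> ").strip().lower()   # stand-in for speech recognition

def presentation_loop():
    for chunk in CHUNKS:
        present(chunk)                       # Nao presents an information chunk
        reply = listen()                     # human signals continuation
        if reply == "stop":
            present("Okay, I will stop here.")
            return
        if reply != "continue":
            present("Shall I continue or stop?")   # explicit feedback request
            if listen() == "stop":
                return
    present("That was all I had on this topic.")

if __name__ == "__main__":
    presentation_loop()
```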