2014
DOI: 10.1109/thms.2014.2303083

Designing and Evaluating a Social Gaze-Control System for a Humanoid Robot

Abstract: This paper describes a context-dependent social gaze-control system implemented as part of a humanoid social robot. The system enables the robot to direct its gaze at multiple humans who are interacting with each other and with the robot. The attention mechanism of the gaze-control system is based on features that have been proven to guide human attention: nonverbal and verbal cues, proxemics, the visual field of view, and the habituation effect. Our gaze-control system uses Kinect skeleton tracking together w…

Cited by 76 publications (72 citation statements)
References 36 publications
“…During the dyadic task, the robot verbally interacted with the participant using a speech synthesizer based on the Acapela software. Movement was tracked using the Kinect sensor and the Scene Analyzer software [37]. The aim of having the children interact with the three robots was to explore the differences between the educational robots used by the two universities conducting the study.…”
Section: Methods
confidence: 99%
“…This information is later processed to extract significant social features, which are structured into a "meta-scene" data packet to be transmitted to the rest of the modules. More information about the framework can be found in [37].…”
Section: Automatic Speech Recognition (ASR)
confidence: 99%
“…Finally, the meta-scene is serialized and sent over the network through its corresponding YARP port. Details of the scene analyzer algorithms and processes are reported in [30,39].…”
Section: Face Control Architecture Services
confidence: 99%
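The serialization step quoted above can be illustrated with a minimal sketch. The actual Scene Analyzer schema and YARP wiring are not given in this excerpt, so the field names (`ts`, `people`, `id`, `x`, `y`, `z`) and the use of JSON are assumptions for illustration only:

```python
import json

# Hypothetical meta-scene packet: a timestamp plus one entry per tracked
# person (unique ID and 3-D position), mirroring the description in the
# quoted citation statement.
def serialize_meta_scene(timestamp, people):
    packet = {
        "ts": timestamp,
        "people": [
            {"id": pid, "x": x, "y": y, "z": z}
            for pid, (x, y, z) in people.items()
        ],
    }
    # In the cited architecture, the serialized bytes would then be
    # written to the module's network port (e.g. a YARP port).
    return json.dumps(packet, sort_keys=True)

msg = serialize_meta_scene(12.5, {7: (0.3, 0.1, 1.8)})
```

Any structured, self-describing encoding would serve here; the point is that a single packet per frame carries every tracked person to the downstream modules.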
“…Gaze control is the control system for the robot's neck and eyes [39]. This module receives meta-scene objects containing a list of people in the field of view of the robot, each identified by a unique ID and associated with spatial coordinates (x, y, z).
Section: Face Control Architecture Services
confidence: 99%
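A minimal sketch of how such a module might pick a gaze target from a meta-scene follows. Selecting the nearest person is an assumption made here for illustration; the cited system combines several attention cues (proxemics, verbal and nonverbal signals, habituation) rather than distance alone:

```python
import math

def nearest_person(people):
    """Return the ID of the person closest to the robot's origin.

    `people` maps a unique person ID to (x, y, z) coordinates in the
    robot's frame, as delivered in a meta-scene object.
    """
    if not people:
        return None  # nobody in the field of view
    return min(
        people,
        key=lambda pid: math.dist((0.0, 0.0, 0.0), people[pid]),
    )

target = nearest_person({1: (1.0, 0.0, 2.0), 2: (0.2, 0.1, 0.9)})
# target is 2, the closer of the two tracked people
```

The returned ID would then drive the neck and eye controllers toward that person's coordinates.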
“…The SceneAnalyzer builds upon several other libraries to deliver integrated recognition of multimodal features of the users and their behaviour [29,30]. The physiological signal acquisition module uses non-obtrusive and robust methods for obtaining information about the users' physiological state: by integrating sensors in the robot or in the EASELscope tablet, information can be obtained without sensors worn or strapped to the user's body.…”
Section: Modules and Mapping to the DAC Architecture
confidence: 99%