2011 IEEE International Conference on Robotics and Automation
DOI: 10.1109/icra.2011.5980067
Towards joint attention for a domestic service robot - person awareness and gesture recognition using Time-of-Flight cameras

Abstract: Joint attention between a human user and a robot is essential for effective human-robot interaction. In this work, we propose an approach to person awareness and to the perception of showing and pointing gestures for a domestic service robot. In contrast to previous work, we do not require the person to be at a predefined position, but instead actively approach and orient towards the communication partner. For perceiving showing and pointing gestures and for estimating the pointing direction a Time-of…
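The abstract describes estimating a pointing direction from Time-of-Flight depth data. A common way to do this (not necessarily the paper's exact method) is to cast a ray through two tracked arm points, here assumed to be the elbow and hand positions in metres, and intersect it with the floor plane. A minimal sketch under those assumptions:

```python
def pointing_target_on_floor(elbow, hand, floor_z=0.0):
    """Estimate where a pointing gesture hits the floor plane.

    elbow, hand: (x, y, z) points in metres, e.g. segmented from a
    Time-of-Flight depth image by a body-part tracker (hypothetical
    upstream step, not shown here).
    The pointing ray runs from the elbow through the hand; the target
    is its intersection with the horizontal plane z = floor_z.
    Returns None if the arm does not point towards the floor.
    """
    dx = hand[0] - elbow[0]
    dy = hand[1] - elbow[1]
    dz = hand[2] - elbow[2]
    if abs(dz) < 1e-9:          # ray parallel to the floor plane
        return None
    t = (floor_z - hand[2]) / dz
    if t <= 0:                  # intersection lies behind the hand
        return None
    return (hand[0] + t * dx, hand[1] + t * dy, floor_z)


# Example: arm angled downwards from shoulder height
target = pointing_target_on_floor((0.0, 0.0, 1.4), (0.3, 0.0, 1.2))
# target is roughly 2.1 m in front of the person on the floor
```

Other ray choices (shoulder-hand, eye-hand) drop into the same geometry by swapping the first argument.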



Cited by 30 publications (11 citation statements)
References 28 publications
“…Their results were obtained after testing the algorithm on a purposely built dataset of their own, which included only activities involving one individual. Droeschel et al presented a person awareness and gesture recognition approach for joint attention in a domestic environment [8]. They used time-of-flight cameras to classify between showing and pointing gestures and reported high accuracy.…”
Section: Related Work
confidence: 99%
“…The combination of a depth sensor with a color camera has been exploited in several applications such as object recognition [48,108,2], person awareness, gesture recognition [31], simultaneous localization and mapping (SLAM) [10,64], robotized plant-growth measurement [1], etc. These methods mainly deal with the problem of noise in depth measurement, as examined in chapter 1, as well as with the low resolution of range data as compared to the color data.…”
Section: Related Work
confidence: 99%
“…For example, the robot can draw a user's attention to certain locations in the environment by simply pointing at them. Our robots can also perceive gestures such as pointing, showing of objects, or stop gestures [4]. The robots sense these gestures using the RGB-D camera.…”
Section: Gesture Recognition and Synthesis
confidence: 99%