2011
DOI: 10.1016/j.imavis.2010.08.006
Real-time 3D pointing gesture recognition for mobile robots with cascade HMM and particle filter

Cited by 61 publications (39 citation statements)
References 25 publications (35 reference statements)
“…Moreover, he used pointing to identify an area on the screen to which an object should be moved. Instructions by gesture have subsequently been applied in robotics [9,16]. Often, speech commands are added to specify the intention of the gesture.…”
Section: Pointing Gesture Applications
confidence: 99%
“…This angular offset corresponds to the vertical offset. If the user points at the object with just a small pointing gesture [9], this vertical offset is relatively large. Even in this case, our method can adapt to a small pointing gesture.…”
Section: Estimation Of the Angular Offset
confidence: 99%
“…Note that, from the comparison of the accuracy among the (a), (b), and (c) camera views, the highest level of accuracy is from (b) quarter view, and the accuracy is about the same as that of (c) side view and (a) frontal view. Many conventional methods [1,2,4,6,9,10] captured the pointing gestures from a frontal camera. We assume that the easiest way is to detect the users from a frontal camera.…”
Section: Experimental Environment
confidence: 99%
“…Typical examples include: daily activities detection [1], medical rehabilitation and assistance [2], user identification and authentication [3], sign language interpretation [4], computer games [5], augmented reality [6], user interfaces [7] and robot control [8].…”
confidence: 99%