2007
DOI: 10.1109/tie.2007.892728
Natural Interface Using Pointing Behavior for Human–Robot Gestural Interaction

Cited by 115 publications (39 citation statements)
References 8 publications
“…[33]- [35]) and estimating the referred-to target has been addressed by several authors in recent years, aiming at applications in robotics (e.g. [36]- [39]), smart environments (e.g. [40]) and wearable visual interfaces (e.g.…”
Section: Related Work
confidence: 99%
“…Sato [9] uses a pointing gesture to give commands to a service robot in virtual space. In her research, the pointing direction is estimated using markers on the user's cap and hand glove.…”
Section: Related Work
confidence: 99%
“…[22]), and robots (e.g. [21,26,35,42,45]). Unfortunately, most of these systems require that the objects present in the scene have already been detected, segmented, recognized, or categorized, and/or their attributes identified.…”
Section: Pointing
confidence: 99%