2014 IEEE International Conference on Image Processing (ICIP)
DOI: 10.1109/icip.2014.7025309

An efficient method for human pointing estimation for robot interaction

Abstract: In this paper, we propose an efficient calibration method to estimate the pointing direction via a human pointing gesture to facilitate robot interaction. The ways in which pointing gestures are used by humans to indicate an object are individually diverse. In addition, people do not always point at the object carefully, which means there is a divergence between the line from the eye to the tip of the index finger and the line of sight. Hence, we focus on adapting to these individual ways of pointing to improv…
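
The abstract describes calibrating away each user's systematic offset between the eye-to-fingertip line and the true line of sight. Below is a minimal sketch, not the authors' method, of one way such a per-user calibration could work: fit a fixed rotation (here via the Kabsch algorithm, an assumption) that maps observed eye-to-fingertip rays onto the true eye-to-target rays from a few calibration samples, then apply it to later gestures. All function names are illustrative.

```python
import numpy as np

def unit(v):
    """Normalize a vector (or the rows of a matrix) to unit length."""
    v = np.asarray(v, dtype=float)
    return v / np.linalg.norm(v, axis=-1, keepdims=True)

def fit_pointing_correction(eyes, fingertips, targets):
    """Kabsch fit: rotation R such that R @ raw_ray is close to true_ray
    for each calibration sample (known target positions)."""
    raw = unit(np.asarray(fingertips) - np.asarray(eyes))   # observed rays
    true = unit(np.asarray(targets) - np.asarray(eyes))     # rays to known targets
    H = raw.T @ true
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))                  # guard against reflections
    return Vt.T @ np.diag([1.0, 1.0, d]) @ U.T

def corrected_ray(R, eye, fingertip):
    """Apply the calibrated correction to a new pointing gesture."""
    return R @ unit(np.asarray(fingertip) - np.asarray(eye))
```

A single rotation is the simplest model of an individual's pointing habit: it can be estimated from only a handful of calibration targets, which matches the abstract's goal of adapting to individual pointing styles efficiently.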

Cited by 9 publications (3 citation statements)
References 12 publications
“…In human-computer interaction (HCI) and HRI research we can identify two classes of methods for estimating pointing rays: head-rooted and arm-rooted. The head-rooted techniques consider a pointing ray that originates somewhere within the head: from a dominant eye ([23]), cyclops eye ([24]), or head centroid ([25,26]). Arm-rooted methods assume the ray originates from a point lying on the pointing arm: at the shoulder, elbow ([25,7,27,28]), wrist ([8]), or index finger ([27]).…”
Section: Related Work
confidence: 99%
“…goal points were assumed, and the pointing gesture direction was only detected along the horizontal axis. [16], [17], [18], [19], [20] used the skeleton tracker provided by the Kinect NITE library, which struggles with occluded body parts and also performs poorly at very close distances. [21] proposed a probabilistic appearance-based model trained with images captured from different viewpoints; it is independent of the user's body posture and does not require full-body or partial-body postures for detection. However, it relies on hand and finger pose, which are only available at close range since many hand pixels are required.…”
Section: Related Work
confidence: 99%
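
The failure mode described above (occluded joints from a skeleton tracker) is typically handled by rejecting low-confidence joints before estimating a pointing angle. The sketch below, under assumed data layout, computes a horizontal-axis pointing angle from shoulder and wrist joints; the per-joint confidence tuple and threshold are illustrative, and NITE's actual API differs.

```python
import math

CONF_THRESHOLD = 0.5  # assumed cutoff; would need tuning per tracker

def horizontal_pointing_angle(shoulder, wrist):
    """Yaw angle (radians) of the shoulder-to-wrist vector, ignoring height.
    Each joint is (x, y, z, confidence); returns None if either joint is
    unreliable, e.g. because it is occluded or the user is too close."""
    for joint in (shoulder, wrist):
        if joint[3] < CONF_THRESHOLD:
            return None
    dx = wrist[0] - shoulder[0]
    dz = wrist[2] - shoulder[2]
    return math.atan2(dx, dz)  # 0 rad = pointing straight ahead along +z
```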
“…There is research in which robots recognize the target pointed at by a person and make use of it in daily life [Ueno et al., 2014]. A mathematical model of a gesture-based pointing interface system has also been proposed to simulate pointing behavior in various situations [Kondo et al., 2018].…”
Section: Related Projects
confidence: 99%