2017
DOI: 10.1007/s12369-017-0408-9

Foundations of Visual Linear Human–Robot Interaction via Pointing Gesture Navigation

Cited by 45 publications (34 citation statements)
References 25 publications
“…Proximity interaction techniques can take advantage of pointing gestures to intuitively express locations or objects with minimal cognitive overhead; this modality has often been used in HRI research, e.g. for pick-and-place tasks (Brooks & Breazeal; Cosgun, Trevor, & Christensen; Droeschel, Stückler, & Behnke; Großmann et al.), labeling and/or querying information about objects or locations (Akkil & Isokoski; Brooks & Breazeal; Pateraki, Baltzakis, & Trahanias), selecting a robot within a group (Nagi, Giusti, Gambardella, & Di Caro; Pourmehr, Monajjemi, Wawerla, Vaughan, & Mori), and providing navigational goals (Raza Abidi, Williams, & Johnston; Gromov et al.; Jevtić, Doisy, Parmet, & Edan; Tölgyessy et al.; Van den Bergh et al.; Wolf et al.). Such gestures can enable rescue workers to easily direct multiple robots, and robot types, using the same interface (see Figure).…”
Section: State of the Art
confidence: 99%
“…The work of Tölgyessy [20] investigates the use of pointing gestures to give target locations to Roomba-like robots. It uses an RGBD sensor (Kinect) mounted on a 2-DoF robot to interpret the pointing gesture from a person standing 2-3 meters from the setup.…”
Section: Related Work
confidence: 99%
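The point-to-goal computation behind such systems is, at its core, a ray–plane intersection: a ray through two tracked skeleton joints is extended until it meets the floor, and that intersection becomes the robot's navigation goal. The sketch below is a minimal illustration under that assumption, not the method of the cited paper; the choice of shoulder and hand joints, the z-up frame convention, and the tolerance are assumptions made here for illustration.

```python
import numpy as np

def pointing_target(shoulder, hand, ground_z=0.0):
    """Intersect the shoulder->hand pointing ray with the ground plane z = ground_z.

    `shoulder` and `hand` are 3-D joint positions (numpy arrays) in a world
    frame with z pointing up, e.g. skeleton joints from an RGB-D sensor.
    Returns the (x, y, z) point on the ground, or None if the person is not
    pointing toward the floor.
    """
    direction = hand - shoulder
    if direction[2] > -1e-6:        # ray is level or points upward: no floor hit
        return None
    t = (ground_z - shoulder[2]) / direction[2]
    return shoulder + t * direction

# Hypothetical joints: hand 30 cm below and 40 cm in front of the shoulder.
shoulder = np.array([0.0, 0.0, 1.4])    # metres
hand = np.array([0.4, 0.0, 1.1])
print(pointing_target(shoulder, hand))  # -> target roughly 1.9 m ahead on the floor
```

In practice, systems of this kind filter the joint estimates over time and reject targets outside the sensor's reliable range (here, the 2–3 m standing distance mentioned above).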
“…Multiple studies have investigated the use of gestures to navigate a UAV [23,20,16,22]. However, we investigate the collaboration between UAVs and humans to relieve human task load and reduce the UAV's error by processing periodic human input.…”
Section: Related Work
confidence: 99%
“…They allow a person to intuitively and efficiently communicate locations and other spatial notions (trajectories, directions). Research in robotics has used pointing gestures for disparate tasks: pick-and-place [7,8,9], object and area labeling [10], teaching by demonstration [11], point-to-goal [12,13,14], selection of a robot within a group [15,16], and assessment of joint attention [17]. Other works use non-pointing gestures for interacting with co-located robots, e.g.…”
Section: Related Work
confidence: 99%