2013
DOI: 10.1117/12.2015956
Visual and tactile interfaces for bi-directional human robot communication

Cited by 13 publications (12 citation statements); references 0 publications.
“…First, the statistical classifier from Barber et al. 18 took the gesture markers as input and classified them. This statistical classifier was trained on 120 samples of each gesture from four researchers (30 samples each), collected prior to experimental data collection.…”

Section: Gesture Recognition

confidence: 99%
“…To record and classify gesture commands from participants, a wireless IMU gesture glove used in previous experiments at UCF was employed. 18 The robot used for performing ISR missions was the Joint Reconnaissance and Mobility Robot (JRMBot). The JRMBot, Figure 4, is a small mobile platform containing various sensors for autonomous navigation, including a Microsoft Kinect camera, a laser range finder, and a simulated compass/GPS via a Polhemus G4 six-degree-of-freedom tracking system.…”

Section: Equipment

confidence: 99%
“…Alternatively, some attention has been paid to established languages such as military gestures [12,14]. Worn sensors like data gloves have become popular, including for control of robotic vehicles [15], because they do not require holding a sensor in the hand. The closest work to SID is perhaps the WITAS project [16], which focused on multi-modal dialogue for UAV operations.…”

Section: Related Work

confidence: 99%
“…The ability of these technologies to engage natural modes of interaction through which information is conveyed, such as gesture and speech, has enabled efficient human-robot teaming 1,2 . The notion of intuitive and natural communication with robots has been a motivating factor in a variety of domains, particularly the military 3 .…”

Section: Introduction

confidence: 99%