2017
DOI: 10.1177/1541931213601483

Natural Language Based Multimodal Interface for UAV Mission Planning

Abstract: As the number of viable applications for unmanned aerial vehicle (UAV) systems increases at an exponential rate, interfaces that reduce the reliance on highly skilled engineers and pilots must be developed. Recent work aims to make use of common human communication modalities such as speech and gesture. This paper explores a multimodal natural language interface that uses a combination of speech and gesture input modalities to build complex UAV flight paths by defining trajectory segment primitives. Gesture in…
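
The composition step the abstract describes — turning recognized speech and gesture events into trajectory segment primitives and chaining them into a flight path — can be sketched roughly as below. This is a minimal illustration assuming a simple fusion rule in which speech names the segment shape and the following gesture supplies its geometry; all names (`SegmentPrimitive`, `build_flight_path`, the event format) are hypothetical, not the authors' implementation.

```python
from dataclasses import dataclass

# Hypothetical trajectory segment primitive: the paper composes flight
# paths from such segments; the field names here are illustrative only.
@dataclass
class SegmentPrimitive:
    shape: str         # e.g. "line", "arc", "circle"
    length_m: float    # segment length in meters
    heading_deg: float # segment heading

def build_flight_path(events):
    """Fuse (modality, value) events into an ordered list of segments.

    A speech event names the segment shape; the next gesture event
    supplies its geometry. This fusion rule is an assumption, not the
    paper's actual grammar.
    """
    path, pending_shape = [], None
    for modality, value in events:
        if modality == "speech":
            pending_shape = value              # e.g. "line"
        elif modality == "gesture" and pending_shape:
            length, heading = value            # from a gesture recognizer
            path.append(SegmentPrimitive(pending_shape, length, heading))
            pending_shape = None
    return path

# Example: two multimodal commands -> a two-segment flight path
events = [("speech", "line"), ("gesture", (10.0, 90.0)),
          ("speech", "arc"),  ("gesture", (5.0, 45.0))]
for seg in build_flight_path(events):
    print(seg)
```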

Cited by 13 publications (8 citation statements) · References 15 publications
“…Other alternative interfaces that rely on motion tracking devices, such as the Leap Motion controller, can detect gestures such as forearm supination, thereby providing hands-free steering wheel rotation (Akyol and Canzler, 2000; Chandarana et al., 2017). As in the case of joysticks, however, the sensor for the motion tracking device may be installed in a location that adds more distance between the hands of the driver and the steering wheel, in comparison to the sEMG-based interface.…”
Section: Design of the Steering Assistance System
confidence: 99%
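
As a rough illustration of how such a motion-tracking gesture could drive steering, the sketch below maps a tracked palm-roll angle (forearm supination/pronation) linearly onto a steering wheel angle. The mapping, limits, and function names are assumptions for illustration, not details from the cited systems.

```python
import math

def roll_to_steering(palm_roll_rad, max_roll_rad=math.radians(60),
                     max_wheel_deg=90.0):
    """Map forearm supination/pronation (palm roll) to a wheel angle.

    The linear mapping and the range limits are illustrative
    assumptions, not parameters from the cited systems.
    """
    # Clamp the tracked roll to the usable range of the gesture.
    roll = max(-max_roll_rad, min(max_roll_rad, palm_roll_rad))
    # Scale linearly into the wheel's angular range.
    return (roll / max_roll_rad) * max_wheel_deg

# Example: 30 degrees of supination -> 45 degrees of wheel rotation
print(roll_to_steering(math.radians(30)))  # 45.0
```
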
“…Gesture-based drone interfaces have been actively researched (e.g., [20,52,58]) to make drone control more intuitive and fun. Voice-based control has also been introduced [8,9,48]. These interfaces are intuitive and allow segments of the drone movement paths to be input.…”
Section: AR-Based Egocentric Drone Interface
confidence: 99%
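
As a toy example of the kind of gesture input such drone interfaces rely on, the sketch below classifies a directional swipe from a short trail of tracked palm positions. The thresholds, axes, and labels are illustrative assumptions, not any cited system's recognizer.

```python
def classify_swipe(trail, min_travel=0.15):
    """Classify a directional swipe from palm positions.

    trail: list of (x, y, z) palm positions in meters, oldest first.
    The 0.15 m travel threshold is an illustrative assumption.
    """
    dx = trail[-1][0] - trail[0][0]
    dy = trail[-1][1] - trail[0][1]
    if max(abs(dx), abs(dy)) < min_travel:
        return "hover"  # not enough motion to count as a swipe
    if abs(dx) >= abs(dy):
        return "right" if dx > 0 else "left"
    return "up" if dy > 0 else "down"

# Example: a rightward swipe of ~0.3 m
print(classify_swipe([(0.0, 0.0, 0.2), (0.1, 0.01, 0.2), (0.3, 0.02, 0.2)]))
```
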
“…Chandarana et al. [35] presented custom-developed software using speech and gesture recognition for UAV path planning. In their research, the authors compared natural language interfaces against a mouse-based baseline, evaluating individual users who were required to complete a flight path composed of flight trajectories or actions.…”
Section: Speech Control
confidence: 99%