Proceedings of the 22nd International Conference on Intelligent User Interfaces 2017
DOI: 10.1145/3025171.3025179
Adaptive View Management for Drone Teleoperation in Complex 3D Structures

Cited by 17 publications (9 citation statements)
References 9 publications
“…As an alternative to UAV navigation from egocentric views, direct commands can be issued in an adaptive exocentric perspective (Saakes et al., 2013; Thomason et al., 2017, 2019) or from a 3D map view (Materna et al., 2017). The exocentric view can improve the operator's understanding of the environment and further increase safety and task performance.…”
Section: Graphical User Interfaces
confidence: 99%
“…This raises the interesting question of whether the performance of our teleoperation system is also preserved when put into practice. Compared to a variety of related teleoperation systems with similar mission complexity (Cho et al., 2017; Riestock et al., 2017b; Thomason et al., 2017), we evaluate the performance of our system with a user study under real-world constraints (section 6.1).…”
Section: Related Work
confidence: 99%
“…The robotic visual assistant selected a view to keep all the objects of interest (e.g., manipulated objects, robot, obstacles) in its field of view while reasoning about the composition of the objects in the view. The second was based on the geometry of the environment, task, or robot [31,32,33,34,35,36,20,104,105,106]. The robotic visual assistant selected a view that was not occluded by the environment or that provided additional geometrical information about the relation of the environment to the primary robot or task.…”
Section: Deliberative Autonomous Visual Assistants
confidence: 99%
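The view-selection idea described above can be illustrated with a minimal sketch: score candidate viewpoints by how many objects of interest fall inside the camera's field of view, and pick the best one. This is not the cited papers' actual method; all names (`in_fov`, `score_view`, `best_view`) and the simplifications (unit view directions, a symmetric angular field of view, no occlusion test) are assumptions for illustration only.

```python
import math

def in_fov(camera_pos, camera_dir, target, fov_deg):
    """Return True if target lies within fov_deg degrees of the view direction.

    camera_dir is assumed to be a unit vector (illustrative simplification).
    """
    to_target = tuple(t - c for t, c in zip(target, camera_pos))
    norm = math.sqrt(sum(v * v for v in to_target))
    if norm == 0:
        return True  # camera sits on the target
    cos_angle = sum(d * v for d, v in zip(camera_dir, to_target)) / norm
    return cos_angle >= math.cos(math.radians(fov_deg / 2))

def score_view(camera_pos, camera_dir, objects, fov_deg=90.0):
    """Score a candidate viewpoint by the fraction of objects it keeps in view."""
    visible = sum(in_fov(camera_pos, camera_dir, o, fov_deg) for o in objects)
    return visible / len(objects)

def best_view(candidates, objects, fov_deg=90.0):
    """Pick the (position, direction) candidate covering the most objects."""
    return max(candidates, key=lambda c: score_view(c[0], c[1], objects, fov_deg))
```

A real deliberative visual assistant would add occlusion checks against the environment geometry and compositional criteria, but the core loop of ranking candidate views against a task-derived objective is the same.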
“…Other flight interfaces address the problem of collision-free navigation by introducing exocentric views of the UAV and the surrounding scene. For example, Thomason et al. [8] enable a remote operator to navigate a UAV inside complex 3D structures. They combine a virtual exocentric view of the UAV with an adaptive view method to improve situation awareness and avoid collisions.…”
Section: Related Work
confidence: 99%