Significance: The teleoperation of nonhumanoid robots is often a demanding task, as most current control interfaces rely on mappings between the operator's and the robot's actions that are determined by the design and characteristics of the interface and may therefore be challenging to master. Here, we describe a structured methodology to identify common patterns in spontaneous interaction behaviors, to implement embodied user interfaces, and to select the appropriate sensor type and positioning. Using this method, we developed an intuitive, gesture-based control interface for real and simulated drones that outperformed a standard joystick in terms of learning time and steering ability. Applying this procedure to identify body-machine patterns for specific applications could support the development of more intuitive and effective interfaces.
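As a rough illustration of the pattern-identification step, the sketch below applies principal component analysis (PCA) to recorded body-motion data to extract the dominant movement pattern associated with each steering command. The synthetic data, variable names, and two-command setup are hypothetical stand-ins; the actual methodology involves richer kinematic recordings and statistical analysis.

```python
# Hypothetical sketch: extracting one dominant body-movement pattern per
# steering command with PCA. Shapes and names are illustrative only.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for motion capture: joint angles over time for each
# trial, recorded while participants spontaneously "steer" left or right.
n_trials, n_samples, n_joints = 40, 100, 12
trials = rng.normal(0.0, 0.1, size=(n_trials, n_samples, n_joints))
labels = np.repeat(["turn_left", "turn_right"], n_trials // 2)

# Inject a shared pattern so PCA has something to find (say, torso roll).
pattern = np.zeros(n_joints)
pattern[0] = 1.0  # pretend joint 0 is torso roll
signs = np.where(labels == "turn_left", -1.0, 1.0)
trials += signs[:, None, None] * pattern[None, None, :] * 0.5

def dominant_pattern(data):
    """Return the first principal component of (samples x joints) data."""
    flat = data.reshape(-1, data.shape[-1])
    flat = flat - flat.mean(axis=0)
    # SVD of the centered data; rows of vt are the principal directions.
    _, _, vt = np.linalg.svd(flat, full_matrices=False)
    return vt[0]

for command in ("turn_left", "turn_right"):
    pc1 = dominant_pattern(trials[labels == command])
    print(command, "-> dominant joint index:", int(np.argmax(np.abs(pc1))))
```

A sensor would then be placed where the dominant pattern carries the most energy, which is one plausible reading of the sensor-positioning step.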
Most human-drone interfaces, such as joysticks and remote controllers, require attention and developed skills during teleoperation. Wearable interfaces could enable a more natural and intuitive control of drones, which would make this technology accessible to a larger population of users. In this letter, we describe a soft exoskeleton, called the FlyJacket, designed for naïve users who want to control a drone with upper-body gestures in an intuitive manner. The exoskeleton includes a motion-tracking device to monitor body movements and an arm support system to prevent fatigue, and it is coupled with goggles that provide a first-person view from the drone's perspective. Tests were performed with participants flying a simulated fixed-wing drone moving at a constant speed; participants' performance was more consistent when using the FlyJacket with the arm support than when performing the same task with a remote controller. Furthermore, participants felt more immersed, had a greater sensation of flying, and reported less fatigue when the arm support was enabled. The FlyJacket has also been demonstrated for the teleoperation of a real drone. Index Terms: Human-robot interaction, telerobotics and teleoperation, virtual reality and interfaces, wearable robots.
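A minimal sketch of the kind of gesture-to-command mapping such an interface requires, assuming torso roll and pitch angles are already available from the motion-tracking device; the gain, deadzone, and saturation values below are illustrative choices, not the FlyJacket's actual parameters.

```python
# Illustrative torso-to-drone mapping: torso roll/pitch angles (radians)
# become attitude setpoints for a fixed-wing drone flying at constant
# speed. All constants are made-up values, not FlyJacket parameters.
import math

DEADZONE_RAD = math.radians(2.0)   # ignore small postural tremor
MAX_CMD_RAD = math.radians(30.0)   # saturate the commanded attitude
GAIN = 1.5                         # amplify torso motion

def _shape(angle):
    """Apply deadzone, gain, and saturation to one axis."""
    if abs(angle) < DEADZONE_RAD:
        return 0.0
    cmd = GAIN * (angle - math.copysign(DEADZONE_RAD, angle))
    return max(-MAX_CMD_RAD, min(MAX_CMD_RAD, cmd))

def torso_to_drone(torso_roll, torso_pitch):
    """Map torso attitude (radians) to drone roll/pitch setpoints."""
    return _shape(torso_roll), _shape(torso_pitch)

# Example: leaning 10 degrees to the right and 5 degrees forward.
roll_cmd, pitch_cmd = torso_to_drone(math.radians(10), math.radians(5))
print(f"roll {math.degrees(roll_cmd):.1f} deg, pitch {math.degrees(pitch_cmd):.1f} deg")
```

The deadzone keeps the drone steady when the torso is near neutral, which is one reason an arm support that stabilizes posture could make performance more consistent.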
Most human-robot interfaces, such as joysticks and keyboards, require training and constant cognitive effort, and they provide only a limited degree of awareness of the robot's state and its environment. Embodied interactions, that is, the bidirectional link between the physical bodies and control systems of the robot and of the human, could not only enable a more intuitive control of robots, even for novices, but also provide users with more immersive sensations. However, providing an embodied interaction by mapping human movements onto a non-anthropomorphic robot is particularly challenging. In this paper, we describe a natural and immersive embodied interaction that allows users to control and experience drone flight with their own bodies. The setup uses a commercial flight simulator that tracks hand movements and provides haptic and visual feedback. The paper discusses how to map body movements to drone motion and how the resulting embodied interaction provides a more natural and immersive flight experience to unskilled users than a conventional RC remote controller.
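To make concrete how such attitude commands could drive a constant-speed simulated drone, here is a hedged kinematic sketch using a coordinated-turn approximation: commanded roll sets the turn rate and commanded pitch sets the climb rate. The model and all constants are simplifications for illustration, not the simulator used in the paper.

```python
# Toy constant-speed fixed-wing kinematics: a roll command turns the
# drone (coordinated-turn approximation) and a pitch command makes it
# climb or descend. Model and constants are illustrative simplifications.
import math

SPEED = 12.0  # m/s, constant forward speed (assumed)
G = 9.81      # m/s^2, gravitational acceleration
DT = 0.02     # s, simulation time step

def step(x, y, z, heading, roll_cmd, pitch_cmd):
    """Advance position and heading one step given attitude commands (rad)."""
    # Coordinated turn: yaw rate = g * tan(roll) / speed.
    heading += (G * math.tan(roll_cmd) / SPEED) * DT
    # Treat commanded pitch as the flight-path angle.
    z += SPEED * math.sin(pitch_cmd) * DT
    x += SPEED * math.cos(pitch_cmd) * math.cos(heading) * DT
    y += SPEED * math.cos(pitch_cmd) * math.sin(heading) * DT
    return x, y, z, heading

# Hold a gentle 15-degree right bank for two seconds of simulated time.
state = (0.0, 0.0, 50.0, 0.0)
for _ in range(100):
    state = step(*state, roll_cmd=math.radians(15), pitch_cmd=0.0)
print(f"x={state[0]:.1f} m, y={state[1]:.1f} m, heading={math.degrees(state[3]):.1f} deg")
```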