2018
DOI: 10.1073/pnas.1718648115

Data-driven body–machine interface for the accurate control of drones

Abstract (Significance): The teleoperation of nonhumanoid robots is often a demanding task, as most current control interfaces rely on mappings between the operator’s and the robot’s actions, which are determined by the design and characteristics of the interface, and may therefore be challenging to master. Here, we describe a structured methodology to identify common patterns in spontaneous interaction behaviors, to implement embodied user interfaces, and to select the appropriate sensor type and positioning. Using this m…

Cited by 69 publications (78 citation statements)
References 54 publications
“…We investigated how users of a body machine interface learn to reorganize or “remap” their body motions as they practice controlling an external object through the BoMI. The controlled object could be a wheelchair, a robotic assistant, or a drone [15-17]. Here we focus on the control of a computer cursor whose two-dimensional coordinates determine its location on a computer screen.…”
Section: Results
confidence: 99%
“…A broad spectrum of sensors, such as inertial measurement units (IMUs) placed on the head [37] or electroencephalography (EEG) systems [38], are available for detecting and decoding movement intentions. A body machine interface (BoMI) captures residual body motions by optical [17, 39, 40], accelerometric [15, 41], or electromyographic sensors [42], and maps the sensor signals onto commands for external devices such as powered wheelchairs [15] or drones [17], or onto computer inputs. At the other end of the spectrum, brain-machine interfaces decode motor intention from neural activity recorded in motor or premotor cortical areas [19, 20, 43].…”
Section: Discussion
confidence: 99%
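The mapping from body signals to device commands described above is often built as a linear dimensionality reduction fitted during a calibration phase. The following is a minimal sketch under that assumption (a PCA-based map, which is one common BoMI design, not necessarily the one used in the cited papers); all function names and the 8-sensor setup are illustrative.

```python
import numpy as np

def fit_bomi_map(calibration_signals, n_commands=2):
    """Fit a linear map from body signals to a low-dimensional command space.

    calibration_signals: (n_samples, n_sensors) array of freely produced
    body motions recorded during calibration. Returns (mean, components)
    so that commands = (sample - mean) @ components.T.
    """
    mean = calibration_signals.mean(axis=0)
    centered = calibration_signals - mean
    # Leading right singular vectors = principal directions of body motion.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return mean, vt[:n_commands]

def decode(mean, components, sensor_sample):
    """Map one sensor reading to a command vector (e.g., cursor x/y)."""
    return (sensor_sample - mean) @ components.T

# Usage: calibrate on recorded motions, then decode live samples.
rng = np.random.default_rng(0)
calib = rng.normal(size=(500, 8))  # 8 body sensors, 500 calibration samples
mean, comps = fit_bomi_map(calib, n_commands=2)
cmd = decode(mean, comps, calib[0])
print(cmd.shape)  # (2,)
```

Fitting the map to each user's own spontaneous motions is what lets the interface exploit the movements that come most naturally, rather than imposing a fixed sensor-to-command assignment.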
“…This wearable suit tracks the torso orientation and converts it into drone commands. The design of the exosuit and its ergonomics are suited to this flight style, which has been identified as a natural and intuitive approach that naïve users adopt to fly fixed-wing drones [5]. The user sits on a backless stool and bends the torso forward and backward in the sagittal plane to control the pitch-down and pitch-up maneuvers, respectively.…”
Section: Flyjacket Hardware
confidence: 99%
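The torso-to-pitch conversion described above can be sketched as a simple angle-to-command map. This is a hedged illustration, not the FlyJacket implementation: the dead zone, lean limits, and sign convention are assumptions chosen for the example.

```python
def torso_to_pitch_command(torso_pitch_deg, dead_zone_deg=5.0, max_lean_deg=30.0):
    """Map torso lean in the sagittal plane to a pitch command in [-1, 1].

    Positive torso_pitch_deg (leaning forward) -> pitch-down command;
    negative (leaning backward) -> pitch-up. A small dead zone around
    upright posture ignores postural sway; the output saturates at the
    maximum lean. All angle values are illustrative assumptions.
    """
    if abs(torso_pitch_deg) < dead_zone_deg:
        return 0.0  # inside the dead zone: no command
    sign = 1.0 if torso_pitch_deg > 0 else -1.0
    magnitude = (abs(torso_pitch_deg) - dead_zone_deg) / (max_lean_deg - dead_zone_deg)
    return sign * min(magnitude, 1.0)

print(torso_to_pitch_command(3.0))    # 0.0 (small sway, ignored)
print(torso_to_pitch_command(30.0))   # 1.0 (full forward lean)
print(torso_to_pitch_command(-17.5))  # -0.5 (half backward lean)
```

A dead zone of this kind is a common choice in posture-driven interfaces because the operator's upright posture is never perfectly still.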
“…In order to make drones more accessible to non-expert users and to facilitate their direct control in demanding tasks such as inspection or rescue missions, several studies have investigated the use of gestures [3], [4]. In a previous study, the authors identified an intuitive upper-body movement pattern that naïve users exploited to fly a fixed-wing drone [5]. This embodied flight style, which allows the user to directly control the pitch and roll of a drone using torso movements, reduces learning time and increases performance compared to the use of traditional remote controllers.…”
Section: Introduction
confidence: 99%