Quadcopter flight control using a low-cost hybrid interface with EEG-based classification and eye tracking
2014
DOI: 10.1016/j.compbiomed.2014.04.020

Cited by 101 publications (55 citation statements)
References 26 publications

“…By combining different biosignal sensors such as EEG, EMG, and eye tracking, researchers have demonstrated promising success controlling prosthetic limbs and even quadrotor flight [42,25,23,32]. This paper focuses on safety-critical tasks where robot operation must be corrected by a human supervisor with low latency.…”
Section: Hybrid Control Methods For Human-Robot Interaction (mentioning)
confidence: 99%
“…Finally, Conati and Merten (2007) discuss work they have done on using pupil dilation information, also gathered through eye-tracking data, to further improve model accuracy. Kim et al (2014) propose a wearable hybrid interface where eye movements and mental concentration directly influence the control of a quadcopter in three-dimensional space. This noninvasive and low-cost interface addresses limitations of previous work by supporting users to complete their complicated tasks in a constrained environment in which only visual feedback is provided.…”
Section: Intelligent User Interface (mentioning)
confidence: 99%
“…Based on the visual feedback, the subjects used the interface to navigate along pre-set target locations in the air. The flight performance was evaluated by comparing with a keyboard-based interface (Kim et al 2014).…”
Section: Intelligent User Interface (mentioning)
confidence: 99%
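
To make the hybrid control idea described above concrete, the following is a minimal, hypothetical sketch of how a gaze direction and an EEG-derived concentration score could be fused into quadcopter velocity commands. It is not the authors' implementation: the normalised input ranges, the concentration threshold, and the gain are illustrative assumptions, as is the QuadCommand structure.

# Hypothetical sketch (not the method from Kim et al. 2014): fuse a normalised
# gaze vector and an EEG "concentration" score into quadcopter velocity commands.
from dataclasses import dataclass

@dataclass
class QuadCommand:
    roll: float      # lateral velocity command
    pitch: float     # forward/backward velocity command
    throttle: float  # vertical velocity command

def map_inputs(gaze_x: float, gaze_y: float, concentration: float,
               conc_threshold: float = 0.6, gain: float = 0.5) -> QuadCommand:
    """Gaze direction drives horizontal motion; the EEG concentration score
    gates altitude (climb while above the threshold, hold otherwise)."""
    clamp = lambda v: max(-1.0, min(1.0, v))
    roll = gain * clamp(gaze_x)
    pitch = gain * clamp(gaze_y)
    throttle = gain if concentration >= conc_threshold else 0.0
    return QuadCommand(roll, pitch, throttle)

# Example: gaze to the right with high concentration -> move right and climb.
print(map_inputs(gaze_x=0.8, gaze_y=0.1, concentration=0.75))

Thresholding the concentration score is just one simple way to gate vertical motion; the actual EEG classifier and command mapping used in the cited paper are described in the original publication.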
“…In [19,20], the multimodality is used to improve the behavior of an EOG interface (in combination with EEG signals) or a BCI (combined with MEG signals) respectively. EEG was also combined with eye tracking to control a quadcopter [21].…”
Section: Introduction (mentioning)
confidence: 99%