2013
DOI: 10.1016/j.medengphy.2012.12.005
Commanding a robotic wheelchair with a high-frequency steady-state visual evoked potential based brain–computer interface

Cited by 137 publications (114 citation statements)
References 25 publications
“…[4] proposed a wheelchair robot controlled by EEG signals, and Diez et al. [5] designed a wheelchair robot based on a high-frequency steady-state visual evoked potential brain–computer interface (BCI). These wheelchair robots recognize the operator's intent by acquiring the EEG signal from the scalp of the wheelchair user and converting it into control commands for the wheelchair.…”
Section: Introduction (mentioning)
confidence: 99%
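The pipeline this excerpt describes (acquire EEG from the scalp, detect the attended SSVEP stimulus, map it to a wheelchair command) can be illustrated with a minimal sketch. The sketch below is an assumption-laden illustration, not the method of the cited paper: the stimulus frequencies, sampling rate, single-channel input, and command mapping are all hypothetical choices made for the example, and the detector is a simple band-power comparison rather than the classifier reported by the authors.

```python
import numpy as np
from scipy.signal import welch

# Hypothetical high-frequency stimulus set and the wheelchair commands it encodes.
# These frequencies and this mapping are illustrative, not taken from the paper.
STIMULUS_FREQS = {37.0: "forward", 38.0: "left", 39.0: "right", 40.0: "stop"}
FS = 256.0  # assumed EEG sampling rate (Hz)

def classify_ssvep(eeg_window: np.ndarray) -> str:
    """Return the command whose stimulus frequency carries the most
    spectral power in a single-channel EEG window (e.g., an occipital electrode)."""
    freqs, psd = welch(eeg_window, fs=FS, nperseg=len(eeg_window))
    scores = {}
    for f0, command in STIMULUS_FREQS.items():
        # Sum the PSD in a narrow band around each stimulus frequency.
        band = (freqs >= f0 - 0.25) & (freqs <= f0 + 0.25)
        scores[command] = psd[band].sum()
    return max(scores, key=scores.get)

if __name__ == "__main__":
    # Synthetic 2-second window: noise plus a 38 Hz component ("left").
    t = np.arange(0, 2.0, 1.0 / FS)
    window = 0.5 * np.sin(2 * np.pi * 38.0 * t) + 0.2 * np.random.randn(t.size)
    print(classify_ssvep(window))  # expected: "left"
```

In a real system the winning command would additionally be gated by a detection threshold and smoothed over consecutive windows before being sent to the wheelchair controller.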
“…To command electric-powered wheelchairs, a wide variety of approaches has been proposed, including joysticks [5], EEG [6], EMG [7], hybrid EEG/EMG [8], and even a multi-modal interface that lets the user choose among different communication modalities (eye blinks, eye movements, head movements, blowing or sucking on a straw, and brain signals), depending on the user's level of disability [9]. Nonetheless, even with advances in the state of the art, assistive technologies struggle to become a handy tool.…”
Section: Introduction (mentioning)
confidence: 99%
“…The literature contains a large number of studies that seek to quantify ways of expressing commands in human–computer interface systems using biological signals, such as electroencephalography (EEG) [5], surface electromyography (EMG) [6], gaze tracking [7], and hybrid EEG/EMG interfaces [8], among others [9], with the goal of providing greater comfort for people with disabilities. In 2016, Bissoli et al. [10] proposed a solution that allows wheelchair users to interact with household appliances through an interface that uses gaze direction to control a cursor on a display, selecting the desired household objects and interacting with them through a smart box of their own design.…”
Section: Introduction (unclassified)