2018 IEEE International Conference on Robotics and Biomimetics (ROBIO)
DOI: 10.1109/robio.2018.8664724
A Real-Time Brain Control Method Based on Facial Expression for Prosthesis Operation

Cited by 6 publications (12 citation statements)
References 17 publications
“…As for the location of the EEG generator, since precise positioning cannot be achieved without extra professional equipment, the dipole was set in accordance with the mechanism. In our serial research on ME-BCI (Zhang et al., 2016, 2021b; Li et al., 2018b; Lu et al., 2018a,b, 2020), data-driven brain connectivity analysis demonstrated the main involvement of the motor cortex (Lu et al., 2018a; Zhang et al., 2021b), which conformed to the facts of contralateral control. Meanwhile, evidence showed that the frontal lobe and limbic system also participate in facial-expression processing (Price and Drevets, 2010; Li et al., 2018b; Lu et al., 2018b).…”
Section: The Head Models and the Dipole Position (supporting)
confidence: 63%
“…According to our previous experimental setup in the serial BCI studies (Lu et al., 2018a, 2020; Zhang et al., 2021b), to realize real-time decoding, continuously collected EEGs (4 s) were sliced into short windows (100 ms) to form the training set. To reproduce the dataset-forming procedure and augment the real EEG samples, the output of the Generator was sliced to window-length size before entering the Discriminator.…”
Section: Methods (mentioning)
confidence: 99%
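The windowing step described in the excerpt above (4 s segments cut into 100 ms training windows) can be sketched as follows; the sampling rate and the channels × samples array layout are assumptions, since the excerpt does not specify them:

```python
import numpy as np

def slice_windows(eeg, fs=1000, win_ms=100):
    """Slice a continuous EEG segment (channels x samples) into
    consecutive non-overlapping short windows for training-set
    construction. fs and win_ms are assumed values; the cited
    study reports 4 s segments cut into 100 ms windows.
    """
    win = int(fs * win_ms / 1000)       # samples per window
    n = eeg.shape[1] // win             # number of full windows
    # Resulting shape: (n_windows, channels, samples_per_window)
    return np.stack([eeg[:, i * win:(i + 1) * win] for i in range(n)])
```

At an assumed 1000 Hz sampling rate, a 4 s, 8-channel segment yields 40 windows of 100 samples each.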
“…Considering the low frequency resolution resulting from the short time window, feature engineering was realized by a classic spatial filtering method: Common Spatial Pattern (CSP) [21]. The CSP features were calculated as:…”
Section: Step B: Interface 'On' Detection (mentioning)
confidence: 99%
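The CSP feature formula is elided from the excerpt above. The textbook CSP recipe it refers to can be sketched as follows; the two-class generalized-eigenvalue formulation and the log-variance features are the standard formulation, not necessarily the paper's exact implementation, and the trial shapes are assumptions:

```python
import numpy as np
from scipy.linalg import eigh

def csp_filters(X1, X2, n_pairs=2):
    """Compute CSP spatial filters from two classes of EEG trials.

    X1, X2: arrays of shape (trials, channels, samples).
    Returns W of shape (2 * n_pairs, channels).
    """
    def avg_cov(X):
        # Mean channel covariance, normalized by total power per trial set
        C = np.mean([np.cov(trial) for trial in X], axis=0)
        return C / np.trace(C)

    C1, C2 = avg_cov(X1), avg_cov(X2)
    # Generalized eigenproblem: C1 w = lambda (C1 + C2) w
    vals, vecs = eigh(C1, C1 + C2)
    order = np.argsort(vals)
    # Keep the filters that maximize variance for one class
    # and minimize it for the other (extreme eigenvalues)
    picks = np.concatenate([order[:n_pairs], order[-n_pairs:]])
    return vecs[:, picks].T

def csp_features(trial, W):
    """Normalized log-variance CSP features for one trial (channels x samples)."""
    Z = W @ trial
    var = np.var(Z, axis=1)
    return np.log(var / var.sum())
```

Each trial is projected through the spatial filters, and the log of the normalized variance of each filtered signal serves as the feature vector for classification.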
“…Compared to the former FE-BCI [21], by limiting the range of facial muscle movement to the micro level, this microFE-BCI greatly reduces the proportion of EMG artifacts and makes it possible to distinguish between daily facial expressions and micro-expressions in EEG-based control. The contrast in the EMG-artifact proportion is particularly obvious in the high-frequency range of the EEG.…”
Section: Offline Assessment, Step A: Obvious Non-microFE-EEGs Exclusion (mentioning)
confidence: 99%
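One simple way to quantify the high-frequency contrast mentioned above is the proportion of spectral power above a cutoff, since EMG dominates the higher frequencies. This is a hedged sketch: the 30 Hz cutoff and the Welch PSD estimator are illustrative choices, not the cited paper's method:

```python
import numpy as np
from scipy.signal import welch

def high_freq_ratio(eeg, fs=1000, cutoff=30.0):
    """Proportion of total signal power above `cutoff` Hz, a rough
    proxy for EMG-artifact contamination in an EEG segment.
    """
    f, psd = welch(eeg, fs=fs, nperseg=min(256, eeg.shape[-1]))
    high = psd[..., f >= cutoff].sum(axis=-1)
    return high / psd.sum(axis=-1)
```

A slow oscillation (e.g. a 5 Hz sine) yields a near-zero ratio, while broadband muscle-like activity pushes the ratio toward one.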
“…Moreover, it needs to satisfy the basic requirements of real-time operation, precision, user-friendliness, and ease of use, similar to the traditional control approaches. An EEG-based control paradigm assisted by facial expression (FE-BCI), proposed by our research group [21]–[25], provides the capability for real-time decoding and control (each output generated from the latest 100 ms of EEG) [21], and has the characteristics of no additional user adaptation, no stimulators, and no nerve adaptability [23].…”
Section: Introduction (mentioning)
confidence: 99%