The present study proposes a hybrid brain-computer interface (BCI) with 64 selectable items based on the fusion of P300 and steady-state visually evoked potential (SSVEP) brain signals. With this approach, row/column (RC) P300 and two-step SSVEP paradigms were integrated to create two hybrid paradigms, which we denote as the double RC (DRC) and 4-D spellers. In each hybrid paradigm, the target is detected simultaneously from both the P300 and SSVEP potentials measured by electroencephalography. We further propose a maximum-probability estimation (MPE) fusion approach that combines the P300 and SSVEP responses at the score level, and we compared this approach with approaches based on linear discriminant analysis, a naïve Bayes classifier, and support vector machines. The experimental results obtained from thirteen participants indicated that the 4-D hybrid paradigm outperformed the DRC paradigm and that MPE fusion achieved higher accuracy than the other approaches. Importantly, 12 of the 13 participants using the 4-D paradigm achieved an accuracy of over 90%, and the average accuracy was 95.18%. These promising results suggest that the proposed hybrid BCI system could be used in the design of a high-performance BCI-based keyboard.
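The score-level fusion described above can be sketched as follows. This is a minimal illustration, not the paper's exact estimator: the softmax normalization, the conditional-independence (product) rule, and all function names are assumptions made for the sketch.

```python
import numpy as np

def mpe_fusion(p300_scores, ssvep_scores):
    """Score-level fusion sketch: map each modality's classifier scores
    to probabilities with a softmax, multiply them (assuming the two
    responses are conditionally independent), and return the index of
    the item with the maximum joint probability."""
    def softmax(x):
        e = np.exp(x - np.max(x))  # shift for numerical stability
        return e / e.sum()

    p_p300 = softmax(np.asarray(p300_scores, dtype=float))
    p_ssvep = softmax(np.asarray(ssvep_scores, dtype=float))
    joint = p_p300 * p_ssvep
    return int(np.argmax(joint))
```

In this toy form, an item only wins when it scores well under both modalities, which is the intuition behind fusing P300 and SSVEP evidence rather than relying on either signal alone.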
Most P300 event-related potential (ERP)-based brain-computer interface (BCI) studies focus on gaze-shift-dependent BCIs, which cannot be used by people who have lost voluntary eye movement. However, the performance of visual saccade-independent P300 BCIs is generally poor. To improve saccade-independent BCI performance, we propose a bimodal P300 BCI approach that simultaneously employs auditory and tactile stimuli. The proposed P300 BCI is a vision-independent system because no visual interaction is required of the user. Specifically, we designed a direction-congruent bimodal paradigm in which auditory and tactile stimuli are presented randomly and simultaneously from the same direction. Furthermore, the channels and number of trials were tailored to each user to improve online performance. Across 12 participants, the average online information transfer rate (ITR) of the bimodal approach improved by 45.43% and 51.05% over the auditory-only and tactile-only approaches, respectively. Importantly, the average online ITR of the bimodal approach, including the break time between selections, reached 10.77 bits/min. These findings suggest that the proposed bimodal system holds promise as a practical visual saccade-independent P300 BCI.
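For context on the ITR figures above, the information transfer rate of a BCI is commonly computed with Wolpaw's formula. The sketch below shows that standard formula; whether this abstract's reported ITR (which includes break time in the selection duration) uses exactly this computation is an assumption.

```python
import math

def wolpaw_itr(n_classes, accuracy, seconds_per_selection):
    """Wolpaw ITR in bits/min: bits per selection times selections per
    minute.  B = log2(N) + P*log2(P) + (1-P)*log2((1-P)/(N-1)),
    where N is the number of selectable items and P the accuracy."""
    p = accuracy
    if p >= 1.0:
        bits = math.log2(n_classes)       # perfect accuracy
    elif p <= 0.0:
        bits = 0.0                        # degenerate case
    else:
        bits = (math.log2(n_classes)
                + p * math.log2(p)
                + (1 - p) * math.log2((1 - p) / (n_classes - 1)))
    return bits * 60.0 / seconds_per_selection
```

For example, a 2-class system at chance level (50% accuracy) transfers 0 bits regardless of speed, which is why raw accuracy alone understates the difference between the unimodal and bimodal approaches.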
Brain-computer interface (BCI) spellers could improve access to communication for people with profound physical disabilities; however, greater speed and accuracy are required to make these spellers practical for everyday use. Here we introduce the combination of P300-speller confidence with the error-related potential (ErrP) to improve online single-trial error detection and correction accuracy in a BCI speller. First, we present a mechanism for obtaining P300 confidence using a real-time Bayesian dynamic-stopping framework that makes novel use of the additional stimuli that occur due to epoch and filter delays. Second, we propose an ensemble of decision trees to combine ErrP and P300-confidence features. Third, we describe the unique attentional differences between error and correct feedback in our spelling interface and discuss how these differences affect ErrP physiology. We tested online error detection on 11 typically developed adults using a BCI system trained on a previous day and found an average sensitivity of 86.67% and specificity of 96.59%. Automatic correction increased selection accuracy by 13.67%, and utility grew by a factor of 4.48. We found, however, that the improved performance was primarily attributable to the inclusion of P300 confidence in error detection, calling into question the significance of single-trial ErrP detection.
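The decision-tree ensemble over combined ErrP and P300-confidence features can be illustrated with a minimal stand-in: bootstrap-aggregated decision stumps (depth-1 trees) voting on concatenated feature vectors. The paper's actual ensemble, features, and hyperparameters are not specified here, and every name below is illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)  # fixed seed for reproducibility

def train_stump(X, y):
    """Fit a depth-1 decision tree: exhaustively search for the single
    feature/threshold/polarity split with the best training accuracy."""
    best = (0, float(X[0, 0]), 1, -1.0)  # (feature, threshold, polarity, acc)
    for f in range(X.shape[1]):
        for t in np.unique(X[:, f]):
            for pol in (1, -1):
                pred = np.where(pol * (X[:, f] - t) > 0, 1, 0)
                acc = float((pred == y).mean())
                if acc > best[3]:
                    best = (f, float(t), pol, acc)
    return best[:3]

def predict_stump(stump, X):
    f, t, pol = stump
    return np.where(pol * (X[:, f] - t) > 0, 1, 0)

def bagged_stumps(X, y, n_trees=50):
    """Train each stump on a bootstrap resample of the training data,
    mimicking how a tree ensemble decorrelates its members."""
    stumps = []
    for _ in range(n_trees):
        idx = rng.integers(0, len(y), len(y))
        stumps.append(train_stump(X[idx], y[idx]))
    return stumps

def predict_ensemble(stumps, X):
    """Majority vote over all stumps (error vs. correct feedback)."""
    votes = np.mean([predict_stump(s, X) for s in stumps], axis=0)
    return (votes > 0.5).astype(int)
```

In the paper's setting, each row of `X` would concatenate ErrP features with the P300-confidence value for a selection, and the label would indicate whether that selection was an error; the same voting logic then drives automatic correction.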