This work presents a supervised machine-learning approach to build an expert system that supports the neuroscientist in automatically classifying ERP data and matching them with a multisensorial alphabet of stimuli. To this end, two different approaches are considered: a hierarchical tree-based algorithm, XGBoost, and feedforward neural networks, highlighting the pros and cons of both approaches in the different steps of the classification task. Moreover, the sensitivity of the tool's classification capabilities as a function of the number of available electrodes is also studied, highlighting what can be achieved by applying the method with commercial, wearable EEG systems. The main novelty of this work is a significant enlargement of the pool of stimuli that the expert system can recognize, spanning different, possibly mixed, sensory domains. The obtained results open the way to the design of portable devices for augmented communication systems, which can be of particular interest for the development of advanced Brain-Computer Interfaces (BCI) for communication with different types of neurologically impaired patients.
Objective: A majority of BCI systems enabling communication with patients with locked-in syndrome are based on electroencephalogram (EEG) frequency analysis (e.g., linked to motor imagery) or P300 detection. Only recently has the use of event-related brain potentials (ERPs) received much attention, especially for face or music recognition, but neuro-engineering research into this new approach has not yet been carried out. The aim of this study was to provide a variety of reliable ERP markers of visual and auditory perception for the development of new and more complex mind-reading systems for reconstructing mental content from brain activity.

Methods: A total of 30 participants were shown 280 color pictures (adult, infant, and animal faces; human bodies; written words; checkerboards; and objects) and 120 auditory files (speech, music, and affective vocalizations). The paradigm did not involve target selection, to avoid artifactual waves linked to decision-making and response preparation (e.g., P300 and motor potentials) that would mask the neural signature of semantic representation. Overall, 12,000 ERP waveforms × 126 electrode channels (1,512,000 ERP waveforms in total) were processed and artifact-rejected.

Results: Clear and distinct category-dependent markers of perceptual and cognitive processing were identified through statistical analyses, some of which were novel to the literature. Results are discussed in light of current knowledge of ERP functional properties and with respect to machine-learning classification methods previously applied to similar data.

Conclusion: The statistical analyses showed a high level of accuracy (p ≤ 0.01) in discriminating the perceptual categories eliciting the various electrical potentials. Therefore, the ERP markers identified in this study could be significant tools for optimizing BCI systems [pattern recognition or artificial intelligence (AI) algorithms] applied to EEG/ERP signals.
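The Methods mention that the ERP waveforms were processed and artifact-rejected before analysis. The NumPy snippet below is a minimal sketch of that kind of preprocessing step, on synthetic single-channel data: peak-to-peak amplitude rejection followed by per-category averaging. The 100 µV threshold, trial counts, and sampling parameters are illustrative assumptions, not the study's actual settings.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic single-channel epochs: 100 trials x 512 samples (microvolts),
# with a few high-amplitude "blink" trials mixed in; labels 0/1 stand for
# two stimulus categories.
epochs = rng.normal(0.0, 5.0, (100, 512))
labels = rng.integers(0, 2, 100)
epochs[:5, 100:150] += 200.0  # simulated blink artifacts on the first 5 trials

# Peak-to-peak amplitude rejection: drop any trial exceeding 100 uV
keep = np.ptp(epochs, axis=1) < 100.0
clean, clean_labels = epochs[keep], labels[keep]

# Category-wise averaging yields one ERP waveform per stimulus class
erp_per_class = {c: clean[clean_labels == c].mean(axis=0) for c in (0, 1)}
print(f"kept {keep.sum()} of {len(epochs)} trials")
```

Averaging across artifact-free trials is what turns noisy single-trial EEG into the category-dependent ERP waveforms whose markers the study identifies statistically.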