As a physiological process and high-level cognitive behavior, emotion is an important subarea of neuroscience research. Emotion recognition across subjects based on brain signals has attracted much attention. Due to individual differences across subjects and the low signal-to-noise ratio of EEG signals, the performance of conventional emotion recognition methods is relatively poor. In this paper, we propose a self-organized graph neural network (SOGNN) for cross-subject EEG emotion recognition. Unlike previous studies based on a pre-constructed, fixed graph structure, the graph structure of SOGNN is dynamically constructed by a self-organized module for each signal. To evaluate the cross-subject EEG emotion recognition performance of our model, leave-one-subject-out experiments are conducted on two public emotion recognition datasets, SEED and SEED-IV. The SOGNN is able to achieve state-of-the-art emotion recognition performance. Moreover, we investigated how performance varies across models with different graph construction techniques or with features from different frequency bands. Furthermore, we visualized the graph structure learned by the proposed model and found that part of the structure coincided with previous neuroscience research. The experiments demonstrated the effectiveness of the proposed model for cross-subject EEG emotion recognition.
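The idea of building a graph dynamically per signal, rather than fixing it in advance, can be illustrated with a minimal sketch. The similarity measure below (scaled dot-product with row-wise softmax normalization) is an assumption for illustration, not necessarily the exact self-organized module used in SOGNN; the function and variable names are hypothetical.

```python
import numpy as np

def self_organized_adjacency(x):
    """Build a per-sample graph adjacency for EEG channels.

    x: array of shape (channels, features) for one EEG sample.
    Returns a (channels, channels) adjacency whose rows sum to 1.

    Illustrative sketch only: similarity is a scaled dot product
    followed by a row-wise softmax; the actual module may differ.
    """
    scores = x @ x.T / np.sqrt(x.shape[1])       # pairwise channel similarity
    scores -= scores.max(axis=1, keepdims=True)  # stabilize softmax numerically
    exp = np.exp(scores)
    return exp / exp.sum(axis=1, keepdims=True)  # normalize each row

# Example: 62 channels (as in SEED) with 5 band-power features each.
sample = np.random.randn(62, 5)
adj = self_organized_adjacency(sample)
```

Because the adjacency is recomputed from each sample's features, two signals from different subjects induce different graphs, which is the motivation for a self-organized rather than fixed structure.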
Detecting and analyzing the event-related potential (ERP) remains an important problem in neuroscience. Due to the low signal-to-noise ratio and complex spatio-temporal patterns of ERP signals, conventional methods usually rely on the ensemble averaging technique for reliable detection, which may obliterate subtle but important information in each trial of ERP signals. Inspired by deep learning methods, we propose a novel hybrid network termed ERP-NET. With its hybrid deep structure, the proposed network is able to learn complex spatial and temporal patterns from single-trial ERP signals. To verify the effectiveness of ERP-NET, we carried out several ERP detection experiments, in which the proposed model achieved state-of-the-art performance. The experimental results demonstrate that the patterns learned by ERP-NET are discriminative ERP components in which the ERP signals are properly characterized. More importantly, as an effective approach to single-trial analysis, ERP-NET is able to discover new ERP patterns that are significant for neuroscience research as well as BCI applications. Therefore, the proposed ERP-NET is a promising tool for research on ERP signals.
With the continuous development of portable noninvasive human sensor technologies such as brain–computer interfaces (BCI), multimodal emotion recognition has attracted increasing attention in the area of affective computing. This paper primarily discusses the progress of research into multimodal emotion recognition based on BCI and reviews three types of multimodal affective BCI (aBCI): aBCI based on a combination of behavior and brain signals, aBCI based on various hybrid neurophysiological modalities, and aBCI based on heterogeneous sensory stimuli. For each type of aBCI, we further review several representative multimodal aBCI systems, including their design principles, paradigms, algorithms, experimental results, and corresponding advantages. Finally, we identify several important issues and research directions for multimodal emotion recognition based on BCI.