2022
DOI: 10.1109/tim.2022.3165280

Spatial-Temporal Feature Fusion Neural Network for EEG-Based Emotion Recognition

Abstract: The spatial correlations and the temporal contexts are indispensable in Electroencephalogram (EEG)-based emotion recognition. However, learning the complex spatial correlations among multiple channels is a challenging problem. In addition, learning the temporal contexts helps emphasize the critical EEG frames, because subjects reach the prospective emotion only during part of the stimuli. Hence, we propose a novel Spatial-Temporal Information Learning Network (STILN) to extract the discriminative features…

Cited by 36 publications (9 citation statements)
References 51 publications (35 reference statements)
“…For example, Zhong et al [17] introduce a regularized graph neural network for EEG-based emotion recognition, incorporating the biological topology of the brain. Wang et al [18] propose a hybrid spatial-temporal feature fusion neural network that simultaneously leverages the extensive emotional information embedded in EEG signals across both temporal and spatial dimensions, aiming at more effectively encapsulating and representing subjects' emotional states. Ye et al [19] present the hierarchical dynamic graph convolutional network, designed to extract dynamic multilayer spatial information interlinking EEG signal channels, thereby enhancing the accuracy of emotion recognition.…”
Section: EEG-Based Multimodal Emotion Recognition Methods
confidence: 99%
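The hybrid spatial-temporal fusion idea referenced in the excerpt above can be illustrated with a minimal sketch: a per-frame spatial encoder followed by temporal attention that weights EEG frames, reflecting the point that only part of the stimulus evokes the target emotion. This is a hypothetical PyTorch outline, not the authors' STILN implementation; the layer sizes, the input shape (batch, frames, channels, band-power features), and the class count are assumptions.

```python
# Hypothetical spatial-temporal fusion sketch for EEG emotion recognition
# (illustrative only; not the published STILN code).
import torch
import torch.nn as nn

class SpatialTemporalFusion(nn.Module):
    def __init__(self, n_channels=62, n_features=5, d_model=64, n_classes=3):
        super().__init__()
        # Spatial branch: mixes information across EEG channels within one frame.
        self.spatial = nn.Sequential(
            nn.Linear(n_channels * n_features, d_model),
            nn.ReLU(),
        )
        # Temporal branch: GRU plus a learned attention that weights frames.
        self.gru = nn.GRU(d_model, d_model, batch_first=True)
        self.attn = nn.Linear(d_model, 1)
        self.classifier = nn.Linear(d_model, n_classes)

    def forward(self, x):                            # x: (B, T, C, F)
        b, t, c, f = x.shape
        h = self.spatial(x.reshape(b, t, c * f))     # (B, T, d_model)
        h, _ = self.gru(h)                           # temporal context per frame
        w = torch.softmax(self.attn(h), dim=1)       # frame-level attention weights
        fused = (w * h).sum(dim=1)                   # weighted temporal pooling
        return self.classifier(fused)

logits = SpatialTemporalFusion()(torch.randn(4, 10, 62, 5))
print(logits.shape)  # torch.Size([4, 3])
```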
“…Capturing spatial correlation among electrodes could provide discriminative information to recognize emotional states. Wang et al (2022b) transformed EEG signals into 2D topographic maps and adopted a convolutional neural network (CNN) to learn the spatial dependencies. Although this method enables EEG spatial learning, the high dimensional topographic maps obtained by the interpolation method could potentially cause feature redundancy.…”
Section: Related Work 2.1 Spatial Learning of EEG Signals for Emotion ...
confidence: 99%
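A minimal sketch of the "2D topographic map + CNN" approach described in this excerpt, assuming per-channel band-power features scattered onto a coarse 9x9 scalp grid; the electrode-to-grid mapping, grid size, and network layers are illustrative placeholders rather than the montage or architecture used in the cited work, and no interpolation is performed here (unlike the method discussed).

```python
# Illustrative topographic-map construction plus a small CNN spatial encoder.
import torch
import torch.nn as nn

N_BANDS, GRID = 5, 9
# Hypothetical electrode -> (row, col) positions on the 9x9 grid.
ELECTRODE_POS = {"Fp1": (0, 3), "Fp2": (0, 5), "Fz": (2, 4), "Cz": (4, 4),
                 "Pz": (6, 4), "O1": (8, 3), "O2": (8, 5)}

def to_topographic_map(band_power):
    """band_power: dict mapping electrode name -> tensor of N_BANDS power values."""
    grid = torch.zeros(N_BANDS, GRID, GRID)
    for name, (r, c) in ELECTRODE_POS.items():
        grid[:, r, c] = band_power[name]
    return grid  # (bands, 9, 9); unpopulated cells stay zero instead of interpolated

cnn = nn.Sequential(
    nn.Conv2d(N_BANDS, 16, kernel_size=3, padding=1), nn.ReLU(),
    nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(32, 3),
)

sample = {name: torch.rand(N_BANDS) for name in ELECTRODE_POS}
logits = cnn(to_topographic_map(sample).unsqueeze(0))  # add batch dimension
```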
“…The transformer-based model emerges as a powerful approach, capable of learning discriminative spatial information extending from the electrode level to the brain-region level, in pursuit of better capturing EEG spatial dependencies and enhancing the accuracy of emotion recognition [22,23].…”
Section: Exploring Past Work: A Brief Literature Review
confidence: 99%
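As a rough illustration of the transformer-based spatial learning mentioned in this excerpt, the sketch below treats each electrode as one token so that self-attention can model dependencies from the electrode level toward the brain-region level. It is an assumption-laden outline (the feature dimension, number of heads and layers, and emotion classes are placeholders), not the models proposed in [22,23].

```python
# Hedged sketch: transformer encoder over electrode tokens for spatial learning.
import torch
import torch.nn as nn

n_electrodes, d_model = 62, 64
embed = nn.Linear(5, d_model)                 # assume 5 band-power features per electrode
encoder = nn.TransformerEncoder(
    nn.TransformerEncoderLayer(d_model=d_model, nhead=4, batch_first=True),
    num_layers=2,
)
head = nn.Linear(d_model, 3)                  # 3 emotion classes, for illustration

x = torch.randn(8, n_electrodes, 5)           # (batch, electrodes, features)
tokens = embed(x)                             # (batch, electrodes, d_model)
spatial = encoder(tokens).mean(dim=1)         # pool over electrode tokens
logits = head(spatial)
```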