2023
DOI: 10.3389/fnins.2023.1148855
EEGformer: A transformer-based brain activity classification method using EEG signal

Abstract: Background: Effective analysis methods for steady-state visual evoked potential (SSVEP) signals are critical in supporting an early diagnosis of glaucoma. Most efforts have focused on adapting existing techniques to the SSVEP-based brain–computer interface (BCI) task rather than proposing new ones specifically suited to the domain. Method: Given that electroencephalogram (EEG) signals possess temporal, regional, and synchronous characteristics of brain activity, we proposed a transformer-based EEG analysis model kn…

Cited by 10 publications (6 citation statements)
References 32 publications (30 reference statements)
“…Thirdly, the utilization of attention mechanisms enables the identification of influential connections, pinpointing nodes that play crucial roles in the network. Lastly, we adopt a multi-scale perspective by separately extracting features from two graph convolutional layers and concatenating them, providing a comprehensive representation of node features (Wan et al., 2023b, c). These combined strategies contribute to the enhanced performance of our algorithm.…”
Section: Results
Mentioning confidence: 99%
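The multi-scale strategy quoted above, extracting node features from two graph convolutional layers and concatenating them, can be illustrated with a minimal NumPy sketch. All sizes here (5 nodes, 8-dimensional inputs, 16-dimensional layer outputs) are hypothetical, and the symmetric normalization is one common GCN convention, not necessarily the cited paper's exact formulation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy graph: 5 nodes, symmetric adjacency with self-loops (hypothetical)
A = np.array([[1, 1, 0, 0, 1],
              [1, 1, 1, 0, 0],
              [0, 1, 1, 1, 0],
              [0, 0, 1, 1, 1],
              [1, 0, 0, 1, 1]], dtype=float)
D_inv_sqrt = np.diag(1.0 / np.sqrt(A.sum(axis=1)))
A_norm = D_inv_sqrt @ A @ D_inv_sqrt      # symmetrically normalized adjacency

X = rng.standard_normal((5, 8))           # node features: 5 nodes x 8 dims
W1 = rng.standard_normal((8, 16))         # layer-1 weights (toy sizes)
W2 = rng.standard_normal((16, 16))        # layer-2 weights

relu = lambda z: np.maximum(z, 0)
H1 = relu(A_norm @ X @ W1)                # features after first GCN layer
H2 = relu(A_norm @ H1 @ W2)               # features after second GCN layer

# Multi-scale view: concatenate per-node features from both layers
H_multi = np.concatenate([H1, H2], axis=1)
print(H_multi.shape)                      # (5, 32)
```

Each node's final representation thus carries both one-hop (H1) and two-hop (H2) neighborhood information, which is the point of concatenating across scales rather than keeping only the deepest layer's output.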
“…In addition, some research mapped EEG signals into feature maps using CNN architecture, implementing self-attention enhancement for latent features. For example, Wan et al. [193] proposed a transformer-based EEG analysis model (EEGformer) for brain activity classification. They deeply convolved EEG signals to obtain channel features and calculated vectors from feature maps along the temporal, regional, and synchronous dimensions.…”
Section: EEG Processing
Mentioning confidence: 99%
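The pipeline described in that excerpt, convolving the EEG to get channel features and then forming vectors along temporal, regional, and synchronous dimensions, can be sketched in NumPy. The channel count, window length, and kernel size below are hypothetical, and this is only an illustration of the three feature views, not the actual EEGformer architecture:

```python
import numpy as np

rng = np.random.default_rng(1)
eeg = rng.standard_normal((8, 256))    # 8 channels x 256 samples (toy data)
kernel = rng.standard_normal(9)        # shared 1D temporal kernel, length 9

# Depthwise temporal convolution: one feature trace per channel
feat = np.stack([np.convolve(ch, kernel, mode="valid") for ch in eeg])  # (8, 248)

# Temporal view: each time step becomes a vector across channels
temporal_tokens = feat.T               # (248, 8)
# Regional view: each channel becomes a vector across time
regional_tokens = feat                 # (8, 248)
# Synchronous view: channel-by-channel correlation of the feature traces
sync = np.corrcoef(feat)               # (8, 8)

print(temporal_tokens.shape, regional_tokens.shape, sync.shape)
```

In a transformer pipeline, each of these views would feed its own attention block, letting the model attend over time steps, over electrode channels, and over inter-channel synchrony separately.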
“…In addition, some research mapped EEG signals into feature maps using CNN architecture, implementing self-attention enhancement for latent features. For example, Wan et al. [193] proposed a transformer-based EEG analysis model (EEGformer) for brain activity classification.…”
Section: Transformers In Brain Sciences
Mentioning confidence: 99%
“…For instance, in emotion recognition, self-supervised learning models using transformers have excelled by capturing relevant features from ECG time-series signals, showcasing superior performance on emotion recognition tasks [53]. Similarly, for human activity recognition, lightweight transformer approaches have been applied successfully using IR-UWB radar [27] and EEG signals [55]. Even WiFi signals have been utilized for indoor human mobility modeling through transformer-based techniques [48].…”
Section: Learning Algorithms For Time Series Signal Based Recognition
Mentioning confidence: 99%