2022
DOI: 10.1016/j.compbiomed.2022.106248
EEGDnet: Fusing non-local and local self-similarity for EEG signal denoising with transformer

Cited by 14 publications (3 citation statements)
References 21 publications
“…To better separate and recover the original ship-radiated noise before distortion and mixing, an end-to-end nonlinear BSS network based on an attention mechanism is proposed in this paper. Because the Transformer falls short at capturing local self-dependency but excels at learning long-term or global dependencies, while convolutional neural networks (CNN) and RNNs behave in the opposite way [52][53][54][55][56], an end-to-end network combining an RNN and multi-head self-attention, i.e., a recurrent attention neural network (RANN), is utilized. The recurrent attention mechanism has been used in image aesthetics, target detection, flow forecasting and time series forecasting [57][58][59][60][61], but it has not yet been applied to nonlinear BSS.…”
Section: Introduction (mentioning)
confidence: 99%
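The local-vs-global division of labor this excerpt describes can be made concrete. Below is a minimal, hypothetical PyTorch sketch of the recurrent-attention idea; the class name, layer sizes, and the choice of a GRU are illustrative assumptions, not the cited RANN implementation. The RNN models short-range sequential structure, and multi-head self-attention layered on top captures long-range dependencies.

import torch
import torch.nn as nn

class RecurrentAttentionBlock(nn.Module):
    # Hypothetical sketch: RNN for local structure, self-attention for
    # global structure, fused through a residual connection.
    def __init__(self, dim=64, heads=4):
        super().__init__()
        self.rnn = nn.GRU(dim, dim, batch_first=True)
        self.attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.norm = nn.LayerNorm(dim)

    def forward(self, x):          # x: (batch, time, dim)
        h, _ = self.rnn(x)         # RNN captures short-range dependencies
        a, _ = self.attn(h, h, h)  # attention captures long-range dependencies
        return self.norm(h + a)    # residual fusion of the two views

x = torch.randn(8, 256, 64)          # eight 256-step windows, 64 features each
y = RecurrentAttentionBlock()(x)     # output has the same shape as the input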
“…Liu et al [29] proposed a novel recurrent neural network-based EEG state estimation model, which automatically replaces detected noise with estimated values to achieve robust EEG perception. Pu et al [30] proposed EEGDnet, a 1-D EEG signal denoising network with a 2-D Transformer, which can significantly reduce the negative effects of noise and outliers by fusing the non-local self-similarity in the self-attention module with the local self-similarity in the feed-forward module.…”
Section: Introduction (mentioning)
confidence: 99%
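The fusion this excerpt describes maps naturally onto a standard transformer block. The PyTorch sketch below is an assumption-laden illustration, not the published EEGDnet code; names and sizes are hypothetical. Self-attention relates every segment of the signal to every other segment (non-local self-similarity), while the position-wise feed-forward module refines each segment on its own (local self-similarity).

import torch.nn as nn

class DenoisingTransformerBlock(nn.Module):
    # Hypothetical sketch of the fusion: attention = non-local
    # self-similarity, feed-forward = local self-similarity.
    def __init__(self, dim=64, heads=4):
        super().__init__()
        self.attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.ff = nn.Sequential(nn.Linear(dim, 4 * dim), nn.GELU(),
                                nn.Linear(4 * dim, dim))
        self.n1 = nn.LayerNorm(dim)
        self.n2 = nn.LayerNorm(dim)

    def forward(self, x):               # x: (batch, segments, dim)
        h = self.n1(x)
        x = x + self.attn(h, h, h)[0]   # relate each segment to all others
        return x + self.ff(self.n2(x))  # refine each segment locally

A noisy 1-D epoch would first be reshaped into a (segments, samples-per-segment) grid to form the 2-D token layout the quote alludes to; that reshaping step is likewise an assumption here.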
“…The Self-Attention (SA) structure and parallel computing mode allow the transformer to extract global information without multiple convolution and pooling operations. In the BCI field, the transformer has been adopted to handle signals in applications such as person identification (Du et al, 2022), emotion recognition (Li et al, 2022), visual stimulus classification (Bagchi and Bathula, 2022) and signal denoising (Pu et al, 2022). For MI-EEG decoding, Ma et al (2022) proposed a hybrid CNN-Transformer model that weighs spatial features and frequency signals by employing the attention mechanism.…”
Section: Introduction (mentioning)
confidence: 99%
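For the hybrid CNN-Transformer pattern mentioned last, a compact sketch helps fix ideas. Everything here (class name, channel count, kernel size) is hypothetical and stands in for the cited model only schematically: a 1-D convolution extracts local features across the EEG channels at each time step, and self-attention then re-weights those features globally.

import torch.nn as nn

class ConvAttentionEncoder(nn.Module):
    # Hypothetical hybrid sketch: Conv1d front end for local features,
    # self-attention for global re-weighting. Sizes are illustrative.
    def __init__(self, channels=22, dim=64, heads=4):
        super().__init__()
        self.conv = nn.Conv1d(channels, dim, kernel_size=7, padding=3)
        self.attn = nn.MultiheadAttention(dim, heads, batch_first=True)

    def forward(self, x):                 # x: (batch, channels, time)
        f = self.conv(x).transpose(1, 2)  # (batch, time, dim) token sequence
        out, _ = self.attn(f, f, f)       # attention weighs the conv features
        return out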