2022
DOI: 10.1371/journal.pone.0269500

Multi-scale ResNet and BiGRU automatic sleep staging based on attention mechanism

Abstract: Sleep staging is the basis of sleep evaluation and a key step in the diagnosis of sleep-related diseases. Despite being useful, existing sleep staging methods have several disadvantages, such as relying on hand-crafted feature extraction, failing to recognize temporal sequence patterns in long-term associated data, and reaching the accuracy upper limit of sleep staging. Hence, this paper proposes an automatic electroencephalogram (EEG) sleep signal staging model, which is based on Multi-scale Attention Resi…

Cited by 13 publications (4 citation statements)
References 35 publications
“…The ResNet model effectively mitigates degradation and vanishing gradients during training by stacking residual structures [33]. Networks with fewer layers lack certain feature representation capabilities.…”
Section: Power Fingerprint Identification Based On Transferred CBAM-R...
confidence: 99%
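The skip connection this statement refers to can be sketched in a few lines of plain Python. The `transform` here is a stand-in for the cited model's convolutional layers (illustrative only, not the paper's architecture); the point is that the identity path survives the sum, so gradients can flow through arbitrarily deep stacks:

```python
def residual_block(x, transform):
    """Apply a transform and add the input back (skip connection).

    The identity term in `x + transform(x)` is what mitigates
    vanishing gradients: even if the transform's gradient shrinks,
    the gradient through the identity path is unchanged.
    """
    return [xi + fi for xi, fi in zip(x, transform(x))]

def residual_stack(x, transform, depth):
    """Stack `depth` residual blocks, as ResNet does."""
    for _ in range(depth):
        x = residual_block(x, transform)
    return x

# Degenerate case: with a zero transform, a deep residual stack
# reduces exactly to the identity mapping.
out = residual_stack([1.0, 2.0], lambda v: [0.0 for _ in v], depth=10)
```

This degenerate case illustrates why residual stacks are easy to train: each block only has to learn a perturbation of the identity, not the full mapping.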
“…Enhanced variants of the RNN model include the gated recurrent unit (GRU) [29], long short-term memory (LSTM), and bidirectional LSTM (BiLSTM) [30]. The GRU is an enhanced version of the LSTM that extracts and learns long-term correlations between features sequentially in a single direction [31]. GRU models have fewer parameters than LSTMs, which reduces computation cost.…”
Section: Introduction
confidence: 99%
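The parameter savings mentioned above come from the GRU's two-gate design: an update gate and a reset gate, with no separate cell state or output gate as in the LSTM. A minimal scalar sketch of one GRU step (the weight names in `p` are illustrative, not from any cited model):

```python
import math

def sigmoid(a):
    return 1.0 / (1.0 + math.exp(-a))

def gru_step(x, h, p):
    """One GRU step for scalar input x and previous hidden state h.

    p is a dict of scalar weights. The update gate z interpolates
    between the old state and a candidate state, which is how the GRU
    carries long-term correlations forward in one direction with
    fewer parameters than an LSTM.
    """
    z = sigmoid(p["wz"] * x + p["uz"] * h + p["bz"])        # update gate
    r = sigmoid(p["wr"] * x + p["ur"] * h + p["br"])        # reset gate
    cand = math.tanh(p["wh"] * x + p["uh"] * (r * h) + p["bh"])  # candidate
    return (1.0 - z) * h + z * cand                          # new state
```

With z near 0 the old state is copied through almost unchanged, which is the mechanism that lets the GRU preserve information across long sequences.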
“…AttnSleep [17], developed by Eldele et al., is an attention-based deep learning architecture that leverages multi-resolution CNNs to extract low- and high-frequency features of EEG signals, employs an adaptive feature recalibration technique to improve the quality of the extracted features by modeling the inter-dependencies between them, and uses a multi-head attention mechanism to capture the temporal relationships among sleep stages. In another study, Liu et al. [18] employed parallel residual neural networks with improved channel and spatial feature attention units for multi-scale feature extraction of EEG sleep signals, while a bi-directional gated recurrent unit (Bi-GRU) was utilized to determine the dependence between sleep stages. Among the above-mentioned deep learning models, multi-layer architectures of one-dimensional convolutional neural networks (1D-CNNs) have been shown to be effective in extracting informative features from EEG signals for sleep staging.…”
Section: Introduction
confidence: 99%
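The Bi-GRU dependence modeling described above amounts to running a recurrent step over the epoch sequence in both time directions and pairing the forward and backward states, so each sleep epoch sees context from both past and future stages. A minimal sketch with a stand-in `step` function (not the actual GRU computation or weights):

```python
def bidirectional(seq, step, h0=0.0):
    """Run a recurrent `step` over seq forward and backward, pairing
    the hidden states per time step, as a Bi-GRU does so that each
    position is conditioned on both preceding and following context.
    """
    fwd, h = [], h0
    for x in seq:                 # left-to-right pass
        h = step(x, h)
        fwd.append(h)
    bwd, h = [], h0
    for x in reversed(seq):       # right-to-left pass
        h = step(x, h)
        bwd.append(h)
    bwd.reverse()                 # realign backward states with time order
    return list(zip(fwd, bwd))    # concatenated (fwd, bwd) state per step
```

In a real Bi-GRU the `step` would be the gated update and the paired states would be concatenated vectors fed to a classifier head; the control flow, however, is exactly this two-pass structure.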