ICASSP 2021 - 2021 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
DOI: 10.1109/icassp39728.2021.9414752
Self-Supervised Learning for Sleep Stage Classification with Predictive and Discriminative Contrastive Coding

Abstract: The purpose of this paper is to learn efficient representations from raw electroencephalogram (EEG) signals for sleep stage classification via self-supervised learning (SSL). Although supervised methods have achieved favorable performance, they rely heavily on manually labeled datasets. Recently, SSL has reached performance comparable with fully supervised methods despite limited labeled data by extracting high-level semantic representations. To alleviate this heavy reliance on labels, we propose SleepDPC, a novel s…
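Sleep staging models of this kind operate on 30-second EEG epochs grouped into short sequences. As an illustrative preprocessing sketch (not the paper's own code; sampling rate, epoch length, and sequence length are assumptions), a raw single-channel recording can be sliced as follows:

```python
import numpy as np

def make_epoch_sequences(eeg, fs=100, epoch_sec=30, seq_len=10):
    """Slice a raw single-channel EEG recording into sequences of
    30-second epochs, the input unit commonly used by sequence-based
    sleep staging models (illustrative only, not the paper's pipeline).
    """
    samples_per_epoch = fs * epoch_sec
    n_epochs = len(eeg) // samples_per_epoch
    # Drop the trailing partial epoch, then reshape into whole epochs.
    epochs = eeg[: n_epochs * samples_per_epoch].reshape(n_epochs, samples_per_epoch)
    # Group consecutive epochs into fixed-length sequences.
    n_seqs = n_epochs // seq_len
    return epochs[: n_seqs * seq_len].reshape(n_seqs, seq_len, samples_per_epoch)

# One hour of synthetic 100 Hz EEG -> 120 epochs -> 12 sequences of 10 epochs
seqs = make_epoch_sequences(np.random.randn(3600 * 100))
print(seqs.shape)  # (12, 10, 3000)
```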

Cited by 23 publications (22 citation statements)
References 15 publications
“…• T-Loss (Franceschi, Dieuleveut, and Jaggi 2019): unsupervised scalable representation learning for multivariate time series. • SleepDPC (Xiao et al. 2021): self-supervised learning for sleep staging by predicting future representations and distinguishing epochs from different epoch sequences.…”
Section: Comparison With SSL Baselines
confidence: 99%
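The predictive principle attributed to SleepDPC above — predicting future representations — can be sketched with a CPC-style InfoNCE loss, where each sequence's predicted next-epoch embedding should match its own true future epoch and not those of other sequences in the batch. This is a minimal sketch under assumed shapes and a hypothetical function name, not the paper's exact formulation:

```python
import torch
import torch.nn.functional as F

def predictive_infonce(predicted, future, temperature=0.1):
    """CPC-style predictive contrastive loss (illustrative sketch).

    predicted: (B, D) predicted representation of the next epoch,
               e.g. from an autoregressive model over past epochs.
    future:    (B, D) encoder representation of the true next epoch.
    Each prediction is pulled toward its own future epoch (diagonal
    positives) and pushed away from other sequences' futures.
    """
    predicted = F.normalize(predicted, dim=-1)
    future = F.normalize(future, dim=-1)
    logits = predicted @ future.t() / temperature   # (B, B) similarities
    targets = torch.arange(predicted.size(0))       # positives on the diagonal
    return F.cross_entropy(logits, targets)

# Toy usage with random "representations"
loss = predictive_infonce(torch.randn(8, 128), torch.randn(8, 128))
```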
“…However, both of these paradigms require fully labelled datasets, which are laborious to acquire in the healthcare domain. Recent progress in self-supervised learning (SSL) (Jing and Tian 2019; He et al. 2020; Chen et al. 2020) has yielded promising results for physiological time series, with performance competitive with supervised methods (Franceschi, Dieuleveut, and Jaggi 2019; Xiao et al. 2021). SSL methods can extract semantically rich representations from raw physiological signals, which promise to alleviate the burden of manual labeling.…”
Section: Introduction
confidence: 99%
“…Thus, learning generic representations for neurophysiological signals via self-supervised training remains challenging. Recently, inspired by the success of contrastive learning, some researchers [15,45] proposed contrastive pretraining methods for neurophysiological signals such as the electroencephalogram (EEG). The success of contrastive learning relies on two essential assumptions: (i) augmented views of the same data sample should be semantically consistent in the latent representation space; (ii) strong augmentations are required to learn useful representations.…”
Section: Introduction
confidence: 99%
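Assumption (i) in the quoted passage — that augmented views of the same sample should agree in the latent space — is typically enforced with a SimCLR-style NT-Xent loss. A minimal sketch (function name and shapes are assumptions, not taken from the cited works):

```python
import torch
import torch.nn.functional as F

def nt_xent(z1, z2, temperature=0.5):
    """NT-Xent contrastive loss over two augmented views (illustrative).

    z1, z2: (B, D) embeddings of two augmentations of the same B samples.
    Row i of z1 has its positive at row i of z2 (and vice versa); all
    other rows in the concatenated batch act as negatives.
    """
    n = z1.size(0)
    z = F.normalize(torch.cat([z1, z2]), dim=-1)    # (2B, D)
    sim = z @ z.t() / temperature                   # (2B, 2B) similarities
    sim.fill_diagonal_(-1e9)                        # mask out self-similarity
    # Positive for row i is row i+n; for row i+n it is row i.
    targets = torch.cat([torch.arange(n, 2 * n), torch.arange(n)])
    return F.cross_entropy(sim, targets)

# Toy usage with random "views"
loss = nt_xent(torch.randn(8, 64), torch.randn(8, 64))
```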