2021
DOI: 10.48550/arxiv.2110.09966
Preprint
SleepPriorCL: Contrastive Representation Learning with Prior Knowledge-based Positive Mining and Adaptive Temperature for Sleep Staging

Abstract: The objective of this paper is to learn semantic representations for sleep stage classification from raw physiological time series. Although supervised methods have achieved remarkable performance, they are limited in clinical settings by their requirement for fully labeled data. Self-supervised learning (SSL) based on contrasting semantically similar (positive) and dissimilar (negative) pairs of samples has achieved promising success. However, existing SSL methods suffer from the problem that many semantically s…
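The contrastive setup the abstract describes (and the "adaptive temperature" in the title) builds on the standard InfoNCE-style objective: an anchor's embedding is pulled toward positive samples and pushed away from negatives, with a temperature scaling the similarities. As a rough illustration only (this is not the authors' implementation; the function name, toy vectors, and fixed temperature are hypothetical), such a loss can be sketched as:

```python
import math

def info_nce_loss(anchor, positives, negatives, temperature=0.1):
    """Average InfoNCE loss for one anchor over a set of positives,
    contrasted against a set of negatives. All inputs are plain
    lists of floats of equal length."""

    def cos(a, b):
        # Cosine similarity between two vectors.
        dot = sum(x * y for x, y in zip(a, b))
        na = math.sqrt(sum(x * x for x in a))
        nb = math.sqrt(sum(x * x for x in b))
        return dot / (na * nb)

    # Temperature-scaled similarities (logits).
    pos_logits = [cos(anchor, p) / temperature for p in positives]
    neg_logits = [cos(anchor, n) / temperature for n in negatives]
    all_logits = pos_logits + neg_logits

    # Numerically stabilised log-sum-exp over all candidates.
    m = max(all_logits)
    log_denom = m + math.log(sum(math.exp(z - m) for z in all_logits))

    # Negative log-likelihood of each positive, averaged.
    return sum(log_denom - z for z in pos_logits) / len(pos_logits)
```

Lowering the temperature sharpens the softmax, penalising hard negatives more strongly; SleepPriorCL's contribution, per the title, is to adapt this temperature rather than fix it, and to mine positives using prior knowledge instead of relying solely on augmentations.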

Cited by 1 publication (1 citation statement); references 28 publications.
“…In [172], a contrastive representation learning model named SleepPriorCL, based on prior knowledge, is proposed. The goal of this model is to learn a meaningful semantic representation for similar samples in a self-supervised manner, without using any labels.…”
Section: Contrastive Learning
confidence: 99%