2021
DOI: 10.1038/s41746-021-00510-8

A deep transfer learning approach for wearable sleep stage classification with photoplethysmography

Abstract: Unobtrusive home sleep monitoring using wrist-worn wearable photoplethysmography (PPG) could open the way for better sleep disorder screening and health monitoring. However, PPG is rarely included in large sleep studies with gold-standard sleep annotation from polysomnography. Therefore, training data-intensive state-of-the-art deep neural networks is challenging. In this work, a deep recurrent neural network is first trained using a large sleep data set with electrocardiogram (ECG) data (292 participants, 584 …
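The approach the abstract describes is a pre-train-then-fine-tune transfer learning setup. Below is a minimal sketch of that idea, not the authors' implementation: the network shape, feature dimensions, four-stage grouping, and the dummy data loaders are all illustrative assumptions.

```python
# Illustrative sketch: pre-train a recurrent sleep stager on ECG-derived features,
# then fine-tune it on a smaller PPG-derived data set. Shapes and sizes are made up.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset

class SleepStager(nn.Module):
    """Recurrent per-epoch sleep stage classifier (illustrative architecture)."""
    def __init__(self, n_features=10, hidden=64, n_stages=4):
        super().__init__()
        self.rnn = nn.GRU(n_features, hidden, num_layers=2,
                          batch_first=True, bidirectional=True)
        self.head = nn.Linear(2 * hidden, n_stages)   # e.g. Wake / Light / Deep / REM

    def forward(self, x):               # x: (batch, epochs, features)
        h, _ = self.rnn(x)
        return self.head(h)             # (batch, epochs, n_stages) logits

def train(model, loader, lr, n_passes):
    opt = torch.optim.Adam([p for p in model.parameters() if p.requires_grad], lr=lr)
    loss_fn = nn.CrossEntropyLoss()
    for _ in range(n_passes):
        for x, y in loader:             # y: (batch, epochs) integer stage labels
            opt.zero_grad()
            loss = loss_fn(model(x).flatten(0, 1), y.flatten())
            loss.backward()
            opt.step()

def dummy_loader(n_records):            # placeholder for real ECG/PPG feature data
    x = torch.randn(n_records, 120, 10)             # 120 epochs x 10 features
    y = torch.randint(0, 4, (n_records, 120))
    return DataLoader(TensorDataset(x, y), batch_size=8)

# 1) Pre-train on the large ECG-derived data set.
model = SleepStager()
train(model, dummy_loader(64), lr=1e-3, n_passes=2)

# 2) Transfer: freeze the recurrent layers and fine-tune the classification
#    head on the smaller PPG-derived data set.
for p in model.rnn.parameters():
    p.requires_grad = False
train(model, dummy_loader(16), lr=1e-4, n_passes=2)
```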

Cited by 63 publications (48 citation statements)
References 51 publications (72 reference statements)
“…On the held-out test set of 204 MESA patients, SleepPPG-Net scored a κ of 0.75 against 0.66 for BM-FE and 0.69 for BM-DTS approaches. SleepPPG-Net performance is also significantly (p < 0.001, two-sample t-test) higher than the current published SOTA results for sleep staging from PPG, which stand at a κ of 0.66 [22,23], and significantly (p = 0.02, two-sample t-test) higher than the current SOTA results for sleep staging from ECG, which are reported at a κ of 0.69 [20]. Figure 9 presents an example of the hypnograms generated by BM-FE, BM-DTS and SleepPPG-Net for a single patient.…”
Section: Discussion (contrasting)
confidence: 78%
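The comparisons quoted above are expressed as Cohen's kappa (κ) between a predicted hypnogram and the polysomnography reference. A rough illustration of how a per-recording κ is computed is shown below; the label arrays are placeholders, not data from the cited studies.

```python
# Illustrative only: per-recording Cohen's kappa between predicted and reference
# hypnograms, the agreement metric used in the quoted κ comparisons.
import numpy as np
from sklearn.metrics import cohen_kappa_score

# Hypothetical 30-s epoch labels: 0=Wake, 1=Light, 2=Deep, 3=REM
reference = np.array([0, 0, 1, 1, 2, 2, 2, 3, 3, 0])
predicted = np.array([0, 1, 1, 1, 2, 2, 3, 3, 3, 0])

kappa = cohen_kappa_score(reference, predicted)
print(f"Cohen's kappa: {kappa:.2f}")

# Study-level figures such as κ = 0.75 are typically averaged over all test
# recordings; the difference between two methods can then be tested with a
# two-sample t-test over per-recording κ values, e.g.
# scipy.stats.ttest_ind(kappas_method_a, kappas_method_b).
```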
“…Most works that use PPG do so in the context of transfer learning (TL), where models are trained on a large database of heart rate variability (HRV) measures and then fine-tuned on a smaller database of pulse rate variability (PRV) measures derived from the inter-beat intervals (IBIs) detected on the PPG. These works report κ performance approaching 0.66 [22,23]. Sleep staging from the raw PPG is a relatively novel approach.…”
Section: Introduction (mentioning)
confidence: 99%
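As a hedged sketch of the input pipeline this statement describes, the fragment below detects pulse peaks in a PPG segment, derives inter-beat intervals, and computes a few common PRV features. The peak-detection constraint and the feature set are assumptions, not the exact choices of the cited works.

```python
# Sketch: PPG -> pulse peaks -> inter-beat intervals (IBIs) -> simple PRV features.
import numpy as np
from scipy.signal import find_peaks

def prv_features(ppg, fs):
    """ppg: 1-D PPG signal, fs: sampling frequency in Hz."""
    # Require pulse peaks at least 0.4 s apart (~150 bpm upper bound) -- illustrative.
    peaks, _ = find_peaks(ppg, distance=int(0.4 * fs))
    ibi = np.diff(peaks) / fs * 1000.0                  # inter-beat intervals, ms
    return {
        "mean_ibi": ibi.mean(),                          # mean pulse period
        "sdnn": ibi.std(ddof=1),                         # overall variability
        "rmssd": np.sqrt(np.mean(np.diff(ibi) ** 2)),    # beat-to-beat variability
    }

# Synthetic 60 s "PPG" at 64 Hz with a ~1 Hz pulsatile component, for illustration.
fs = 64
t = np.arange(0, 60, 1 / fs)
ppg = np.sin(2 * np.pi * 1.0 * t) + 0.05 * np.random.randn(t.size)
print(prv_features(ppg, fs))
```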
“…For instance, Yang et al. [12] and Kouchaki et al. [13] used different machine learning algorithms, namely support vector machine (SVM), logistic regression (LR), and random forest (RF), to predict AMR from whole-genome sequencing data and achieved high prediction accuracy. Other approaches also included deep learning to predict new antibiotic drugs, AMR genes, and AMR peptides [14], [15], [16], [17], [18], [19], [20]. However, all of these studies are based on single-drug resistance information and do not take into account the MDR information of the bacteria.…”
Section: Introduction (mentioning)
confidence: 99%
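For illustration only, the snippet below shows the kind of classical ML setup the quoted passage refers to (SVM, logistic regression, and random forest on genomic feature matrices); the feature matrix and labels are synthetic placeholders, not real sequencing data.

```python
# Illustrative AMR-style classification with the three algorithms mentioned above.
import numpy as np
from sklearn.svm import SVC
from sklearn.linear_model import LogisticRegression
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.integers(0, 2, size=(200, 500))   # 200 isolates x 500 binary gene features
y = rng.integers(0, 2, size=200)          # resistant (1) vs susceptible (0)

for clf in (SVC(), LogisticRegression(max_iter=1000), RandomForestClassifier()):
    scores = cross_val_score(clf, X, y, cv=5)
    print(type(clf).__name__, scores.mean())
```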
“…The percentage and distribution of each sleep stage class are then used to evaluate sleep quality. While the simple identification of wake, REM sleep, and NREM sleep can be achieved using ECG [6], photoplethysmography [7], oximetry [8] or body movement [9], EEG or EOG signals are still required to correctly distinguish the different sleep stages. The need for multiple channels has kept the measurement of sleep quality from mass deployment outside of dedicated sleep laboratories.…”
Section: Introduction (mentioning)
confidence: 99%
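A minimal sketch of the sleep-quality summary mentioned above: computing the percentage of epochs spent in each stage of a hypnogram. The stage labels used here are an assumption.

```python
# Summarize a hypnogram by the fraction of time spent in each sleep stage.
from collections import Counter

def stage_percentages(hypnogram):
    """hypnogram: sequence of per-epoch stage labels, e.g. 'W', 'N1', 'N2', 'N3', 'REM'."""
    counts = Counter(hypnogram)
    total = len(hypnogram)
    return {stage: 100.0 * n / total for stage, n in counts.items()}

print(stage_percentages(["W", "W", "N1", "N2", "N2", "N3", "REM", "REM"]))
```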