2020
DOI: 10.1038/s42256-020-00247-1
Self-supervised retinal thickness prediction enables deep learning from unlabelled data to boost classification of diabetic retinopathy

Abstract: Access to large, annotated samples represents a considerable challenge for training accurate deep-learning models in medical imaging. While current leading-edge transfer learning from pre-trained models can help with cases lacking data, it limits design choices, and generally results in the use of unnecessarily large models. We propose a novel, self-supervised training scheme for obtaining high-quality, pre-trained networks from unlabeled, cross-modal medical imaging data, which will allow for creating accurat…
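The abstract describes a two-stage scheme: self-supervised pre-training on a cross-modal proxy target (retinal thickness predicted from unlabelled imaging), then fine-tuning on scarce labelled diabetic-retinopathy data. Below is a minimal PyTorch sketch of that general pattern only; the encoder architecture, tensor shapes, thickness target, and five-grade classification head are illustrative assumptions, not the authors' actual model.

```python
# Sketch of the two-stage scheme from the abstract (illustrative, not the
# authors' architecture): (1) pre-train an encoder on a cross-modal proxy
# regression target, (2) reuse it for classification with few labels.
import torch
import torch.nn as nn
import torch.nn.functional as F

class Encoder(nn.Module):
    def __init__(self, feat_dim=128):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.fc = nn.Linear(64, feat_dim)

    def forward(self, x):
        return self.fc(self.conv(x).flatten(1))

encoder = Encoder()

# Stage 1: self-supervised pre-training -- regress a thickness value
# derived from another modality; no diagnostic labels are needed.
thickness_head = nn.Linear(128, 1)
pretrain = nn.Sequential(encoder, thickness_head)
opt = torch.optim.Adam(pretrain.parameters(), lr=1e-3)
fundus = torch.randn(8, 3, 64, 64)     # placeholder image batch
thickness = torch.randn(8, 1)          # placeholder cross-modal targets
loss = F.mse_loss(pretrain(fundus), thickness)
opt.zero_grad(); loss.backward(); opt.step()

# Stage 2: keep the pre-trained encoder, fit a small classifier head
# on the (much smaller) labelled set, e.g. 5 DR severity grades.
clf = nn.Sequential(encoder, nn.Linear(128, 5))
labels = torch.randint(0, 5, (8,))
loss = F.cross_entropy(clf(fundus), labels)
```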

Cited by 66 publications (44 citation statements). References 39 publications.
“…After close inspection of all combinations (Figure 2 only shows the top-5 combinations), we observed that SimCLR was more effective than ImageNet pre-training in combination with supervised learning. This observation is in alignment with recent papers showing that SimCLR and other self-supervised methods outperform ImageNet pre-training on biomedical applications 34,35,36 . However, our analysis showed that ImageNet and SimCLR pre-training performed comparably when combined with a semi-supervised method.…”
Section: Discussion (supporting)
confidence: 90%
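The statement contrasts SimCLR pre-training with ImageNet initialization. For reference, here is a minimal sketch of SimCLR's NT-Xent contrastive loss, assuming PyTorch; the batch size, projection dimension, and temperature are illustrative placeholders, not the settings used in the cited works.

```python
# Minimal NT-Xent (SimCLR) contrastive loss -- a reference sketch with
# illustrative shapes and temperature.
import torch
import torch.nn.functional as F

def nt_xent(z1, z2, tau=0.5):
    """z1, z2: (N, D) projections of two augmented views of the same images."""
    n = z1.size(0)
    z = F.normalize(torch.cat([z1, z2], dim=0), dim=1)  # (2N, D), unit norm
    sim = z @ z.t() / tau                               # scaled cosine similarity
    sim.fill_diagonal_(float('-inf'))                   # exclude self-pairs
    # The positive for row i is the other view of the same image:
    # index i+N for the first half of the batch, i-N for the second.
    targets = torch.arange(2 * n, device=z.device).roll(n)
    return F.cross_entropy(sim, targets)

z1, z2 = torch.randn(8, 128), torch.randn(8, 128)  # placeholder projections
print(nt_xent(z1, z2))
```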
“…These research gaps are the primary motivations of the proposal in this article. Firstly, all the cited works use the supervised learning (SL) scheme to train their models, which is known to require a massive amount of data 39 . Even in scenarios where adequate data is available, training DL models typically takes many hours or days to complete 32 .…”
Section: Introduction (mentioning)
confidence: 99%
“…Transfer learning has emerged as a machine learning paradigm for such scenarios (Pan and Yang 2010; Neyshabur, Sedghi, and Zhang 2020), where we have access to datasets from multiple sources (known as source domains) and want to make predictions for a dataset of interest (known as the target domain); it has been employed in a variety of problems (Taroni et al. 2019; Raghu et al. 2019; Holmberg et al. 2020; Hu et al. 2020). Various methods of transfer learning have been proposed in the context of drug response prediction.…”
Section: Introduction (mentioning)
confidence: 99%
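The statement refers to the general transfer-learning paradigm of reusing source-domain knowledge for a target domain. A common concrete instance is training only a new head on top of frozen pre-trained weights; the sketch below assumes torchvision's ResNet-18 with ImageNet weights and a two-class target task, all illustrative choices unrelated to the cited drug-response methods.

```python
# Illustrative transfer-learning pattern: freeze a source-domain encoder
# (ImageNet-pretrained ResNet-18 here) and train only a new target head.
import torch
import torch.nn as nn
from torchvision import models

backbone = models.resnet18(weights="IMAGENET1K_V1")   # source-domain weights
for p in backbone.parameters():
    p.requires_grad = False                           # keep transferred layers fixed
backbone.fc = nn.Linear(backbone.fc.in_features, 2)   # fresh head, trainable

opt = torch.optim.Adam(backbone.fc.parameters(), lr=1e-3)
x = torch.randn(4, 3, 224, 224)                       # placeholder target batch
y = torch.randint(0, 2, (4,))
loss = nn.functional.cross_entropy(backbone(x), y)
opt.zero_grad(); loss.backward(); opt.step()
```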