2023
DOI: 10.1109/access.2023.3344531
Applications of Self-Supervised Learning to Biomedical Signals: A Survey

Federico Del Pup,
Manfredo Atzori

Abstract: Over the last decade, deep learning applications in biomedical research have exploded, demonstrating their ability to often outperform previous machine learning approaches in various tasks. However, training deep learning models for biomedical applications requires large amounts of data annotated by experts, whose collection is often time- and cost-prohibitive. Self-Supervised Learning (SSL) has emerged as a prominent solution to this problem, as it allows learning powerful representations from vast unlabeled …

Cited by 4 publications (2 citation statements) | References 185 publications
“…Del Pup conducted a comprehensive survey on the applications of SSL to biomedical signals, highlighting its potential for extracting meaningful representations from diverse biomedical data sources [46]. This is particularly relevant considering the inherent challenge faced by both the biomedical and cybersecurity domains in acquiring labeled data at scale, owing to factors such as privacy concerns, data scarcity, and the need for domain expertise.…”
Section: B. Self-Supervised Learning (mentioning)
confidence: 99%
“…Expectation-maximization (EM) and multiple instance learning jointly infer labels and recognize discriminative regions, enabling training from cheaper forms of weak supervision. G. Self-supervised/unsupervised learning: Self-supervised learning has also gained vast attention in CV by enabling pre-training from the sheer ubiquity of unlabeled visual data [384]- [393]. Pretext tasks like predicting image rotations, solving jigsaw puzzles, or counting pixel colors allow models to learn rich visual representations applicable to downstream tasks.…”
Section: F. Weakly-Supervised Learning (mentioning)
confidence: 99%
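The rotation-prediction pretext task mentioned in the citation statement above can be sketched in a few lines: the "labels" cost nothing because they are generated from the data itself. The helper below is a hypothetical illustration (not code from the survey), assuming images are 2-D NumPy arrays; the rotation index then serves as the free classification target for pre-training.

```python
import numpy as np

def make_rotation_pretext_batch(images, rng=None):
    """Build a self-supervised batch for the rotation-prediction
    pretext task: each image is rotated by a random multiple of
    90 degrees, and the rotation index (0-3) becomes the label."""
    if rng is None:
        rng = np.random.default_rng(0)  # fixed seed for reproducibility
    labels = rng.integers(0, 4, size=len(images))
    rotated = np.stack([np.rot90(img, int(k))
                        for img, k in zip(images, labels)])
    return rotated, labels

# Toy batch of 8 "images" (8x8 grayscale) to show the shapes involved.
imgs = np.arange(8 * 8 * 8, dtype=np.float32).reshape(8, 8, 8)
x, y = make_rotation_pretext_batch(imgs)
```

A downstream model would then be trained to predict `y` from `x`; the representations it learns transfer to the real (labeled) task, which is the core appeal of SSL when expert annotations are scarce.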