2023
DOI: 10.1109/tbme.2023.3252889
Self-Supervised Learning for Annotation Efficient Biomedical Image Segmentation

Abstract: The scarcity of high-quality annotated data is omnipresent in machine learning. In biomedical segmentation applications in particular, experts must spend much of their time on annotation due to the complexity of the data. Hence, methods that reduce this effort are desired. Methods: Self-Supervised Learning (SSL) is an emerging field that increases performance when unannotated data are available. However, thorough studies on segmentation tasks with small datasets are still lacking. A comprehensive qualitative and quant…

Cited by 5 publications (1 citation statement)
References 57 publications
“…Deep learning is well-known and has been used in previous biomedical studies (Kha et al, 2022 ; Yuan et al, 2023 ). For example, SSL-based segmentation applications on biomedical microscopic images were highly adopted (Shurrab and Duwairi, 2022 ; Rettenberger et al, 2023 ; Sánchez et al, 2023 ). However, gathering sufficient volumes of annotated microscopy images is a tedious and time-consuming task that requires considerable domain expertise.…”
Section: Related Work
confidence: 99%