2022
DOI: 10.48550/arxiv.2207.11191
Preprint
Self-Supervised-RCNN for Medical Image Segmentation with Limited Data Annotation

Cited by 2 publications (2 citation statements)
References 0 publications
“…SSL has been used for medical imaging datasets including US [14] and histology images [15]. Felfeliyan et al [16] applied different distortions to arbitrary areas of unlabeled data and used improved Mask R-CNN models to predict the distortion type and the lost information. He et al [17] proposed Masked Autoencoders (SSL-MAE) for self-supervised pretraining of ViT backbone models.…”
Section: Self-Supervised Learning and Masked Autoencoder (mentioning)
confidence: 99%
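As a reading aid, the distortion-based pretext task described in the statement above can be sketched as follows. This is a minimal illustration only: the distortion set, region sizes, and grayscale float-image assumption are mine, not the configuration used in [16].

import numpy as np

# Hypothetical set of distortion types; [16] may use a different set.
DISTORTIONS = ("gaussian_noise", "mean_fill", "zero_fill")

def make_pretext_sample(image, rng):
    """Corrupt a random rectangular region of a grayscale image in [0, 1].

    Returns the distorted image, the distortion-type label, and a binary mask
    of the corrupted region -- the targets a pretext model (e.g. a Mask R-CNN
    variant) could be trained to predict.
    """
    h, w = image.shape
    rh = int(rng.integers(h // 8, h // 2))      # region height
    rw = int(rng.integers(w // 8, w // 2))      # region width
    y = int(rng.integers(0, h - rh))            # top-left corner of the region
    x = int(rng.integers(0, w - rw))

    distorted = image.copy()
    label = int(rng.integers(len(DISTORTIONS)))
    region = distorted[y:y + rh, x:x + rw]      # view into the copy
    if DISTORTIONS[label] == "gaussian_noise":
        region += rng.normal(0.0, 0.1, region.shape)
    elif DISTORTIONS[label] == "mean_fill":
        region[:] = region.mean()
    else:                                       # zero_fill
        region[:] = 0.0

    mask = np.zeros((h, w), dtype=np.float32)
    mask[y:y + rh, x:x + rw] = 1.0
    return np.clip(distorted, 0.0, 1.0), label, mask

A detection or segmentation network would then be trained on (distorted, label, mask) triples to classify the distortion type and localize or restore the corrupted area; the exact prediction heads and losses of [16] are not reproduced here.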
“…There is a normalization layer between the encoder and decoder in the original SSL-MAE; its impact on the downstream segmentation task was not notable, so it was retained for SSL-MAE pretraining. Influenced by Felfeliyan's work [16], we investigated changing the original mean squared error (MSE) loss function to a root mean squared error (RMSE) plus mean absolute error (MAE) loss over the reconstruction of the masked patches. All other architectural components were kept the same as in the SSL-MAE paper.…”
Section: B. SSL-MAE Self-Supervised Pretraining (mentioning)
confidence: 99%
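The loss change described above (MSE replaced by RMSE plus MAE over the masked-patch reconstruction) can be written down as a short sketch. The tensor shapes, the equal weighting of the two terms, and the PyTorch framing are assumptions, not the citing authors' exact implementation.

import torch

def masked_rmse_mae_loss(pred, target, mask, eps=1e-8):
    """RMSE + MAE computed only over masked patches.

    pred, target: (batch, num_patches, patch_dim) reconstructed and original
    patch pixels; mask: (batch, num_patches) with 1 for masked patches.
    """
    sq_err = ((pred - target) ** 2).mean(dim=-1)    # per-patch mean squared error
    abs_err = (pred - target).abs().mean(dim=-1)    # per-patch mean absolute error

    n_masked = mask.sum() + eps                     # number of masked patches
    rmse = torch.sqrt((sq_err * mask).sum() / n_masked)
    mae = (abs_err * mask).sum() / n_masked
    return rmse + mae

Restricting the loss to mask == 1 follows the masked-autoencoder convention of scoring reconstruction only on the masked patches, which the statement indicates was kept.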