2021
DOI: 10.48550/arxiv.2111.11629
Preprint
Uncertainty-Aware Deep Co-training for Semi-supervised Medical Image Segmentation

Abstract: Semi-supervised learning has made significant strides in the medical domain since it alleviates the heavy burden of collecting abundant pixel-wise annotated data for semantic segmentation tasks. Existing semi-supervised approaches enhance the ability to extract features from unlabeled data with prior knowledge obtained from limited labeled data. However, due to the scarcity of labeled data, the features extracted by the models are limited in supervised learning, and the quality of predictions for unlabeled dat…

Cited by 1 publication (2 citation statements)
References 27 publications
“…For instance, Rizve et al [47] proposed a unified thresholding approach that combines probability and uncertainty thresholds to select low-entropy regions for SSL. Furthermore, Zheng et al [38] employed Monte Carlo Sampling as an estimation method to obtain low-entropy and high-entropy maps. By emphasizing the valuable regions (i.e.…”
Section: Contrastive Learning
confidence: 99%
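The Monte Carlo sampling approach described above can be sketched as follows: run several stochastic (e.g. dropout-enabled) forward passes, average the softmax outputs, and compute a per-pixel entropy map that separates low-entropy (confident) from high-entropy (uncertain) regions. This is a minimal illustrative sketch, not the cited authors' implementation; the `(T, C, H, W)` input layout and the `threshold` cutoff are assumptions for the example.

```python
import numpy as np

def entropy_maps(prob_samples, threshold=0.5):
    """Split pixels into low- and high-entropy regions from T stochastic passes.

    prob_samples: array of shape (T, C, H, W) holding softmax outputs from
    T dropout-enabled forward passes (an assumed layout for this sketch).
    threshold: hypothetical entropy cutoff separating the two region maps.
    """
    mean_prob = prob_samples.mean(axis=0)                            # (C, H, W)
    # Pixel-wise predictive entropy; small epsilon avoids log(0).
    entropy = -(mean_prob * np.log(mean_prob + 1e-12)).sum(axis=0)   # (H, W)
    low_entropy_mask = entropy < threshold
    return entropy, low_entropy_mask, ~low_entropy_mask

# Toy example: 8 stochastic passes, 2 classes, a 4x4 "image".
rng = np.random.default_rng(0)
logits = rng.normal(size=(8, 2, 4, 4))
probs = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)
entropy, low, high = entropy_maps(probs)
```

In a semi-supervised setup, the low-entropy mask would typically gate which unlabeled pixels contribute to the consistency or pseudo-label loss.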
“…To validate the overall segmentation performance of the proposed method, we conducted experiments and compared the results with those obtained using state-of-the-art SSL methods. These methods included Mean Teacher (MT) [33], EM [45], Deep Adversarial Network (DAN) [31], Deep Co-Training (DCT) [38], Cross Pseudo Supervision (CPS) [46], Cross Teaching between CNN and Transformer (CTCT) [43], and Uncertainty Rectified Pyramid Consistency (URPC) [21]. For a fair comparison, all methods used the same backbone and training settings.…”
Section: Comparison With State-of-the-arts
confidence: 99%