ICASSP 2021 - 2021 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
DOI: 10.1109/icassp39728.2021.9414297

Semi-Supervised Skin Lesion Segmentation with Learning Model Confidence

Abstract: Segmentation of skin lesions is important for disease diagnosis and treatment planning. Over the years, semi-supervised methods using pseudo labels have boosted segmentation performance with limited labeled data and abundant unlabeled data. However, unreliable targets in pseudo labels might provide meaningless guidance for unlabeled data. In this paper, to solve this issue, we propose a novel confidence-aware semi-supervised learning method based on a mean teacher scheme. Concretely, we design a confide…
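The abstract describes a mean teacher scheme in which a learned confidence estimate is used to suppress unreliable pseudo-label targets. The sketch below is a minimal, hypothetical illustration of that idea, not the paper's exact formulation: the function name, the sigmoid/MSE loss form, and the assumption that a confidence map in [0, 1] is available are all assumptions made here for clarity.

```python
import torch

def confidence_weighted_consistency(student_logits, teacher_logits, confidence):
    """Consistency loss between student and teacher predictions,
    down-weighted where the estimated confidence is low.

    confidence: per-pixel weights in [0, 1], e.g. produced by a confidence
    branch trained under the guidance of labeled data (hypothetical input).
    """
    student_prob = torch.sigmoid(student_logits)
    teacher_prob = torch.sigmoid(teacher_logits).detach()  # teacher only supplies targets
    per_pixel = (student_prob - teacher_prob) ** 2         # pixel-wise MSE consistency
    # Low-confidence regions contribute little to the unsupervised loss.
    return (confidence * per_pixel).sum() / (confidence.sum() + 1e-8)
```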

Cited by 13 publications (19 citation statements). References 10 publications (22 reference statements).
“…By controlling for all other variables in Eq. (14), we compare the JA index with λ varying from 1 to 8. As shown in Table 3, setting the trade-off weight λ to 3 gives the best results on the ISIC2017 dataset.…”
Section: Hyperparameter Selection
confidence: 99%
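This citation statement refers to a trade-off weight λ that balances a supervised term and an unsupervised term in the citing paper's objective. The snippet below is a generic sketch of such a sweep, assuming the common form L_total = L_sup + λ · L_unsup; the citing paper's actual Eq. (14) and its training/evaluation pipeline are not reproduced here, and `train_and_eval_ja` is a placeholder.

```python
def total_loss(l_sup: float, l_unsup: float, lam: float) -> float:
    """Combined objective L_total = L_sup + λ · L_unsup (generic form)."""
    return l_sup + lam * l_unsup

def select_lambda(train_and_eval_ja):
    """Grid-search λ ∈ {1, ..., 8} and keep the value with the best
    validation JA index. `train_and_eval_ja(lam)` is a placeholder that
    trains with the given λ and returns the JA score."""
    return max(range(1, 9), key=train_and_eval_ja)
```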
“…Yu et al. [52] extend the mean teacher paradigm with an uncertainty estimation strategy based on Monte Carlo dropout [53]. Xie et al. [54] add a confidence-aware module to learn the model confidence under the guidance of labeled data. Luo et al. [55], [56] calculate uncertainty from pyramid predictions in one forward pass and propose a multi-level uncertainty-rectified pyramid consistency regularization.…”
Section: Unsupervised Regularization With Consistency Learning
confidence: 99%
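The Monte Carlo dropout strategy mentioned above estimates uncertainty by keeping dropout active at inference and running several stochastic forward passes. The following is a minimal sketch of that general technique for a binary segmentation model, assuming a PyTorch model containing dropout layers; it is not the implementation of any of the cited papers.

```python
import torch

@torch.no_grad()
def mc_dropout_uncertainty(model, x, n_samples: int = 8):
    """Run several stochastic forward passes with dropout enabled and
    use the predictive entropy of the mean probability as an uncertainty map."""
    model.train()  # keeps dropout stochastic; in practice only dropout layers
                   # (not batch norm) should be switched to train mode
    probs = torch.stack([torch.sigmoid(model(x)) for _ in range(n_samples)])
    mean_prob = probs.mean(dim=0)
    entropy = -(mean_prob * torch.log(mean_prob + 1e-8)
                + (1 - mean_prob) * torch.log(1 - mean_prob + 1e-8))
    return mean_prob, entropy
```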
“…Finally, we apply (3) to integrate the teacher model with more diverse "students" generated by (9). Notice that we apply (9) and (3) to the whole parameter set θ, which could increase the quality of the consistency target f(x, ν) in (5).…”
Section: Model Optimization Process
confidence: 99%
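The excerpt describes integrating student parameters into the teacher over the whole parameter set θ, which in mean-teacher methods is typically an exponential moving average (EMA) update. The sketch below shows that generic update; the citing paper's specific Eqs. (3) and (9) are not reproduced, and the function and argument names are assumptions.

```python
import torch

@torch.no_grad()
def ema_update(teacher, student, alpha: float = 0.99):
    """Generic mean-teacher update over the whole parameter set θ:
    θ_teacher ← α · θ_teacher + (1 − α) · θ_student."""
    for t_param, s_param in zip(teacher.parameters(), student.parameters()):
        t_param.mul_(alpha).add_(s_param, alpha=1 - alpha)
```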
“…Deep neural networks leveraging a large number of labeled data have achieved remarkable performance on computer vision [1,2] and natural language processing [3,4] applications in recent years. However, labeling numerous data manually is extremely expensive or even impossible for many practical applications (e.g., medical image segmentation [5], destructive product testing [6]). In these situations, deep learning models are prone to overfitting and generalize poorly on new data because the labeled training data are scarce.…”
Section: Introduction
confidence: 99%