2021
DOI: 10.48550/arxiv.2101.10979
Preprint

Prototypical Pseudo Label Denoising and Target Structure Learning for Domain Adaptive Semantic Segmentation

Abstract: Self-training is a competitive approach in domain adaptive segmentation, which trains the network with pseudo labels on the target domain. However, the pseudo labels are inevitably noisy and the target features are dispersed due to the discrepancy between the source and target domains. In this paper, we rely on representative prototypes, the feature centroids of classes, to address these two issues for unsupervised domain adaptation. In particular, we take one step further and exploit the feature distances from …
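The abstract sketches the core idea: class prototypes (feature centroids) are used to weight, and thereby denoise, target-domain pseudo labels. A minimal PyTorch sketch of that idea follows. It is an illustration under stated assumptions, not the authors' implementation; the function names, the softmax-over-distances weighting, and the temperature parameter are all hypothetical.

```python
# Illustrative sketch only -- not the paper's code. Assumes target features
# of shape [N, D] and hard pseudo labels of shape [N]; the softmax-over-
# distances weighting and the temperature are hypothetical choices.
import torch

def class_prototypes(features: torch.Tensor, pseudo: torch.Tensor,
                     num_classes: int) -> torch.Tensor:
    """Per-class feature centroids (prototypes), shape [C, D]."""
    protos = torch.zeros(num_classes, features.size(1))
    for c in range(num_classes):
        mask = pseudo == c
        if mask.any():
            protos[c] = features[mask].mean(dim=0)
    return protos

def denoising_weights(features: torch.Tensor, pseudo: torch.Tensor,
                      protos: torch.Tensor,
                      temperature: float = 1.0) -> torch.Tensor:
    """Soft weight per sample: softmax over negative prototype distances,
    read off at the pseudo-labelled class. Samples lying far from their
    class prototype (likely noisy labels) receive low weight."""
    dists = torch.cdist(features, protos)              # [N, C] distances
    soft = torch.softmax(-dists / temperature, dim=1)  # [N, C] affinities
    return soft.gather(1, pseudo.unsqueeze(1)).squeeze(1)  # [N]

# Usage: down-weight the self-training loss of suspect pseudo labels.
feats = torch.randn(8, 16)
labels = torch.randint(0, 4, (8,))
weights = denoising_weights(feats, labels, class_prototypes(feats, labels, 4))
```

The design choice here is a soft weight rather than a hard filter, so no target sample is discarded outright; how the paper actually combines distances and predictions is truncated in the abstract above.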

Cited by 14 publications (40 citation statements) | References 67 publications

“…We use co-learning and integrate multiple fusion strategies to resist the noisy pseudo labels, as well as to retain informative samples. Experimental results show that the proposed MFA marginally surpasses [21,22].…”
Section: Related Work
confidence: 99%
“…In these works, the labels are all available, which is different from the unsupervised domain adaptation problem. We note that [21,22] also consider the noise issue of pseudo labels on semantic segmentation adaptation. [22] estimates the uncertainty of predictions and reduces the impact of low-confidence samples during pseudo label learning.…”
Section: Related Work
confidence: 99%
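The statement above describes [22] only at a high level: estimate prediction uncertainty and reduce the impact of low-confidence samples during pseudo label learning. Purely as a hedged illustration of that general idea, not [22]'s actual method, a confidence-thresholded self-training loss could look like the sketch below; the threshold value and function name are hypothetical.

```python
# Hedged illustration of confidence-filtered pseudo label learning, not the
# method of [22]. Predictions below a (hypothetical) confidence threshold
# are excluded from the self-training cross-entropy loss.
import torch
import torch.nn.functional as F

def confident_pseudo_label_loss(logits: torch.Tensor,
                                threshold: float = 0.9) -> torch.Tensor:
    """logits: [N, C] target-domain predictions."""
    probs = logits.softmax(dim=1)
    conf, pseudo = probs.max(dim=1)   # per-sample confidence + hard label
    keep = conf >= threshold          # mask out low-confidence samples
    if not keep.any():
        return logits.new_zeros(())   # nothing confident enough this batch
    return F.cross_entropy(logits[keep], pseudo[keep])
```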