2019 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)
DOI: 10.1109/cvpr.2019.00261

Taking a Closer Look at Domain Shift: Category-Level Adversaries for Semantics Consistent Domain Adaptation

Abstract: We consider the problem of unsupervised domain adaptation in semantic segmentation. A key step is reducing the domain shift, i.e., enforcing the data distributions of the two domains to be similar. One common strategy is to align the marginal distributions in the feature space through adversarial learning. However, this global alignment strategy does not consider the category-level joint distribution. A possible consequence of such global movement is that some categories which are …
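The global alignment strategy the abstract refers to can be illustrated with a domain discriminator trained against the feature extractor via a gradient-reversal layer. This is a minimal generic sketch, not the paper's actual method; all module names and sizes are illustrative.

```python
import torch
import torch.nn as nn

class GradReverse(torch.autograd.Function):
    """Identity in the forward pass; negates gradients in the backward pass,
    so the feature extractor is pushed to *fool* the discriminator."""
    @staticmethod
    def forward(ctx, x, lam):
        ctx.lam = lam
        return x.view_as(x)

    @staticmethod
    def backward(ctx, grad_output):
        return -ctx.lam * grad_output, None

class DomainDiscriminator(nn.Module):
    """Predicts which domain (source=0, target=1) a feature map came from."""
    def __init__(self, in_ch=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(in_ch, 32, 3, padding=1), nn.LeakyReLU(0.2),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(32, 1))

    def forward(self, feats, lam=1.0):
        return self.net(GradReverse.apply(feats, lam))

# One adversarial step on dummy features: the discriminator separates the
# domains; reversed gradients would align the feature distributions globally.
bce = nn.BCEWithLogitsLoss()
disc = DomainDiscriminator()
src_feat = torch.randn(2, 64, 8, 8)  # stand-in for source-image features
tgt_feat = torch.randn(2, 64, 8, 8)  # stand-in for target-image features
loss = bce(disc(src_feat), torch.zeros(2, 1)) + \
       bce(disc(tgt_feat), torch.ones(2, 1))
```

Because this aligns only the marginal feature distribution, nothing prevents pixels of one category from being mapped onto another — the category-level problem the paper targets.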

Cited by 708 publications (553 citation statements)
References 49 publications
“…We compare the proposed Py-CDA with several state-of-the-art methods. A majority of them uses adversarial training to bring closer the source and target domains on the feature level (ROAD [4]), on both features and pixels (FCAN [40], CyCADA [17]), on the output maps (OutputAdapt [35], CLAN [26]), and combining adversarial learning with entropy minimization (AD-VENT [36]). In contrast, our PyCDA approach, along with CDA [38], and ST [43], adapts the neural networks by posterior regularization instead.…”
Section: Methods
confidence: 99%
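The statement above notes that some compared methods (e.g., ADVENT) combine adversarial learning with entropy minimization. As a rough illustration of that second ingredient, here is a per-pixel entropy loss in NumPy; the function name and shapes are assumptions, not code from any of the cited papers.

```python
import numpy as np

def entropy_loss(probs, eps=1e-8):
    """Mean per-pixel Shannon entropy of softmax outputs.

    probs: (H, W, C) class probabilities. Minimizing this on target
    images pushes predictions toward confident, low-entropy labelings.
    """
    ent = -(probs * np.log(probs + eps)).sum(axis=-1)
    return ent.mean()

uniform = np.full((1, 1, 4), 0.25)                # maximally uncertain pixel
peaked = np.array([[[0.97, 0.01, 0.01, 0.01]]])   # confident pixel
assert entropy_loss(peaked) < entropy_loss(uniform)
```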
“…In this work, we also view the self-training [43] as a curriculum-style domain adaptation method. The other line of work is to reduce the domain shift in feature space or output space and tries to seek a better way to align both domains in an intermediate layer [4,18,40,35,27,42,5,17,19,33,26]. In [18] Figure 2: Overview of our self-motivated pyramid curriculum domain adaptation (PyCDA) approach to segmentation.…”
Section: Related Work
confidence: 99%
“…However, anatomy-consistency is not always guaranteed without explicitly enforcing semantic consistency on content space. As for feature-level adaptation, while it seems effective for tasks like classification, it is unclear how well it might scale to dense structured domain adaptation [19,16].…”
Section: Related Work
confidence: 99%