2019 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)
DOI: 10.1109/cvpr.2019.00235
Blending-Target Domain Adaptation by Adversarial Meta-Adaptation Networks

Abstract: Figure 1. Comparison of the MTDA and BTDA setups (orange and blue denote source and target). In MTDA (a), target domains are explicitly separated and we are informed of which target an unlabeled sample originates from. In BTDA (b), sub-target IDs are hidden. If we treat them as a combined single target, transfer learning leads to adaptation on a multi-target mixture instead of each hidden target (gray distribution curves in (a), (b)). This implies category-shifted adaptation and negative transfer…
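The abstract's core point, that aligning to a blended target matches the mixture distribution rather than any hidden sub-target, can be shown with a toy numeric sketch (all values hypothetical, not from the paper):

```python
# Toy illustration (hypothetical numbers, not from the paper): adapting to a
# blended target matches the mixture distribution, not either hidden sub-target.
def mean(xs):
    return sum(xs) / len(xs)

target_a = [1.0, 1.2, 0.8]     # samples from hidden sub-target A
target_b = [5.0, 5.3, 4.7]     # samples from hidden sub-target B

blended = target_a + target_b  # BTDA: sub-target IDs are hidden, so we see one pool
mixture_mean = mean(blended)   # what single-target adaptation would align to

print(mean(target_a), mean(target_b), mixture_mean)  # 1.0 5.0 3.0
```

The mixture statistic (3.0) matches neither sub-target (1.0 or 5.0), which is the category-shift / negative-transfer risk the figure caption describes.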


Cited by 78 publications (70 citation statements)
References 40 publications
“…The C-MNIST data set is originally built by Lu et al. Table 2 summarizes the performance comparisons on C-MNIST. The results clearly show that our methods significantly outperform the direct learning method, which proves the effectiveness of our…” [Numeric entries from Table 2 (rows for MCD-SWD [24], ML-VAE [3], DADA [48], BTDA [8], and JiGen [5]) spilled into this excerpt and are omitted.]
Section: Handwritten Digit Experiments (mentioning)
Confidence: 66%
“…Recently, several domain-agnostic learning approaches [48,8,36,49] have emerged to handle the domain-adaptation problem in which the target domain may contain several sub-domains without domain labels [48]. DADA [48] and OCDA [36] propose novel mechanisms and achieve effective domain-adaptation performance, but they neither discover latent domains within the target domain nor exploit that information.…”
Section: Domain Agnostic Learning (mentioning)
Confidence: 99%
“…Although single-target domain adaptation methods suffice when a well-defined target domain exists, target domains in the wild often consist of diverse distributions [11]. We therefore further validate our method in a multi-target domain adaptation setting [2], [11], which has received comparatively little attention. We aim to adapt to multiple target domains simultaneously with a single adaptation training run.…”
Section: Methods (mentioning)
Confidence: 90%
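The setup this quote describes, a single joint training run that adapts one shared model to several target domains at once, can be sketched with a toy scalar model (all names and values here are illustrative assumptions, not from the cited work):

```python
# Minimal sketch (hypothetical names and values): one-time adaptation training
# that updates a single shared parameter against several target domains in one
# joint run, rather than launching one adaptation run per target.
def adapt_step(weight, source_mean, target_mean, lr=0.1):
    # Nudge the shared parameter toward the source/target midpoint.
    return weight + lr * ((source_mean + target_mean) / 2.0 - weight)

source_mean = 0.0
targets = {"clipart": 2.0, "sketch": 4.0, "painting": 6.0}  # per-target statistics

weight = 0.0
for _ in range(100):                 # one joint training run...
    for t_mean in targets.values():  # ...visits every target each epoch
        weight = adapt_step(weight, source_mean, t_mean)

# The shared parameter settles near the average source/target midpoint
# (roughly 2.0 here) instead of chasing any single target.
print(round(weight, 2))
```

The design point is that all targets share one set of parameters and one training schedule, which is what distinguishes this setting from running single-target adaptation once per target.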
“…When we cannot access the target data due to privacy issues [47], [53], [60], generalizing to an unseen domain is also necessary. We also experimented on a multi-target domain adaptation setting [2], [11], which aims to adapt to multiple target domains simultaneously, and a multi-auxiliary domain generalization setting [62], which aims to generalize to an unseen test domain by utilizing knowledge from auxiliary domains.…”
Section: Introduction (mentioning)
Confidence: 99%