2020
DOI: 10.48550/arxiv.2007.12684
Preprint
Deep Co-Training with Task Decomposition for Semi-Supervised Domain Adaptation

Abstract: Semi-supervised domain adaptation (SSDA) aims to adapt models from a labeled source domain to a different but related target domain, from which unlabeled data and a small set of labeled data are provided. In this paper, we propose a new approach for SSDA, which explicitly decomposes SSDA into two subproblems: a semi-supervised learning (SSL) problem in the target domain and an unsupervised domain adaptation (UDA) problem across domains. We show that these two subproblems yield very different classifiers, …
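The co-training between the two decomposed subproblems can be illustrated with a minimal pseudo-label exchange step: each view's classifier hands its confident predictions on unlabeled target data to the other view. The function name and the 0.9 confidence threshold below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def cotrain_exchange(probs_ssl, probs_uda, threshold=0.9):
    """One sketched co-training exchange between an SSL-view and a
    UDA-view classifier. `probs_*` are (n_samples, n_classes) softmax
    outputs on the same unlabeled target batch; the 0.9 threshold is
    an illustrative choice."""
    conf_ssl = probs_ssl.max(axis=1) >= threshold
    conf_uda = probs_uda.max(axis=1) >= threshold
    # pseudo-labels the SSL view passes to the UDA view, and vice versa
    labels_for_uda = probs_ssl[conf_ssl].argmax(axis=1)
    labels_for_ssl = probs_uda[conf_uda].argmax(axis=1)
    return labels_for_uda, labels_for_ssl

probs_ssl = np.array([[0.95, 0.05], [0.60, 0.40]])
probs_uda = np.array([[0.20, 0.80], [0.05, 0.95]])
to_uda, to_ssl = cotrain_exchange(probs_ssl, probs_uda)
print(to_uda, to_ssl)  # only the confident predictions cross over
```

Only the first SSL prediction and the second UDA prediction clear the threshold, so each view receives exactly one pseudo-label here.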

Cited by 11 publications (27 citation statements)
References 55 publications
“…Compared with the competing approaches using ResNet34 as the backbone, CDAC also achieves the best results in all cases and surpasses the current best results by 6% and 4.3% in the settings of 1-shot and 3-shot. Note that "MiCo" proposed in [42] is an unpublished work concurrent with ours, and the average performance of our CDAC using ResNet34 as the backbone is 0.4% higher than "MiCo" under the 3-shot setting.…”
Section: Comparisons With the State-of-the-arts
confidence: 81%
“…Semi-supervised domain adaptation (SSDA) is a relatively promising form of transfer learning, which intends to leverage a small number of labeled samples (e.g., one or a few samples per class) in the target domain and exploit their full potential to greatly improve the performance of domain adaptation. SSDA has recently attracted wide attention [32,29,15,21,17,42] from researchers. [32,29] first proposed to solve SSDA by aligning the features from both domains by means of adversarial learning.…”
Section: Semi-supervised Domain Adaptation
confidence: 99%
“…Isobe et al (2021) employ a KL divergence term as part of their objective; however, their model is developed for the multi-target setting. Other relevant approaches in this area include combining semi-supervised learning and unsupervised domain adaptation to guide source classifiers (Yang et al 2020b), and using an attention mechanism to assign target data points to source domains (Cui and Bollegala 2020).…”
Section: Related Work
confidence: 99%
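The KL divergence term mentioned in the statement above has a standard closed form for discrete class distributions; a minimal sketch, with an epsilon guard against log(0) as an implementation assumption:

```python
import math

def kl_divergence(p, q, eps=1e-12):
    """KL(p || q) for discrete distributions given as lists of
    probabilities; `eps` guards against log(0)."""
    return sum(pi * math.log((pi + eps) / (qi + eps))
               for pi, qi in zip(p, q))

p = [0.7, 0.2, 0.1]
q = [0.5, 0.3, 0.2]
print(round(kl_divergence(p, q), 4))  # small but nonzero: p and q differ
print(round(kl_divergence(p, p), 4))  # 0.0: KL of a distribution with itself
```

Note that KL(p || q) is asymmetric, which is why objectives must fix which distribution plays the role of the target.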
“…While it also aims to reduce the data distribution mismatch, semi-supervised domain adaptation (SSDA), in contrast to UDA, bridges the domain discrepancy by introducing partially labeled target samples. Recently, a few deep learning methods [46,32,22,35] have been proposed for image classification. [46] decomposes SSDA into two sub-problems, UDA and SSL, and employs co-training [3] to exchange expertise between two classifiers, which are trained on MixUp-ed [48] data between the labeled and unlabeled data of each view.…”
Section: Related Work
confidence: 99%
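The MixUp interpolation referenced in the statement above forms a convex combination of two examples and their (soft) labels, with the mixing weight drawn from a Beta distribution. A minimal sketch; the alpha value and fixed seed are illustrative assumptions, not the paper's settings:

```python
import numpy as np

def mixup(x1, y1, x2, y2, alpha=0.75, rng=None):
    """Convex combination of two examples and their one-hot labels,
    as in MixUp; alpha=0.75 and the fixed seed are illustrative."""
    rng = rng or np.random.default_rng(0)
    lam = rng.beta(alpha, alpha)  # mixing weight in (0, 1)
    x = lam * x1 + (1 - lam) * x2
    y = lam * y1 + (1 - lam) * y2
    return x, y

x_mix, y_mix = mixup(np.ones(4), np.array([1.0, 0.0]),
                     np.zeros(4), np.array([0.0, 1.0]))
print(y_mix.sum())  # mixed label stays a valid distribution (sums to 1)
```

Because the label is mixed with the same weight as the input, the resulting soft label remains a valid probability distribution, which is what lets MixUp-ed labeled and unlabeled data share one training objective.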