2018
DOI: 10.48550/arxiv.1809.00852
Preprint
Multi-target Unsupervised Domain Adaptation without Exactly Shared Categories

Cited by 16 publications (19 citation statements)
References 17 publications
“…Another group of methods utilizes a model trained on source data to generate pseudo labels for the target data in a self-training (ST) manner. Instead of assuming that the target data come from a single domain, Yu et al [42] and Gholami et al [10] address a challenging multi-target UDA problem. However, these methods are not directly applicable to the FMTDA setting, since they assume centralized data on a server.…”
Section: Unsupervised Domain Adaptation
confidence: 99%
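The self-training idea described above can be illustrated with a minimal sketch: train a classifier on labeled source data, pseudo-label the target samples whose predicted confidence clears a threshold, and retrain on the union. This is a generic illustration under assumed toy data and a confidence threshold of 0.9, not the specific method of any cited paper.

```python
# Minimal sketch of pseudo-label self-training for unsupervised domain
# adaptation. Toy, illustrative data; requires scikit-learn and numpy.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Labeled source domain and an unlabeled, slightly shifted target domain.
Xs = rng.normal(0.0, 1.0, (200, 2))
ys = (Xs[:, 0] + Xs[:, 1] > 0).astype(int)
Xt = rng.normal(0.3, 1.0, (200, 2))  # covariate shift on the target

# 1) Train on source data only.
clf = LogisticRegression().fit(Xs, ys)

# 2) Iteratively pseudo-label confident target samples and retrain
#    on the union of source labels and target pseudo labels.
for _ in range(3):
    proba = clf.predict_proba(Xt)
    keep = proba.max(axis=1) > 0.9       # confidence threshold (assumed)
    pseudo = proba.argmax(axis=1)        # pseudo labels for target samples
    X_aug = np.vstack([Xs, Xt[keep]])
    y_aug = np.concatenate([ys, pseudo[keep]])
    clf = LogisticRegression().fit(X_aug, y_aug)

print(clf.score(Xs, ys))  # sanity check: accuracy on the labeled source
```

The multi-target variants discussed in the excerpt extend this single-target loop to several target domains at once; the FMTDA setting additionally forbids pooling `Xs` and `Xt` on one server, which is why the centralized recipe above does not carry over directly.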
“…The problem of domain mismatch among multiple clients, or target domains, is challenging even in a centralized training framework. Although some methods have been proposed for centralized multi-target DA [42,10], the best way to adapt to diverse domains in real-world scenarios remains unclear. To disentangle the challenges of multi-target DA and distributed training, we also look into the problem in a centralized setting.…”
Section: Centralized Multi-target Domain Adaptation
confidence: 99%
“…Multi-Source/Target UDA. Multi-source domain adaptation (MSDA) [21,36,45,26] and multi-target domain adaptation (MTDA) [44,12] consider UDA where either the source or the target domain consists of multiple subdomains. While these studies assume that the subdomain labels of all samples are available, others assume that they are unavailable in the source [20] or target domain (blending-target domain adaptation: BTDA) [7,27].…”
Section: Related Work
confidence: 99%
“…According to the number of source and target domains, UDA methods can be grouped into three scenarios: a single source domain with a single target domain (1S1T) [23], [24], [25], [26], a single source domain with multiple target domains (1SmT) [27], and multiple source domains with a single target domain (mS1T), as shown in Fig. 1.…”
Section: Classifier-level Adaptation
confidence: 99%
“…To this end, the 1SmT modeling strategy has been proposed. One representative method is PA-1SmT [27], which achieves UDA by transferring knowledge from the source domain to each of the target domains. However, it does not consider the distribution shift when aligning the target domains to the source.…”
Section: Classifier-level Adaptation
confidence: 99%