2020
DOI: 10.48550/arxiv.2006.07776
Preprint
Domain Adaptation and Image Classification via Deep Conditional Adaptation Network

Abstract: Unsupervised domain adaptation aims to generalize a supervised model trained on a source domain to an unlabeled target domain. Marginal distribution alignment of feature spaces is widely used to reduce the domain discrepancy between the source and target domains. However, it assumes that the source and target domains share the same label distribution, which limits its application scope. In this paper, we consider a more general application scenario where the label distributions of the source and target dom…

Cited by 2 publications (4 citation statements)
References 43 publications
“…For quantitative assessment of the proposed strong-weak integrated semi-supervision (SWISS) method, we evaluate it on two popular benchmarks Office-Home [23], DomainNet [2]; and, we compare our proposed method with state-of-the-art methods, namely, DADA [20], CDAN+BSP [16], MCC [22], DCAN [17], SHOT-IM [12], and the most recent FixBi [18], SCDA [19], AANet [4]. We employ the pre-trained ResNet-50 or ResNet-101 as the feature generator G as was done in [12,24,17,21]. For the classifier F, we use a linear layer with the input and output dimensions being (1024, k), where k is the number of classes.…”
Section: Methods
confidence: 99%
“…Baselines. For single-target domain adaptation, we compare with several classical and state-of-the-art methods, namely, the classical DANN [39], MSTN [40], ADDA [41], MCD [42], RTN [43], JAN [44], CDAN+BSP [45], CAN [46], MDD [47], DCAN [48], SHOT-IM [29], MIMTFL [49], and the most recent FixBi [23], ATDOC [22], SCDA [50], AANet [7]. For multi-target domain adaptation, we compare with three recent works MT-MTDA [12], D-CGCT [14], and HGAN [9].…”
Section: Experiments A. Setup
confidence: 99%
“…To address the domain shift problem, unsupervised domain adaptation (UDA) methods, which use only data on the source domain without data on the target domain, have been proposed [9,10,12,13,15,25-27]. Most are designed for classification tasks but not for detection tasks.…”
Section: Introduction
confidence: 99%