2020
DOI: 10.1109/access.2020.3012152

Cross Domain Mean Approximation for Unsupervised Domain Adaptation

Abstract: Unsupervised Domain Adaptation (UDA) aims to leverage knowledge from a labeled source domain to help the task in a target domain whose data are unlabeled. Minimizing the cross-domain distribution divergence is a key step in UDA. In this paper, we first propose a novel discrepancy metric, referred to as Cross Domain Mean Approximation (CDMA) discrepancy, to evaluate the distribution differences between the source and target domains, which calculates the sum of the squares of the distances from the source a…


Cited by 18 publications (19 citation statements)
References 57 publications
“…Here, inspired by [35], we introduce cross-domain mean approximation to replace the prediction loss. When there are no labeled samples in the target domain (when c = 0), we force the target data H_T close to the source data mean point H_S_av, which promotes domain adaptation, as seen in [35]. If a target sample obtains a pseudo label, it is drawn to the source data mean point of the same category, H_S_av(c).…”
Section: Proposed Methods
Mentioning confidence: 99%
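The mechanism quoted above (each target sample pulled toward the overall source mean point, or toward the class-conditional source mean once a pseudo label is available) can be sketched in a few lines. This is a minimal illustration, not the authors' implementation: the function name `cdma_loss`, the convention of label `0` meaning "no pseudo label yet", and the plain NumPy feature matrices are all assumptions made for the sketch.

```python
import numpy as np

def cdma_loss(H_T, pseudo_labels, H_S, y_S):
    """Sketch of a cross-domain mean approximation loss.

    H_T : (n_t, d) target features; H_S : (n_s, d) source features.
    pseudo_labels : target pseudo labels (0 = not yet assigned).
    y_S : source labels.
    Each target sample contributes the squared distance to the
    overall source mean (if unlabeled) or to the mean of source
    samples of its pseudo class.
    """
    H_S_av = H_S.mean(axis=0)  # overall source data mean point
    loss = 0.0
    for h, c in zip(H_T, pseudo_labels):
        if c == 0:
            # no pseudo label: pull toward the global source mean
            target_mean = H_S_av
        else:
            # pseudo label available: pull toward the class mean H_S_av(c)
            target_mean = H_S[y_S == c].mean(axis=0)
        loss += np.sum((h - target_mean) ** 2)
    return loss / len(H_T)
```

As a sanity check, a target point sitting exactly on the relevant source mean incurs zero loss, and the loss grows with the squared distance to that mean.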
“…Hence, we can learn the other DAELM. It is worth emphasizing that we present a target cross-domain mean approximation, referring to [35], to adapt the distribution of the target domain for consistency with the source domains. At the prediction stage, the above two DAELMs jointly determine the category of test samples.…”
Section: Introduction
Mentioning confidence: 99%
“…In transfer learning, domain adaptation accelerates the cross-domain transfer of knowledge by minimizing the discrepancy between domains. According to “how to correct interdomain distribution mismatch,” domain adaptation can be roughly divided into three categories: sample weighting, subspace and manifold alignment, and statistical distribution alignment [ 33 ].…”
Section: Related Work
Mentioning confidence: 99%
“…Domain adaptation [31–33], as an important branch of transfer learning, solves the above problems with the help of knowledge from a source domain that is different from but related to the target domain, and resolves the inconsistency of sample distribution between the source and target domains. Zhang and Zhang [34] extended ELM to handle domain adaptation problems with very few labeled guide samples in the target domain and to overcome the generalization disadvantages of ELM in multidomain applications.…”
Section: Introduction
Mentioning confidence: 99%
“…It should be pointed out that the methods mentioned above [8], [10], [23], [24] ignore the fact that samples after projection may not be discriminative enough for the final classification. Moreover, in UDA, the labels of the target domain are not available [25].…”
Section: Introduction
Mentioning confidence: 99%