2016 23rd International Conference on Pattern Recognition (ICPR)
DOI: 10.1109/icpr.2016.7899859
Adapting instance weights for unsupervised domain adaptation using quadratic mutual information and subspace learning

Cited by 25 publications (13 citation statements) | References 11 publications
“…Based on instance transfer, the majority of the literature utilized a measurement method to evaluate the similarity between data from the source and target domains. The similarity metric was then converted into a transfer weight coefficient, which was used directly for instance transfer by re-weighting the source domain data [30][31][32]. Herein, we have listed a few typical methods based on instance transfer.…”
Section: Transfer Learning Based On Instance Knowledge (mentioning, confidence: 99%)
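The re-weighting idea described in this statement can be illustrated with a short sketch: each source sample receives a transfer weight proportional to its similarity to the target data, and a weighted classifier is then trained on the source domain. The similarity rule (mean RBF kernel similarity) and all names below are illustrative assumptions, not the specific procedures of the cited works [30]-[32].

```python
# Minimal sketch of instance re-weighting for transfer learning, assuming an
# RBF-kernel similarity to the target sample as the weight rule. Names
# (rbf_similarity_weights, X_s, X_t) are illustrative, not from the cited papers.
import numpy as np
from sklearn.metrics.pairwise import rbf_kernel
from sklearn.linear_model import LogisticRegression

def rbf_similarity_weights(X_source, X_target, gamma=1.0):
    """Weight each source sample by its mean RBF similarity to the target samples."""
    K = rbf_kernel(X_source, X_target, gamma=gamma)  # shape (n_source, n_target)
    w = K.mean(axis=1)                               # average similarity per source sample
    return w / w.sum() * len(w)                      # normalize so the mean weight is 1

# Toy data: source and target distributions differ by a mean shift.
rng = np.random.default_rng(0)
X_s = rng.normal(0.0, 1.0, size=(200, 2))
y_s = (X_s[:, 0] + X_s[:, 1] > 0).astype(int)
X_t = rng.normal(0.5, 1.0, size=(100, 2))            # unlabeled target domain

weights = rbf_similarity_weights(X_s, X_t, gamma=0.5)
clf = LogisticRegression().fit(X_s, y_s, sample_weight=weights)
```

Source samples that resemble the target domain thus dominate training, which is the common thread of the instance-transfer methods the statement surveys.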
“…To make efficient use of the similarity with the target domain, some source domain data samples are reused according to weight generation rules to carry out the transfer learning [21][22][23][24]. The instance-based weighting method is well grounded theoretically and is easy to derive and use.…”
Section: Instance-Based Transfer Learning (mentioning, confidence: 99%)
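One widely used weight-generation rule of the kind referred to above estimates the density ratio p_target(x)/p_source(x) with a domain discriminator and uses it as the sample weight. The sketch below assumes that rule; the function name and the clipping threshold are illustrative choices, not taken from the cited references [21]-[24].

```python
# Hedged sketch of a weight-generation rule: estimate the density ratio
# p_target(x)/p_source(x) with a domain classifier and re-weight source samples.
import numpy as np
from sklearn.linear_model import LogisticRegression

def density_ratio_weights(X_source, X_target, clip=10.0):
    """Importance weights w(x) ~ P(target|x)/P(source|x) from a domain classifier."""
    X = np.vstack([X_source, X_target])
    d = np.concatenate([np.zeros(len(X_source)), np.ones(len(X_target))])  # 0 = source, 1 = target
    disc = LogisticRegression(max_iter=1000).fit(X, d)
    p_t = disc.predict_proba(X_source)[:, 1]
    w = p_t / np.clip(1.0 - p_t, 1e-6, None)   # odds ratio approximates the density ratio
    return np.clip(w, 0.0, clip)               # clip extreme weights to control variance

# Toy check: two Gaussian samples with a mean shift.
rng = np.random.default_rng(1)
w = density_ratio_weights(rng.normal(0.0, 1.0, (300, 2)), rng.normal(0.7, 1.0, (150, 2)))
```

Any learner that accepts per-sample weights can then be trained on the source data with these weights, e.g. clf.fit(X_source, y_source, sample_weight=w).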
“…Therefore, it is a more challenging task compared with the semi-supervised scenario. Multiple UDA methods have been proposed [19], [20], [21], [22], [23], [24], [25] in recent years and have achieved promising results. These methods can be grouped into three categories: 1) source-target divergence metric-based methods, 2) generalization extension-based methods, and 3) constant term-based methods [26].…”
Section: A. Unsupervised Domain Adaptation (mentioning, confidence: 99%)
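For the first of these categories, a representative source-target divergence metric is the maximum mean discrepancy (MMD) between source and target feature distributions. The sketch below computes a biased MMD estimate under an RBF kernel; it illustrates the metric only, and the names are assumptions rather than the formulation of any specific cited method.

```python
# Sketch of a source-target divergence metric: biased MMD^2 under an RBF kernel.
import numpy as np
from sklearn.metrics.pairwise import rbf_kernel

def mmd_rbf(X_source, X_target, gamma=1.0):
    """Biased estimate of squared maximum mean discrepancy between two samples."""
    K_ss = rbf_kernel(X_source, X_source, gamma=gamma)
    K_tt = rbf_kernel(X_target, X_target, gamma=gamma)
    K_st = rbf_kernel(X_source, X_target, gamma=gamma)
    return K_ss.mean() + K_tt.mean() - 2.0 * K_st.mean()

# Toy check: a smaller value means the two feature distributions are closer.
rng = np.random.default_rng(2)
print(mmd_rbf(rng.normal(0.0, 1.0, (200, 3)), rng.normal(0.5, 1.0, (200, 3)), gamma=0.5))
```

Divergence metric-based UDA methods typically learn a feature transformation or re-weighting that drives such a quantity toward zero while preserving source discriminability.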