2021
DOI: 10.1109/tnnls.2020.3017213

Open Set Domain Adaptation: Theoretical Bound and Algorithm

Abstract: The aim of unsupervised domain adaptation is to leverage the knowledge in a labeled (source) domain to improve a model's learning performance in an unlabeled (target) domain, the basic strategy being to mitigate the effects of discrepancies between the two distributions. Most existing algorithms can only handle unsupervised closed set domain adaptation (UCSDA), i.e., where the source and target domains are assumed to share the same label set. In this paper, we target a more challenging but realistic setting:…

Cited by 142 publications (57 citation statements)
References 47 publications
“…It is important to construct the transfer error rate $\mathrm{err}(h, T_s, T_t)$ and the heterogeneous space alignment $d(h, T_s, T_t)$. Motivated by [50], [52], in our kernel-based algorithm $\mathrm{err}(h, T_s, T_t) = R_s(h \circ T_s) + \rho\, D_h^2\big(P_{T_t(X_t)}, P_{T_s(X_s)}\big)$,…”
Section: Bringing SSHEDA Theory Into Reality
confidence: 99%
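The transfer error quoted above pairs a source risk with a kernel-based distribution discrepancy. A minimal numerical sketch of that two-term structure, assuming a linear scorer and a linear kernel so the discrepancy term reduces to the squared distance between projected feature means (all names here are hypothetical stand-ins, not the paper's exact construction):

```python
import numpy as np

def transfer_error(h, Xs, ys, Xt, rho=1.0):
    """Sketch of err = R_s(h ∘ T_s) + rho * D^2 between projected domains.

    h      : weight vector of a linear scorer (stand-in for h ∘ T).
    Xs, ys : labeled source features and {0, 1} labels.
    Xt     : unlabeled target features.
    """
    # Source risk R_s: logistic loss of the linear scorer on the source domain.
    scores = Xs @ h
    source_risk = np.mean(np.log1p(np.exp(-(2 * ys - 1) * scores)))

    # Linear-kernel discrepancy: squared distance between projected means,
    # a crude stand-in for the squared-MMD-style term D_h^2.
    discrepancy = (np.mean(Xs @ h) - np.mean(Xt @ h)) ** 2

    return source_risk + rho * discrepancy
```

When source and target coincide, the discrepancy term vanishes and only the source risk remains; shifting the target inflates the penalty.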
“…Then, to preserve the domains' geometric structures, such as manifold structure and clustering structure, manifold regularization [40] is considered in KHDA. Many kernel-based DA algorithms [50], [52], [56] have studied manifold regularization and shown that it can help to improve transfer performance. One can write the manifold regularizations $M_1(h^*, T_s, T_t)$, $M_2(h, T_t, T_t)$ as follows:…”
Section: Loss Function in KHDA
confidence: 99%
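Manifold regularization of the kind cited here typically penalizes a predictor for varying across edges of a similarity graph, using the Laplacian identity $f^{\top} L f = \tfrac{1}{2}\sum_{ij} W_{ij}(f_i - f_j)^2$. A minimal sketch with an RBF graph (an illustrative assumption, not the paper's exact $M_1$, $M_2$ terms):

```python
import numpy as np

def manifold_regularizer(X, f, sigma=1.0):
    """Graph-Laplacian smoothness penalty for predictions f on points X.

    Builds a dense RBF similarity graph W, forms L = D - W, and returns
    f^T L f, which equals 0.5 * sum_ij W_ij (f_i - f_j)^2.
    """
    sq_dists = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    W = np.exp(-sq_dists / (2 * sigma ** 2))
    np.fill_diagonal(W, 0.0)            # no self-edges
    L = np.diag(W.sum(axis=1)) - W      # unnormalized graph Laplacian
    return f @ L @ f
```

Predictions that agree on nearby (high-similarity) points incur a small penalty, which is exactly the "preserve the geometry" effect the quoted passage describes.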
“…Unsupervised Domain Adaptation. Unsupervised domain adaptation (UDA) has recently gained considerable interest in many practical applications (Shao, Zhu, and Li 2014; Hoffman et al. 2014, 2018; Ghafoorian et al. 2017; Kamnitsas et al. 2017; Wang and Zheng 2015; Blitzer, McDonald, and Pereira 2006; Fang et al. 2021b; Dong et al. 2021b). UDA aims to learn a model on data from the labeled source domain and transfer the learned information to a new unlabeled domain under distribution shift (Pan and Yang 2009). The key to the success of UDA is to learn a latent domain-invariant representation by minimizing the difference between the two domains (i.e., the domain discrepancy) with certain criteria, such as maximum mean discrepancy (Pan et al. 2010), Kullback-Leibler divergence (Zhuang et al. 2015), central moment discrepancy (Zellinger et al. 2017), and Wasserstein distance (Lee and Raginsky 2017).…”
Section: Related Work
confidence: 99%
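Maximum mean discrepancy, the first criterion listed in the quoted passage, compares two samples through mean pairwise kernel similarities. A minimal sketch of the biased squared-MMD estimator with an RBF kernel (a generic illustration, not the exact formulation of Pan et al. 2010):

```python
import numpy as np

def mmd2_rbf(X, Y, sigma=1.0):
    """Biased estimator of squared MMD between samples X and Y (RBF kernel)."""
    def kernel(A, B):
        sq = np.sum((A[:, None, :] - B[None, :, :]) ** 2, axis=-1)
        return np.exp(-sq / (2 * sigma ** 2))
    # E[k(x, x')] - 2 E[k(x, y)] + E[k(y, y')]
    return kernel(X, X).mean() - 2 * kernel(X, Y).mean() + kernel(Y, Y).mean()
```

Identical samples give a squared MMD of zero, and the value grows as the two distributions drift apart, which is why minimizing it encourages a domain-invariant representation.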
“…These data belong to unknown classes that we are not interested in, and they are usually expected to be recognised as one unified class so as to discriminate them from the known classes. This problem has been studied in the literature as the open set domain adaptation problem (Saito et al. 2018; Liu et al. 2019; Bucci, Loghmani, and Tommasi 2020; Luo et al. 2020; Fang et al. 2020).…”
Section: Introduction
confidence: 99%