2019
DOI: 10.1109/access.2019.2958736

A Kernelized Unified Framework for Domain Adaptation

Abstract: The performance of supervised learning algorithms such as k-nearest neighbor (k-NN) depends on labeled data. For some applications (Target Domain), obtaining such labeled data is very expensive and labor-intensive. In a real-world scenario, some other related application (Source Domain) is usually available with sufficient labeled data. However, there is a distribution discrepancy between the source domain and the target domain application data, as the background of collecting both t…

Citations: Cited by 24 publications (16 citation statements).
References: 32 publications.
“…When searching for a new feature space for both domains, the searched space for the target domain may be an irrelevant feature space [21]. Therefore, to ensure that the searched space is the relevant feature space, we need to maximize the target domain variance.…”
Section: B. Maximization of Target Domain Variance (TV)
Citation type: mentioning
confidence: 99%
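For context, a minimal sketch of what target-domain variance maximization typically looks like in projection-based adaptation methods of this family; the symbols (target data matrix X_t with samples as columns, projection matrix A, centering matrix H_t) are assumptions for illustration, not taken from the quoted text:

\[
\max_{A}\ \operatorname{tr}\!\left(A^{\top} X_t H_t X_t^{\top} A\right),
\qquad
H_t = I_{n_t} - \tfrac{1}{n_t}\,\mathbf{1}\mathbf{1}^{\top}.
\]

Maximizing this trace keeps the projected target samples well spread out, which is how the searched space is kept relevant to the target domain.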
“…• ILS [39]: This method uses Riemannian optimization techniques to match the first- and second-order statistics of both domains in the latent space. • KUFDA [21]: KUFDA improves JGSA by incorporating the original geometric structure of the data with a robust Laplacian term. • UnPSO [47]: Using the PSO algorithm, UnPSO selects a good subset of features across both domains and uses MMD to minimize the distribution gap between the domains.…”
Section: E. Comparison With Other Baseline Approaches
Citation type: mentioning
confidence: 99%
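As an illustration of the MMD criterion mentioned for UnPSO, the Python sketch below computes the simplest (linear-kernel) empirical MMD, i.e. the squared distance between the domain means. It is a generic illustration under assumed array shapes, not the exact objective of any of the cited methods:

```python
import numpy as np

def linear_mmd(Xs, Xt):
    """Empirical linear-kernel MMD between source and target samples.

    Xs: (n_s, d) source features; Xt: (n_t, d) target features.
    Returns the squared distance between the two domain means, the
    quantity that MMD-based adaptation methods try to minimize.
    """
    diff = Xs.mean(axis=0) - Xt.mean(axis=0)
    return float(diff @ diff)

# Toy usage: two Gaussian clouds whose means are shifted by 0.5.
rng = np.random.default_rng(0)
Xs = rng.normal(loc=0.0, scale=1.0, size=(100, 5))
Xt = rng.normal(loc=0.5, scale=1.0, size=(80, 5))
print(linear_mmd(Xs, Xt))
```

A feature subset (as selected by PSO) or a learned projection that drives this value toward zero brings the two domain distributions closer, at least in their first moments; kernelized variants compare higher-order statistics as well.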
“…However, the feature space obtained by JGSA is not notable because data samples in this space may lose their original similarity and can thus easily be misclassified by the classifier. The Kernelized Unified Framework for Domain Adaptation (KUFDA) [16] improves JGSA by adopting an original-similarity weight matrix term so that samples do not lose their original similarity in the learned space. KUFDA satisfies most of the important properties discussed above but still suffers from outlier data samples, because it does not include an instance re-weighting term.…”
Section: Introduction
Citation type: mentioning
confidence: 99%
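A hedged sketch of the kind of similarity-preserving term described here, written in its standard graph-Laplacian form; the symbols (similarity weights W_ij computed on the original data, learned representations z_i stacked as columns of Z, Laplacian L) are assumptions for illustration rather than KUFDA's own notation:

\[
\sum_{i,j} W_{ij}\,\lVert z_i - z_j \rVert^{2}
\;=\;
2\,\operatorname{tr}\!\left(Z L Z^{\top}\right),
\qquad
L = D - W,\quad D_{ii} = \sum_{j} W_{ij}.
\]

Minimizing such a term penalizes learned representations that pull originally similar samples apart, which is the property the quoted passage attributes to KUFDA.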
“…Kernelized Unified Framework for Domain Adaptation (KUFDA) [16]: This TL method improves the JGSA method by adding the Laplacian regularization term.…”
Citation type: mentioning
confidence: 99%