2019
DOI: 10.1049/iet-ipr.2018.5871

Transfer metric learning for unsupervised domain adaptation

Cited by 14 publications (6 citation statements) · References 31 publications
“…We compare DMSP against the no adaptation baseline (i.e., 1-NN) and the state-of-the-art DA methods such as feature adaptation methods (i.e., transfer component analysis (TCA) [10], GFK [30], subspace alignment (SA) [12], JDA [21], group-lasso regularized optimal transport (OT-GL) [11], joint geometrical and statistical alignment (JGSA) [35], UTML [23], structure preservation and distribution alignment (SPDA) [24]), adaptive classifier learning methods (i.e., DCA [28], ARTL [15], MDDA [29]), deep DA methods (i.e., deep adaptation networks (DAN) [36], domain-adversarial neural network (DANN) [37], joint adaptation network (JAN) [38], collaborative and adversarial network (CAN) [39], conditional domain-adversarial network (CDAN) [40], domain-adversarial residual-transfer (DART) [41], multirepresentation adaptation Network (MRAN) [42], and discriminative manifold propagation (DMP) [43]).…”
Section: Methods
confidence: 99%
“…Based on MMD, joint distribution adaptation (JDA) [21] minimizes the class-wise distribution distance between the source and target domains to match their marginal and conditional distribution difference. Furthermore, to mine discriminative information within the two domains, domain invariant and class discriminative feature learning (DICD) [22] also minimizes the intra-class scatter and maximizes the inter-class dispersion simultaneously, while unsupervised metric transfer learning method (UMTL) [23] also maximizes the inter-class distance.…”
Section: Distribution Matching
confidence: 99%
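The statement above describes distribution matching via maximum mean discrepancy (MMD), the criterion JDA builds on. As a minimal illustration (not the implementation from any of the cited papers), the squared empirical MMD between source and target samples can be computed with an RBF kernel; the bandwidth `gamma` and the toy Gaussian data below are assumptions for demonstration only:

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    # Pairwise squared Euclidean distances, then the Gaussian (RBF) kernel.
    d2 = np.sum(X**2, 1)[:, None] + np.sum(Y**2, 1)[None, :] - 2 * X @ Y.T
    return np.exp(-gamma * d2)

def mmd2(Xs, Xt, gamma=1.0):
    # Biased squared empirical MMD between source Xs and target Xt:
    # the RKHS distance between the two domains' mean embeddings.
    Kss = rbf_kernel(Xs, Xs, gamma)
    Ktt = rbf_kernel(Xt, Xt, gamma)
    Kst = rbf_kernel(Xs, Xt, gamma)
    return Kss.mean() + Ktt.mean() - 2 * Kst.mean()

rng = np.random.default_rng(0)
same = mmd2(rng.normal(0, 1, (200, 5)), rng.normal(0, 1, (200, 5)))
shifted = mmd2(rng.normal(0, 1, (200, 5)), rng.normal(2, 1, (200, 5)))
print(same < shifted)  # a shifted target domain yields a larger MMD
```

Methods such as TCA and JDA minimize this quantity (JDA additionally per class, using pseudo-labels on the target) while learning the feature transformation, so that source and target distributions align in the learned space.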
“…We compare the MMDA approach with some state‐of‐the‐art methods, namely, GFK [21], JDA [8], TJM [31], SCA [40], JGSA [5], DMM [32], unsupervised transfer metric learning (UTML) [60], locality preserving joint transfer (LPJT) [45], and domain invariant and class discriminative feature learning (DICD) [59]. Three deep approaches, namely, AlexNet [61], deep domain confusion (DDC) [62], and deep adaptation network (DAN) [47], are also tested for comparison.…”
Section: Experiments and Evaluations
confidence: 99%