2018
DOI: 10.48550/arxiv.1810.03944
Preprint

Transfer Metric Learning: Algorithms, Applications and Outlooks

Yong Luo,
Yonggang Wen,
Ling-Yu Duan
et al.

Abstract: Distance metric learning (DML) aims to find an appropriate way to reveal the underlying data relationship. It is critical in many machine learning, pattern recognition and data mining algorithms, and usually requires a large amount of label information (such as class labels or pair/triplet constraints) to achieve satisfactory performance. However, the label information may be insufficient in real-world applications due to the high labeling cost, and DML may fail in this case. Transfer metric learning (TML) is abl…
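
The abstract describes DML as learning a distance from label information such as pair constraints. The following is a minimal, hypothetical sketch of that general idea only, not the TML method of this paper: it fits a diagonal Mahalanobis metric to similar/dissimilar pairs with a simple contrastive objective. NumPy and all names here (learn_diag_metric, margin) are assumptions introduced for illustration.

```python
# Illustrative sketch of distance metric learning (DML) from pair constraints.
# Not the paper's algorithm: learns only a diagonal Mahalanobis metric.
import numpy as np

def learn_diag_metric(X, pairs, labels, margin=1.0, lr=0.01, epochs=200):
    """X: (n, d) data; pairs: list of (i, j) index pairs;
    labels: 1 if a pair is similar, 0 if dissimilar (the 'pair constraints')."""
    d = X.shape[1]
    w = np.ones(d)  # diagonal of the metric matrix M = diag(w), kept non-negative
    for _ in range(epochs):
        grad = np.zeros(d)
        for (i, j), y in zip(pairs, labels):
            diff2 = (X[i] - X[j]) ** 2      # element-wise squared difference
            dist2 = np.dot(w, diff2)        # squared Mahalanobis distance under diag(w)
            if y == 1:
                grad += diff2               # pull similar pairs together
            elif dist2 < margin:
                grad -= diff2               # push dissimilar pairs apart up to the margin
        w -= lr * grad / max(len(pairs), 1)
        w = np.maximum(w, 1e-8)             # keep the metric valid (non-negative diagonal)
    return w

# Example usage with toy data:
# X = np.random.randn(10, 4)
# pairs, labels = [(0, 1), (2, 3)], [1, 0]
# w = learn_diag_metric(X, pairs, labels)
# dist = np.sqrt(np.dot(w, (X[0] - X[1]) ** 2))
```

When labeled pairs like these are scarce in the target domain, TML approaches surveyed by the paper instead transfer metric knowledge from related source domains.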

Cited by 5 publications (6 citation statements)
References 68 publications
“…From the perspective of transfer learning methods, there are three main categories: (1) Instance re-weighting, which reuses samples according to some weighting technique [15,16,56]; (2) Feature transformation, which performs representation learning to transform the source and target domains into the same subspace [38,45,62,74]; (3) Transfer metric learning [42][43][44], which learns transferable metric between domains. Since our proposed methods are mainly related to feature-based transfer learning, we will extensively introduce the related work in the following aspects.…”
Section: Related Work
confidence: 99%
“…There are other geometrical distances or transforms such as GFK [31] and subspace learning [32,33]. The implicit distance indirectly bridges the distribution gap through adversarial nets [34], or learnable metrics [35]. GAN-based DA methods learn domain-invariant features by confusing the feature extractor and discriminator [8][9][10], while metric learning [35] focuses on the sample-wise distance.…”
Section: Related Work
confidence: 99%
“…The implicit distance indirectly bridges the distribution gap through adversarial nets [34], or learnable metrics [35]. GAN-based DA methods learn domain-invariant features by confusing the feature extractor and discriminator [8][9][10], while metric learning [35] focuses on the sample-wise distance. Recent research implies performance improvement by adding more prior to the matching strategy such as adaptive weights between marginal and conditional distributions [7,14,15] with weights generated by the A-distance [2].…”
Section: Related Work
confidence: 99%
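
The two statements above mention GAN-based domain adaptation that learns domain-invariant features by confusing a feature extractor with a domain discriminator. As a generic, hypothetical illustration of that idea only (not code from the cited works or from this paper), a gradient-reversal setup could look as follows; the network sizes and batch construction are assumptions.

```python
# Generic sketch of domain confusion via gradient reversal (GAN-style DA idea).
import torch
import torch.nn as nn

class GradReverse(torch.autograd.Function):
    @staticmethod
    def forward(ctx, x):
        return x.view_as(x)
    @staticmethod
    def backward(ctx, grad_output):
        return -grad_output  # flipped gradient makes the extractor confuse the discriminator

feature_extractor = nn.Sequential(nn.Linear(20, 16), nn.ReLU())
domain_discriminator = nn.Sequential(nn.Linear(16, 1))
params = list(feature_extractor.parameters()) + list(domain_discriminator.parameters())
optimizer = torch.optim.SGD(params, lr=0.01)
bce = nn.BCEWithLogitsLoss()

source = torch.randn(32, 20)   # toy source-domain batch
target = torch.randn(32, 20)   # toy target-domain batch
domain = torch.cat([torch.zeros(32, 1), torch.ones(32, 1)])  # 0 = source, 1 = target

for _ in range(100):
    feats = feature_extractor(torch.cat([source, target]))
    logits = domain_discriminator(GradReverse.apply(feats))
    loss = bce(logits, domain)  # discriminator learns domains; extractor learns to hide them
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```

Metric-learning-based adaptation, as contrasted in the quoted statements, would instead shape sample-wise distances rather than fool a domain classifier.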
“…Finding a correct distance metric and using the learned metric to fit a good classifier is very important in fine-grained classification tasks. Transfer metric learning methods, which combine transfer learning and metric learning techniques, have been widely used in many applications [34]. For example, Deng et al [35] proposed a deep metric learning feature embedding model suitable for unsupervised transfer learning, and it can learn the similarity between sample pairs.…”
Section: Introduction
confidence: 99%