2022
DOI: 10.3390/sym14061097

TACDFSL: Task Adaptive Cross Domain Few-Shot Learning

Abstract: Cross-Domain Few-Shot Learning (CDFSL) has attracted the attention of many scholars since it is closer to real-world settings. The domain shift between the source domain and the target domain is a crucial problem for CDFSL. The essence of domain shift is the difference between the marginal distributions of the two domains, which is implicit and unknown. Empirical marginal-distribution measurements are therefore proposed: WDMDS (Wasserstein Distance for Measuring Domain Shift) and MMDMDS (Maximum Mean Discrepancy for Measuring Domain Shift)…
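To make the two measurements in the abstract concrete, the sketch below estimates both quantities between samples drawn from a "source" and a "target" domain. This is an illustrative sketch only, not the paper's exact estimators: the RBF kernel, the bandwidth `gamma`, and the toy Gaussian feature data are all assumptions, and SciPy's `wasserstein_distance` handles only 1-D distributions, so it is applied per feature dimension.

```python
import numpy as np
from scipy.stats import wasserstein_distance


def mmd_rbf(X, Y, gamma=1.0):
    """Empirical squared Maximum Mean Discrepancy with an RBF kernel.

    gamma is an assumed bandwidth; in practice it is often set by the
    median heuristic rather than fixed.
    """
    def k(A, B):
        # Pairwise squared Euclidean distances -> RBF kernel matrix.
        d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
        return np.exp(-gamma * d2)

    # Biased V-statistic estimate of MMD^2.
    return k(X, X).mean() + k(Y, Y).mean() - 2 * k(X, Y).mean()


rng = np.random.default_rng(0)
source = rng.normal(0.0, 1.0, size=(500, 2))  # toy source-domain features
target = rng.normal(1.5, 1.0, size=(500, 2))  # mean-shifted target-domain features

print("MMD^2:", mmd_rbf(source, target))
# scipy's wasserstein_distance is 1-D only, so compute it per feature dimension.
print("W per dim:", [wasserstein_distance(source[:, j], target[:, j])
                     for j in range(source.shape[1])])
```

A larger value from either estimate indicates a larger marginal-distribution gap between the two domains, which is what the paper's domain-shift measurements are meant to quantify.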

Cited by 3 publications (3 citation statements) | References 35 publications
“…In addition, they also showed that using a larger K can enable a higher classification performance. Other studies (Adler et al., 2020; Cai & Shen, 2020; Jiang et al., 2020) used different meta‐learning methods to conduct a similar experiment, and they achieved results similar to those reported by Guo et al. (2020).…”
Section: Application Of Few-Shot Learning For Model And Optimization (supporting)
confidence: 65%
“…In addition, they also showed that using a larger K can enable a higher classification performance. Other studies (Adler et al., 2020; Cai & Shen, 2020; Jiang et al., 2020) used different meta-learning methods to conduct a similar experiment, and they achieved results similar to those reported by Guo et al. (2020). Thus, the performance of meta-learning is expected to improve when the target task is similar to the source tasks and K is large; otherwise, it yields unstable performance.…”
Section: Meta-Learning (mentioning)
confidence: 60%
“…the left part of Figure 13 (c). For the former category, in [134], WDMDS (Wasserstein Distance for Measuring Domain Shift) and MMDMDS (Maximum Mean Discrepancy for Measuring Domain Shift) were proposed to solve CDFSL. [118] introduced the MemREIN framework, which considers memorization, restitution, and instance normalization, e.g.…”
Section: Feature Transformation (mentioning)
confidence: 99%