2020
DOI: 10.48550/arxiv.2007.08790
Preprint

Explanation-Guided Training for Cross-Domain Few-Shot Classification

Abstract: The cross-domain few-shot classification task (CD-FSC) combines few-shot classification with the requirement to generalize across domains represented by different datasets. This setup faces challenges originating from the limited labeled data in each class and, additionally, from the domain shift between the training and test sets. In this paper, we introduce a novel training approach for existing FSC models. It leverages explanation scores, obtained from existing explanation methods when applied to the predictions of FS…
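To make the idea concrete, here is a minimal sketch of explanation-guided training, assuming (as the abstract suggests) that explanation scores computed for a model's predictions are used to emphasize relevant intermediate features during training. The gradient*input relevance proxy, the function names, and the reweighting scheme below are illustrative assumptions standing in for an explanation method such as LRP; this is not the authors' exact procedure.

```python
# Hypothetical sketch of explanation-guided training: explanation scores
# for intermediate features reweight those features before the loss is
# computed. gradient*input is a simple stand-in for a method such as LRP.
import torch
import torch.nn.functional as F

def gradient_input_relevance(features, logits, targets):
    # Relevance proxy: gradient of the target-class score w.r.t. the
    # features, multiplied elementwise by the features themselves.
    score = logits.gather(1, targets.unsqueeze(1)).sum()
    grads = torch.autograd.grad(score, features, retain_graph=True)[0]
    return grads * features

def explanation_guided_step(encoder, classifier, images, labels, optimizer):
    features = encoder(images)          # embeddings of the episode's images
    logits = classifier(features)
    relevance = gradient_input_relevance(features, logits, labels).detach()
    # Normalize relevance to [0, 1] and emphasize relevant feature dimensions.
    relevance = (relevance - relevance.min()) / (relevance.max() - relevance.min() + 1e-8)
    reweighted = features * (1.0 + relevance)
    loss = F.cross_entropy(classifier(reweighted), labels)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```

The reweighting factor `1 + relevance` leaves all features active while boosting those the explanation method marks as relevant; detaching the relevance keeps the explanation itself out of the gradient path, which is one plausible design choice among several.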

Cited by 8 publications (7 citation statements) | References 36 publications
“…Motivated by it, the researchers [22] used LRP in GNNs to obtain insights into the black box of GNN models. Schnake et al [17] proposed GNN-LRP, which is based on higher-order Taylor expansions.…”
Section: Explaining Graph Neural Network Through XAI Methods (mentioning)
confidence: 99%
“…• We introduce a novel method of explanation-guided training of DNNs that prioritizes relevant features in the input during the training phase. • While the authors of the related work [28] presented a method of explanation-guided training that dynamically finds and emphasizes the features which are important, our method aims to allow domain experts to prioritize important features in the input image via a segmentation mask. Hence, the proposed method is useful when we want to prefer some well-known input features but do not want to restrict the model only to these features.…”
Section: Our Contribution (mentioning)
confidence: 99%
“…Meta-learning performance suffers on VTAB-v2. In contrast to BiT, Figure 1 shows that meta-learning approaches struggle to compete with transfer learning on VTAB-v2. MD-… an opportunity to apply existing cross-domain few-shot classification approaches (Tseng et al, 2020; Sun et al, 2020; Phoo & Hariharan, 2020; Liu et al, 2020; Cai & Shen, 2020) at scale.…”
Section: Comparison Of Selected Approaches (mentioning)
confidence: 99%