2022
DOI: 10.1109/tnnls.2021.3082928
Transductive Relation-Propagation With Decoupling Training for Few-Shot Learning

Cited by 17 publications (7 citation statements)
References 43 publications
“…TLRM [7] proposed a sample-to-task relation module to capture the task-level relation representations in each GNN layer. TRPN-D [8] adopted the decoupling training strategy to preserve the diversity across different few-shot tasks to enhance the generalizability of GNN models. GCLR [9] applied a VAE-based encoder-decoder module to enrich the node representations in the latent feature space.…”
Section: B. Graph Neural Network
confidence: 99%
“…Compared with CNNs, Graph Networks are more powerful in exploiting the intra- and inter-class relationships amongst samples, and are thus more effective for few-shot learning. The current GNN-based few-shot methods improve accuracy and generalizability from the perspectives of node/edge updates [8], [9], [12], [13] and graph-structure design [5], [14], [15]. In general, graph-based methods model the feature embeddings of samples as vertices in a graph and propagate label information between nodes by performing node or edge feature aggregation from neighbor nodes with graph convolution.…”
Section: Introduction
confidence: 99%
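The excerpt above describes the general graph-based scheme: sample embeddings become graph vertices, and information is propagated by aggregating features from neighboring nodes. A minimal sketch of that idea follows; the function name `propagate` and the mean-aggregation rule are illustrative simplifications, not the update used by any specific cited method.

```python
import numpy as np

def propagate(features, adjacency, steps=1):
    """Simplified graph-convolution step: each node replaces its feature
    with the mean of its own and its neighbors' features."""
    # Add self-loops, then row-normalize so each row sums to 1.
    A = adjacency + np.eye(len(adjacency))
    A = A / A.sum(axis=1, keepdims=True)
    for _ in range(steps):
        features = A @ features  # mix each node with its neighborhood
    return features

# Toy 3-node chain graph: edges 0-1 and 1-2.
adj = np.array([[0., 1., 0.],
                [1., 0., 1.],
                [0., 1., 0.]])
feats = np.array([[1., 0.],   # node 0: "class A" one-hot
                  [0., 0.],   # node 1: unlabeled
                  [0., 1.]])  # node 2: "class B" one-hot
print(propagate(feats, adj))
```

After one step, the unlabeled middle node receives an equal mix of both neighbors' label information, which is the intuition behind propagating labels over the task graph.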
“…In this work, we focus on the third type, namely the metric learning-based methods [44, 59-68], i.e., learning discriminative feature embeddings for distinguishing different image classes. For example, ProtoNet [59] considered the class-mean representation as the prototype of each class and applied the Euclidean distance metric for classification.…”
Section: Metric Learning-Based Methods
confidence: 99%
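The ProtoNet classification rule quoted above (class-mean prototypes plus nearest-Euclidean-distance assignment) can be sketched in a few lines; the function name and the toy 2-way, 2-shot task below are illustrative, not from the paper.

```python
import numpy as np

def prototypical_classify(support, support_labels, queries):
    """Assign each query to the class whose prototype (class-mean
    support embedding) is nearest in Euclidean distance."""
    classes = np.unique(support_labels)
    # Prototype = mean embedding of each class's support samples.
    protos = np.stack([support[support_labels == c].mean(axis=0)
                       for c in classes])
    # Squared Euclidean distance from every query to every prototype.
    d = ((queries[:, None, :] - protos[None, :, :]) ** 2).sum(axis=-1)
    return classes[d.argmin(axis=1)]

# Toy 2-way, 2-shot task in a 2-D embedding space.
support = np.array([[0., 0.], [0., 1.],    # class 0 supports
                    [5., 5.], [5., 6.]])   # class 1 supports
labels = np.array([0, 0, 1, 1])
queries = np.array([[0., 0.5], [5., 5.5]])
print(prototypical_classify(support, labels, queries))  # → [0 1]
```

In an actual few-shot pipeline the embeddings would come from a trained backbone network rather than raw coordinates, but the nearest-prototype decision rule is the same.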
“…Classic few-shot methods [14, 59, 61, 115] applied Convolutional Neural Networks (CNNs) for image classification. More recent works proposed to apply Graph Neural Networks [65, 116-118] or Graph Convolutional Networks [71, 119] to process data with rich relational structures in few-shot scenarios. Compared with CNNs, Graph Networks are more powerful in exploiting the intra- and inter-class relationships amongst samples, and are thus more effective for few-shot learning.…”
Section: Introduction
confidence: 99%