2023
DOI: 10.1109/tmm.2022.3233442
Graph Neural Networks With Triple Attention for Few-Shot Learning

Abstract: Recent advances in Graph Neural Networks (GNNs) have achieved superior results in many challenging tasks, such as few-shot learning. Despite their capacity to learn and generalize a model from only a few annotated samples, GNNs are limited in scalability, as deep GNN models usually suffer from severe over-fitting and over-smoothing. In this work, we propose a novel GNN framework with a triple-attention mechanism, i.e., node self-attention, neighbor attention, and layer memory attention, to tackle these challenges.…
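The abstract names the three attentions but shows no implementation, so the sketch below is a minimal, hypothetical PyTorch layer illustrating one way the three could compose: node self-attention as channel-wise gating of each node's own features, neighbor attention as masked scaled dot-product scores over edges, and layer memory attention as a learned gate mixing the layer's output with its input (a common remedy for over-smoothing). All class, method, and variable names are assumptions for illustration, not the authors' code.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class TripleAttentionGNNLayer(nn.Module):
    """Illustrative sketch (not the paper's implementation) of one GNN layer
    combining the three attentions named in the abstract: node self-attention,
    neighbor attention, and layer memory attention."""

    def __init__(self, dim: int):
        super().__init__()
        # Node self-attention: re-weight each node's own feature channels.
        self.self_gate = nn.Linear(dim, dim)
        # Neighbor attention: score each edge before aggregation.
        self.query = nn.Linear(dim, dim)
        self.key = nn.Linear(dim, dim)
        self.update = nn.Linear(dim, dim)
        # Layer memory attention: gate between the layer's output and its
        # input, which can mitigate over-smoothing in deep stacks.
        self.memory_gate = nn.Linear(2 * dim, 1)

    def forward(self, x: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # x: (N, dim) node features; adj: (N, N) 0/1 adjacency with self-loops.
        # 1) Node self-attention: channel-wise sigmoid gating per node.
        h = x * torch.sigmoid(self.self_gate(x))

        # 2) Neighbor attention: scaled dot-product scores, masked to edges.
        scores = self.query(h) @ self.key(h).transpose(0, 1) / h.size(-1) ** 0.5
        scores = scores.masked_fill(adj == 0, float("-inf"))
        alpha = torch.softmax(scores, dim=-1)      # (N, N) edge weights
        h = F.relu(self.update(alpha @ h))         # aggregate neighbors

        # 3) Layer memory attention: learn how much of the layer input to keep.
        beta = torch.sigmoid(self.memory_gate(torch.cat([h, x], dim=-1)))
        return beta * h + (1.0 - beta) * x         # gated residual "memory"

if __name__ == "__main__":
    N, dim = 5, 16
    x = torch.randn(N, dim)
    adj = (torch.rand(N, N) > 0.5).float()
    adj.fill_diagonal_(1.0)                        # keep self-loops
    layer = TripleAttentionGNNLayer(dim)
    print(layer(x, adj).shape)                     # torch.Size([5, 16])
```

The gated residual in step 3 is one plausible reading of "layer memory": each node learns how much of its pre-layer representation to retain, so stacking layers does not collapse node features toward a common smoothed value.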

Cited by 11 publications (2 citation statements)
References 52 publications
“…15 Examples of attributed networks include citation networks, social networks, biological networks, and critical infrastructure networks, making them directly applicable to battlefield and spectrum management graphs. Much like others who have adopted the attention mechanism in GNN few-shot learning for computer vision, 16 Wang augments the GNN with attention, but moves it to the attribute level, thus enabling few-shot learning on traditional graph structured data.…”
Section: Few-shot Learning (mentioning)
confidence: 99%
“…In this work, we propose an attentive graph neural network (AGNN) [136,137] with a novel triple-attention mechanism, i.e., node self-attention, neighbor attention, and layer memory attention. … The contributions of this work are summarized as follows…”
Section: Introduction (mentioning)
confidence: 99%