Proceedings of the ACM Web Conference 2023
DOI: 10.1145/3543507.3583386
GraphPrompt: Unifying Pre-Training and Downstream Tasks for Graph Neural Networks

Cited by 38 publications (31 citation statements)
References 20 publications
“…In addition, Tan et al. [196], [197] employ contrastive learning, constructing a subgraph around a target node based on connectivity and generating three types of contrast pairs (node-node, node-subgraph, and subgraph-subgraph) for model training. Another work, GraphPrompt [198], employs GNN prompting techniques, unifying pre-training and downstream tasks for effective knowledge transfer and achieving impressive results even with limited supervision.…”
Section: Few-shot Node Classification
confidence: 99%
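
The three-way contrast-pair construction described in this excerpt can be made concrete with a small sketch. The snippet below is a minimal illustration in plain PyTorch, not the cited authors' code: info_nce is one common instantiation of a contrastive objective, and every tensor name and shape is a hypothetical stand-in for the outputs of a GNN encoder and a subgraph readout.

    import torch
    import torch.nn.functional as F

    def info_nce(anchor, positive, negatives, tau=0.5):
        # InfoNCE: pull each anchor toward its positive, push it away from negatives.
        pos = F.cosine_similarity(anchor, positive, dim=-1) / tau                # (B,)
        neg = F.cosine_similarity(anchor.unsqueeze(1), negatives, dim=-1) / tau  # (B, K)
        logits = torch.cat([pos.unsqueeze(1), neg], dim=1)                       # (B, 1+K)
        labels = torch.zeros(anchor.size(0), dtype=torch.long)                   # positive sits at index 0
        return F.cross_entropy(logits, labels)

    # Hypothetical encoder outputs: target-node embeddings, the same nodes under
    # an augmented view, pooled subgraph embeddings, and sampled negatives.
    B, K, d = 8, 5, 64
    node_emb         = torch.randn(B, d)
    aug_node_emb     = torch.randn(B, d)
    subgraph_emb     = torch.randn(B, d)
    aug_subgraph_emb = torch.randn(B, d)
    neg_nodes        = torch.randn(B, K, d)
    neg_subgraphs    = torch.randn(B, K, d)

    loss = (info_nce(node_emb, aug_node_emb, neg_nodes)                  # node-node pairs
            + info_nce(node_emb, subgraph_emb, neg_subgraphs)            # node-subgraph pairs
            + info_nce(subgraph_emb, aug_subgraph_emb, neg_subgraphs))   # subgraph-subgraph pairs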
“…The primary challenge in few-shot node classification is to extract transferable knowledge from base classes to benefit novel classes. Recent studies in this area are summarized in […]. Notably, the adoption of innovative techniques like prompt tuning [198], generative models [71], and others could potentially enhance the representation of novel classes, indicating promising future directions. For a broader perspective, we suggest consulting the survey of few-shot learning on graphs [30] for additional insights.…”
Section: FSNC on HINs
confidence: 99%
“…Subsequent research explores soft prompts [20,22,28], employing continuous vectors to surpass the limitations of natural-language embeddings. In graph prompt tuning, studies have actively investigated its advantages [7,29,40], with a focus on soft prompts due to the unique nature of graph data. GPPT [40] introduces a "pre-train, prompt, fine-tune" framework to mitigate the objective gap using soft prompts, but it only handles link prediction as the pre-training task and node classification as the downstream task.…”
Section: Related Work
confidence: 99%
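
A soft graph prompt of the kind this excerpt discusses can be sketched as a learnable vector that reweights the dimensions of frozen node embeddings before pooling, with few-shot classification cast as similarity to class prototypes. This is a minimal sketch assuming a frozen pre-trained encoder; PromptReadout and the prototype classifier are illustrative names, not the published implementation.

    import torch
    import torch.nn.functional as F

    class PromptReadout(torch.nn.Module):
        def __init__(self, dim):
            super().__init__()
            self.prompt = torch.nn.Parameter(torch.ones(dim))  # learnable soft prompt

        def forward(self, node_emb):                 # node_emb: (num_nodes, dim)
            # Element-wise reweighting, then sum pooling over the (sub)graph.
            return (self.prompt * node_emb).sum(dim=0)

    dim = 64
    readout = PromptReadout(dim)
    frozen_emb = torch.randn(10, dim)                # output of a frozen pre-trained GNN
    graph_vec = readout(frozen_emb)                  # (dim,)

    # Few-shot classification as prototype similarity: only readout.prompt is
    # tuned, so pre-training and the downstream task can share one similarity
    # template (class prototypes here are illustrative random vectors).
    prototypes = torch.randn(3, dim)                 # one prototype per class
    logits = F.cosine_similarity(graph_vec.unsqueeze(0), prototypes, dim=-1)
    pred = logits.argmax()

Because only the prompt vector carries trainable parameters, the pre-trained encoder's knowledge is reused directly, which is what makes this style of tuning viable with very few labels.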
“…However, when it comes to Graph Neural Networks (GNNs), the landscape of prompt tuning methods remains largely unexplored, particularly in terms of universal applicability [7]. Existing prompt tuning methods in GNNs are often tightly coupled with specific pre-training strategies, limiting their broader application [29,40]. For example, a prompt tuning method designed for a GNN pretrained on node classification tasks may not be effective for a GNN pre-trained on link prediction tasks.…”
Section: Introduction
confidence: 99%
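
To make the coupling problem this excerpt describes concrete, the toy contrast below juxtaposes the two training templates it mentions. The losses and shapes are illustrative and drawn from no cited implementation.

    import torch
    import torch.nn.functional as F

    emb = torch.randn(6, 16, requires_grad=True)     # node embeddings from some GNN

    # Pre-training template: link prediction scores node *pairs* via dot product.
    src, dst = torch.tensor([0, 1]), torch.tensor([2, 3])
    pair_scores = (emb[src] * emb[dst]).sum(dim=-1)
    link_loss = F.binary_cross_entropy_with_logits(pair_scores, torch.ones(2))

    # Downstream template: node classification scores *single nodes* against classes.
    classifier = torch.nn.Linear(16, 3)
    cls_loss = F.cross_entropy(classifier(emb), torch.randint(0, 3, (6,)))

    # A prompt tuned for the pairwise template has no guarantee of transferring
    # to the per-node template, and vice versa; universal graph prompting aims
    # to cast both objectives under one shared template.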