2023
DOI: 10.1609/aaai.v37i4.25553
T2-GNN: Graph Neural Networks for Graphs with Incomplete Features and Structure via Teacher-Student Distillation

Abstract: Graph Neural Networks (GNNs) have been a prevailing technique for tackling various analysis tasks on graph data. A key premise behind the remarkable performance of GNNs is a complete and trustworthy initial graph description (i.e., node features and graph structure), which is often not satisfied, since real-world graphs are frequently incomplete due to various unavoidable factors. In particular, GNNs face greater challenges when node features and graph structure are incomplete at the same time. The existing …
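The abstract stops short of the method details, but the title points to teacher-student distillation. As a minimal sketch only (not the paper's actual T2-GNN objective, which may use separate feature-level and structure-level teachers), the following PyTorch-style snippet illustrates how a teacher's soft predictions can guide a student GNN via a temperature-scaled KL term combined with a supervised loss; the function name and hyperparameters are illustrative assumptions.

import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, train_mask, T=2.0, alpha=0.5):
    # Supervised cross-entropy on labeled nodes.
    ce = F.cross_entropy(student_logits[train_mask], labels[train_mask])
    # Temperature-scaled KL divergence aligns the student with the teacher's soft predictions.
    kl = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)
    # Weighted combination of hard-label supervision and soft-label distillation.
    return alpha * ce + (1.0 - alpha) * kl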

Cited by 5 publications (1 citation statement) · References 22 publications
“…Almost all recent Graph Self-Supervised Learning (GSSL) works employ two-layer GNN encoders. The GNNs generally consist of propagation and aggregation mechanisms (Huo et al. 2023; Kipf and Welling 2017; Veličković et al. 2018). The normal form of GNNs is as follows:…”
Section: Preliminary
Citation type: mentioning
Confidence: 99%
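The quoted statement truncates before the formula it references. For context, a commonly used "normal form" of a GNN layer, combining neighborhood propagation and aggregation, is the GCN-style update below; this is the standard formulation from Kipf and Welling (2017), not necessarily the exact equation in the citing paper.

H^{(l+1)} = \sigma\!\left(\tilde{D}^{-\frac{1}{2}} \tilde{A} \tilde{D}^{-\frac{1}{2}} H^{(l)} W^{(l)}\right), \qquad \tilde{A} = A + I, \quad \tilde{D}_{ii} = \sum_{j} \tilde{A}_{ij},

where H^{(l)} is the node-representation matrix at layer l, W^{(l)} is a learnable weight matrix, and \sigma is a nonlinearity.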