2022
DOI: 10.48550/arxiv.2212.12738
Preprint

T2-GNN: Graph Neural Networks for Graphs with Incomplete Features and Structure via Teacher-Student Distillation

Cited by 1 publication (1 citation statement) · References 0 publications

“…One needs to ensure that paired features are similar both in modeling capacity and relevance to the output. Most research on feature-based distillation on graphs has so far focused on models that only have one type of (scalar) features in single-output classification tasks [26,27,28,29], thereby reducing the problem to the selection of layers to pair across the student and the teacher. This is often further simplified by utilizing models of the same architecture.…”
Section: Knowledge Distillation In Molecular Gnnsmentioning
confidence: 99%
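
The quoted statement describes feature-based distillation as choosing which hidden layers to pair between a student and a teacher, then matching the paired features. Below is a minimal sketch of that idea, not the T2-GNN method itself: it assumes simple dense-adjacency GCN-style layers, an MSE feature-matching loss, and per-pair linear projections to bridge differing widths; all names (`SimpleGNNLayer`, `GNN`, `feature_distillation_loss`, the layer pairing) are hypothetical illustrations, not from the paper or the citing work.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class SimpleGNNLayer(nn.Module):
    """One GCN-style layer: aggregate over a dense adjacency, then transform."""

    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.lin = nn.Linear(in_dim, out_dim)

    def forward(self, x, adj):
        # adj: (N, N) normalized adjacency with self-loops; x: (N, in_dim)
        return self.lin(adj @ x)


class GNN(nn.Module):
    """Stack of layers that also returns every intermediate feature map."""

    def __init__(self, dims):
        super().__init__()
        self.layers = nn.ModuleList(
            SimpleGNNLayer(d_in, d_out) for d_in, d_out in zip(dims, dims[1:])
        )

    def forward(self, x, adj):
        hidden = []
        for i, layer in enumerate(self.layers):
            x = layer(x, adj)
            if i < len(self.layers) - 1:  # no nonlinearity on the logits
                x = F.relu(x)
            hidden.append(x)
        return x, hidden


def feature_distillation_loss(student_hidden, teacher_hidden, pairs, projections):
    # `pairs` lists (student_layer, teacher_layer) indices to match; each
    # per-pair linear projection maps student width to teacher width.
    loss = 0.0
    for (s_idx, t_idx), proj in zip(pairs, projections):
        loss = loss + F.mse_loss(proj(student_hidden[s_idx]),
                                 teacher_hidden[t_idx].detach())
    return loss


# Toy usage on a placeholder graph (self-loops only).
N, in_dim, n_classes = 8, 16, 3
x = torch.randn(N, in_dim)
adj = torch.eye(N)

teacher = GNN([in_dim, 64, 64, n_classes])  # deeper, wider teacher
student = GNN([in_dim, 32, n_classes])      # smaller student

pairs = [(0, 1)]  # pair student layer 0 with teacher layer 1
projections = nn.ModuleList([nn.Linear(32, 64)])

with torch.no_grad():
    _, t_hidden = teacher(x, adj)  # teacher features serve as fixed targets
s_logits, s_hidden = student(x, adj)

labels = torch.randint(0, n_classes, (N,))
loss = F.cross_entropy(s_logits, labels) \
       + 0.5 * feature_distillation_loss(s_hidden, t_hidden, pairs, projections)
loss.backward()
```

The projection layers are what relax the same-architecture simplification the quote mentions: when student and teacher share widths, they can be dropped and the pairing choice is the only remaining design decision.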