Proceedings of the Web Conference 2020
DOI: 10.1145/3366423.3380027

Heterogeneous Graph Transformer

Abstract: Recent years have witnessed the emerging success of graph neural networks (GNNs) for modeling structured data. However, most GNNs are designed for homogeneous graphs, in which all nodes and edges belong to the same types, making them infeasible to represent heterogeneous structures. In this paper, we present the Heterogeneous Graph Transformer (HGT) architecture for modeling Web-scale heterogeneous graphs. To model heterogeneity, we design node- and edge-type dependent parameters to characterize the heterogeneo…
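The abstract's core idea, parameterizing attention by node and edge type, can be illustrated with a minimal PyTorch sketch. This is not the authors' released implementation; the class name HeteroAttentionSketch and the toy paper/author/writes schema are illustrative assumptions, and the full HGT model additionally uses multiple attention heads, a learnable prior per type triple, and softmax normalization over each target node's incoming edges.

```python
# Minimal sketch (an assumption, not the official HGT code) of attention scores
# whose parameters depend on node and edge types, as described in the abstract.
import math
import torch
import torch.nn as nn

class HeteroAttentionSketch(nn.Module):
    """Single-head attention scoring with node- and edge-type dependent parameters."""

    def __init__(self, dim, node_types, edge_types):
        super().__init__()
        # One key/query projection per node type (node-type dependent parameters).
        self.k_lin = nn.ModuleDict({t: nn.Linear(dim, dim) for t in node_types})
        self.q_lin = nn.ModuleDict({t: nn.Linear(dim, dim) for t in node_types})
        # One interaction matrix per edge type (edge-type dependent parameters).
        self.w_att = nn.ParameterDict({e: nn.Parameter(torch.eye(dim)) for e in edge_types})
        self.dim = dim

    def score(self, h_src, src_type, h_dst, dst_type, edge_type):
        # h_src, h_dst: (num_edges, dim) endpoint features for edges of one type pattern.
        k = self.k_lin[src_type](h_src)   # project source nodes by their node type
        q = self.q_lin[dst_type](h_dst)   # project target nodes by their node type
        # Bilinear score through the edge-type-specific matrix, scaled as in dot-product attention.
        return (k @ self.w_att[edge_type] * q).sum(-1) / math.sqrt(self.dim)

# Toy usage on an academic-graph-like schema.
att = HeteroAttentionSketch(dim=8, node_types=["paper", "author"], edge_types=["writes"])
h_author = torch.randn(5, 8)   # features of 5 author endpoints
h_paper = torch.randn(5, 8)    # features of the 5 paper endpoints they connect to
print(att.score(h_author, "author", h_paper, "paper", "writes"))  # 5 unnormalized scores
```

In the full model, such scores would be softmax-normalized over each target node's incoming edges and used to aggregate type-specific message vectors.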

Cited by 783 publications (542 citation statements)
References 17 publications

“…Each paper is labeled with a set of research topics/fields (e.g., Physics and Medicine) and the publication date ranges from 1900 to 2019. We consider the prediction of Paper-Field, Paper-Venue, and Author Name Disambiguation (Author ND) as three downstream tasks [7,15]. The performance is evaluated by MRR, a widely adopted ranking metric [19].…”
Section: Methods
confidence: 99%
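The MRR referenced in the excerpt above is the mean reciprocal rank of the correct answer across evaluation queries. Below is a minimal, self-contained Python sketch of that metric; the function name and list-based interface are illustrative assumptions, not code from the cited papers.

```python
# Minimal sketch of Mean Reciprocal Rank (MRR): the average of 1/rank of the
# correct answer over all queries (ranks are 1-based).
def mean_reciprocal_rank(ranks):
    return sum(1.0 / r for r in ranks) / len(ranks)

# Example: the correct item is ranked 1st, 3rd, and 2nd for three queries.
print(mean_reciprocal_rank([1, 3, 2]))  # (1 + 1/3 + 1/2) / 3 ≈ 0.611
```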
“…The base GNN model. On the OAG and Amazon datasets, we use the state-of-the-art heterogeneous GNN, Heterogeneous Graph Transformer (HGT) [15], as the base model for GPT-GNN. Furthermore, we also use other (heterogeneous) GNNs as the base model to test our generative pre-training framework.…”
Section: Methods
confidence: 99%