2021
DOI: 10.48550/arxiv.2103.13355
Preprint

Bag of Tricks for Node Classification with Graph Neural Networks

Yangkun Wang,
Jiarui Jin,
Weinan Zhang
et al.

Abstract: Much of the recent progress made in node classification on graphs can be credited to the careful design of graph neural networks (GNN) and label propagation algorithms. However, in the literature, in addition to improvements to the model architecture, there are a number of improvements either briefly mentioned as implementation details or visible only in source code, and these overlooked techniques may play a pivotal role in their practical use. In this paper, we first summarize a collection of existing refine…

Cited by 10 publications (18 citation statements) · References 18 publications
“…In this paper, we combine the advantages from both search-based and time-aware models to efficiently retrieve relevant items and mine sequential patterns in an end-to-end way. Our paper is also related to the label trick proposed in [38, 42] based on graph structure. Instead, our work focuses on label usage in the sequence case, which, notably, is also different from the masking technique in existing sequential models such as BERT [7], which operates on the feature dimension instead of the label dimension.…”
Section: Preliminaries 2.1 Related Work
confidence: 99%
“…The principal way to use the user's historical feedback is to treat this feedback as the label to supervise the model. However, as discussed in [38, 42], combining the information from both labels and features as input to train the model can significantly improve its performance.…”
Section: Introduction
confidence: 99%
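The label usage this citing paper refers to can be sketched as follows. This is a minimal NumPy illustration, not code from the cited papers: all names, shapes, and the random 50/50 split are assumptions; the general idea is feeding a partially masked one-hot label channel alongside node features, while supervising only on nodes whose labels were withheld so the model cannot simply copy its label input.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup (illustrative shapes): n nodes, d-dim features, c classes.
n, d, c = 8, 4, 3
features = rng.normal(size=(n, d))
labels = rng.integers(0, c, size=n)
train_idx = np.arange(6)  # nodes with known labels

# Label trick: randomly split the training nodes; the labels of one part
# become extra input features, the held-out part is used only in the loss.
mask = rng.random(len(train_idx)) < 0.5
input_label_nodes = train_idx[mask]    # labels visible to the model
supervised_nodes = train_idx[~mask]    # labels reserved for supervision

label_channel = np.zeros((n, c))
label_channel[input_label_nodes, labels[input_label_nodes]] = 1.0

# Concatenate node features with the partial label channel as model input.
model_input = np.concatenate([features, label_channel], axis=1)
assert model_input.shape == (n, d + c)
```

At inference time, the one-hot labels of all training nodes would be placed in the label channel, since none of them need to be held out for supervision.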
“…GAT-FLAG GAT with FLAG [15] enhancement. GAT+BoT GAT with bag of tricks [32]. AGDN+BoT AGDN with bag of tricks.…”
Section: GNN Model Description
confidence: 99%
“…But as the baseline performance on the reddit dataset is quite high, not surprisingly, the overall improvement of NGNN is not significant. We further analyze the performance of NGNN combined with bag of tricks [32] on ogbn-arxiv and ogbn-proteins in Table 4. It can… Table 5: Performance of NGNN on ogbl-collab, ogbl-ppa and ogbl-ppi.”
Section: Node Classification
confidence: 99%