2021
DOI: 10.1609/aaai.v35i12.17315

Data Augmentation for Graph Neural Networks

Abstract: Data augmentation has been widely used to improve generalizability of machine learning models. However, comparatively little work studies data augmentation for graphs. This is largely due to the complex, non-Euclidean structure of graphs, which limits possible manipulation operations. Augmentation operations commonly used in vision and language have no analogs for graphs. Our work studies graph data augmentation for graph neural networks (GNNs) in the context of improving semi-supervised node-classification…

Cited by 139 publications (39 citation statements)
References 57 publications
“…All the baselines and our proposed method can be applied to all types of networks. We adopt the same data split of Cora and Citeseer as [21], and a split of training, validation, and testing with a ratio of 10:20:70 on the other datasets [12].…”
Section: Methods
Confidence: 99%
“…NeuralSparse [11] considers the graph sparsification task by removing irrelevant edges. GAUG [12] utilizes a GNN, rather than an MLP as in NeuralSparse, to parameterize the categorical distribution. PTDNet [13] prunes task-irrelevant edges by penalizing the number of edges in the sparsified graph with parameterized networks.…”
Section: Related Work
Confidence: 99%
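The sparsification methods cited in this statement share a common shape: a small parameterized network scores each edge from its endpoint embeddings, and low-scoring edges are dropped. A minimal sketch of that idea, assuming a plain linear scorer with a sigmoid and a fixed threshold (the `prune_edges` name, the scorer, and the threshold are illustrative, not any cited paper's exact architecture):

```python
import numpy as np

rng = np.random.default_rng(0)

def prune_edges(edges, emb, w, threshold=0.5):
    """Score each edge with a linear layer applied to the concatenated
    endpoint embeddings; keep only edges whose sigmoid score passes the
    threshold (a stand-in for the parameterized edge scorers used in
    NeuralSparse/PTDNet-style sparsification)."""
    kept = []
    for u, v in edges:
        x = np.concatenate([emb[u], emb[v]])
        score = 1.0 / (1.0 + np.exp(-x @ w))  # sigmoid edge score in (0, 1)
        if score >= threshold:
            kept.append((u, v))
    return kept

# toy graph: 4 nodes with 2-d embeddings, 4 edges
emb = rng.normal(size=(4, 2))
edges = [(0, 1), (1, 2), (2, 3), (0, 3)]
w = rng.normal(size=4)  # weights of the linear edge scorer
sparse_edges = prune_edges(edges, emb, w)
print(len(sparse_edges), "of", len(edges), "edges kept")
```

In the actual papers the scorer is trained end-to-end with the downstream task, and PTDNet additionally penalizes the number of retained edges; this sketch only shows the scoring-and-thresholding step.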
“…Therefore, only a few works consider graph data augmentation. [60] note that a node classification task can be solved perfectly if edges exist only between same-class samples. They increase homophily by adding edges between nodes that a neural network predicts to be of the same class and removing edges between nodes predicted to be of different classes.…”
Section: F Dataset Generation and Experimental Details
Confidence: 99%
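The homophily-increasing edit described in this statement can be sketched as follows. The function name `edit_edges_for_homophily` and the confidence thresholds are hypothetical illustrations, not the cited authors' exact procedure:

```python
import numpy as np

def edit_edges_for_homophily(edges, class_probs, add_thresh=0.9, drop_thresh=0.9):
    """Drop edges whose endpoints are confidently predicted to be in
    different classes, and add edges between non-adjacent node pairs
    confidently predicted to share a class."""
    pred = class_probs.argmax(axis=1)   # predicted class per node
    conf = class_probs.max(axis=1)      # prediction confidence per node
    edge_set = {tuple(sorted(e)) for e in edges}

    # drop edges between confidently different-class endpoints
    kept = {e for e in edge_set
            if not (pred[e[0]] != pred[e[1]]
                    and conf[e[0]] > drop_thresh and conf[e[1]] > drop_thresh)}

    # add edges between confidently same-class, non-adjacent pairs
    n = len(pred)
    for u in range(n):
        for v in range(u + 1, n):
            if (u, v) not in edge_set and pred[u] == pred[v] \
                    and conf[u] > add_thresh and conf[v] > add_thresh:
                kept.add((u, v))
    return sorted(kept)

# toy example: 4 nodes, 2 classes
probs = np.array([[0.95, 0.05],   # node 0 -> class 0, confident
                  [0.92, 0.08],   # node 1 -> class 0, confident
                  [0.05, 0.95],   # node 2 -> class 1, confident
                  [0.50, 0.50]])  # node 3 -> uncertain
edges = [(0, 2), (1, 2), (2, 3)]
print(edit_edges_for_homophily(edges, probs))  # -> [(0, 1), (2, 3)]
```

On this toy input, the cross-class edges (0, 2) and (1, 2) are dropped, the same-class edge (0, 1) is added, and (2, 3) survives because node 3's prediction is not confident enough to justify removal.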
“…Robinson et al. [20] propose a way to select hard negative samples based on embedding-space distances, and use it to obtain high-quality graph embeddings. Many works [21], [22] also systematically study data augmentation on graphs.…”
Section: Introduction
Confidence: 99%
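The distance-based hard-negative idea mentioned in this statement can be sketched as below. This is a simplified hard-selection variant: it picks the negatives nearest the anchor in embedding space, whereas Robinson et al. actually reweight negatives inside the contrastive objective rather than selecting a fixed set. The function name `hard_negatives` is illustrative:

```python
import numpy as np

def hard_negatives(anchor, candidates, labels, anchor_label, k=2):
    """Return the indices of the k negatives (different-label candidates)
    closest to the anchor in Euclidean embedding distance."""
    neg_idx = np.flatnonzero(labels != anchor_label)          # candidate negatives
    dists = np.linalg.norm(candidates[neg_idx] - anchor, axis=1)
    order = neg_idx[np.argsort(dists)]                        # nearest first
    return order[:k]

emb = np.array([[0.0, 0.0],   # anchor (class 0)
                [0.1, 0.0],   # negative, very close -> hard
                [5.0, 5.0],   # negative, far -> easy
                [0.2, 0.1],   # same class, excluded
                [1.0, 0.0]])  # negative, moderately close
labels = np.array([0, 1, 1, 0, 2])
print(hard_negatives(emb[0], emb, labels, anchor_label=0, k=2))  # -> [1 4]
```

Nodes 1 and 4 are returned because they are the two nearest different-class embeddings; node 3 is skipped despite being close, since it shares the anchor's label.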