2020
DOI: 10.1609/aaai.v34i04.6048
Multi-Stage Self-Supervised Learning for Graph Convolutional Networks on Graphs with Few Labeled Nodes

Abstract: Graph Convolutional Networks (GCNs) play a crucial role in graph learning tasks, however, learning graph embedding with few supervised signals is still a difficult problem. In this paper, we propose a novel training algorithm for Graph Convolutional Network, called Multi-Stage Self-Supervised (M3S) Training Algorithm, combined with self-supervised learning approach, focusing on improving the generalization performance of GCNs on graphs with few labeled nodes. Firstly, a Multi-Stage Training Framework is provid…
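The multi-stage idea in the abstract can be illustrated with a minimal self-training loop: at each stage, fit a model on the current labeled set, then promote the most confident predictions on unlabeled nodes to pseudo-labels before the next stage. This is only a sketch under simplifying assumptions: a nearest-centroid classifier stands in for the GCN, and the paper's self-supervised (DeepCluster-style) alignment step is omitted; the function name and parameters (`n_stages`, `add_per_stage`) are hypothetical.

```python
import numpy as np

def multi_stage_self_training(X, y, labeled_idx, n_stages=3, add_per_stage=2):
    """Toy multi-stage self-training on node features X.

    Illustrative only: M3S uses a GCN plus a self-supervised clustering
    step; here a nearest-centroid classifier stands in for the model.
    """
    labels = {i: y[i] for i in labeled_idx}          # (pseudo-)labeled nodes
    classes = sorted(set(y[i] for i in labeled_idx))
    for _ in range(n_stages):
        # fit: one centroid per class from the current labeled set
        cents = {c: X[[i for i in labels if labels[i] == c]].mean(axis=0)
                 for c in classes}
        unlabeled = [i for i in range(len(X)) if i not in labels]
        if not unlabeled:
            break
        # score each unlabeled node by distance to its nearest centroid
        scored = []
        for i in unlabeled:
            d = {c: np.linalg.norm(X[i] - cents[c]) for c in classes}
            c_best = min(d, key=d.get)
            scored.append((d[c_best], i, c_best))
        # promote the most confident (closest) predictions to pseudo-labels
        for _, i, c in sorted(scored)[:add_per_stage]:
            labels[i] = c
    # final prediction for every node from the enlarged labeled set
    cents = {c: X[[i for i in labels if labels[i] == c]].mean(axis=0)
             for c in classes}
    return [min(classes, key=lambda c: np.linalg.norm(x - cents[c])) for x in X]
```

On a toy two-cluster dataset with one labeled node per class, the staged pseudo-labeling recovers both clusters; the same loop structure is what the multi-stage framework applies to a GCN.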

Cited by 188 publications (108 citation statements); references 19 publications.
“…6.1.1 Datasets. For a fair comparison, we adopt same benchmark datasets used by Sun et al [24] and Li et al [17], including Cora, Citeseer, Pubmed [22]. Each dataset contains a citation graph, where nodes represent articles/papers and edges denote citation correlation.…”
Section: Methods (mentioning; confidence: 99%)
“…For example, DCNN [2] combines graph convolutional operator with the diffusion process and Veličković et al proposes the graph attention network [28] with the self-attention mechanism on the neighbors of the node and assign different weights accordingly during the aggregation process. Of all these GCNNs, GNNs [15] are highly favorable by the computer science community [17,24]…”
Section: Graph Convolutional Neural Network (mentioning; confidence: 99%)
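The quoted passage describes the graph attention mechanism: each node scores its neighbors with a learned attention vector, softmax-normalizes the scores, and aggregates neighbor features with those weights. A minimal single-head sketch of that aggregation step, assuming a hypothetical attention vector `w_att` of length 2·d (the learned weight matrices and multi-head machinery of the actual GAT are omitted):

```python
import numpy as np

def attention_aggregate(h, adj, w_att):
    """Toy single-head attention aggregation over graph neighbors.

    h:     (n, d) node feature matrix
    adj:   (n, n) adjacency (nonzero = edge; include self-loops if desired)
    w_att: length-2d attention vector scoring the pair [h_i || h_j]
    """
    n, _ = h.shape
    out = np.zeros_like(h)
    for i in range(n):
        nbrs = [j for j in range(n) if adj[i, j]]
        # raw attention score for each neighbor: w_att . [h_i || h_j]
        e = np.array([w_att @ np.concatenate([h[i], h[j]]) for j in nbrs])
        alpha = np.exp(e - e.max())
        alpha /= alpha.sum()                     # softmax over neighbors
        out[i] = sum(a * h[j] for a, j in zip(alpha, nbrs))
    return out
```

With a zero attention vector all neighbors receive equal weight, so the aggregation reduces to a plain neighborhood mean; a learned `w_att` is what lets the network weight neighbors differently, as the citing authors note.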