2018
DOI: 10.48550/arXiv.1801.07606
Preprint

Deeper Insights into Graph Convolutional Networks for Semi-Supervised Learning

Abstract: Many interesting problems in machine learning are being revisited with new deep learning tools. For graph-based semi-supervised learning, a recent important development is graph convolutional networks (GCNs), which nicely integrate local vertex features and graph topology in the convolutional layers. Although the GCN model compares favorably with other state-of-the-art methods, its mechanisms are not clear and it still requires a considerable amount of labeled data for validation and model selection. In this pape…

Cited by 68 publications (74 citation statements)
References 2 publications (1 reference statement)
“…In other words, here in Section 3.1, we ignore the nonlinear activation and bias parameters. Such a setup is consistent with much existing literature, such as [30,33,34,60]. Proposition 3.1.…”
Section: Expressivity Analysis on shaDow-GCN: Graph Signal Processing… (supporting)
confidence: 90%
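The linearized setup this quote describes can be made concrete: dropping the activation and bias collapses a multi-layer GCN into a fixed graph filter followed by one linear map. A minimal sketch, assuming illustrative names (`propagate_linear`, `A_hat`) that are not from the cited papers:

```python
import numpy as np

def propagate_linear(A_hat, X, W_list):
    """Linearized GCN: nonlinear activation and bias omitted.

    With no nonlinearity, a K-layer GCN collapses to
    A_hat^K @ X @ (W_1 @ ... @ W_K): fixed graph filtering
    followed by a single linear transformation.
    """
    H = X
    for W in W_list:
        H = A_hat @ H @ W  # one "layer" without ReLU or bias
    return H

# Example: 4 nodes, 3 input features, two layers (3 -> 5 -> 2)
A_hat = np.eye(4)  # stand-in for a normalized adjacency matrix
X = np.random.randn(4, 3)
out = propagate_linear(A_hat, X, [np.random.randn(3, 5), np.random.randn(5, 2)])
print(out.shape)  # (4, 2)
```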
“…GCNs [22] suffer from “oversmoothing” [30]: each GCN layer smooths the features of the direct (i.e., 1-hop) neighbors, and many GCN layers smooth the features of the full graph. Eventually, such a repeated smoothing process propagates to any target node just the averaged feature of all of V. “Oversmoothing” thus incurs significant information loss by wiping out all local information.…”
Section: Expressivity Analysis on shaDow-GCN: Graph Signal Processing… (mentioning)
confidence: 99%
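The convergence this quote points to is easy to reproduce numerically: repeatedly applying the self-loop normalized adjacency drives every node's features toward a degree-scaled global average, erasing per-node information. A small NumPy sketch; the toy graph and variable names are my own, not from the citing paper:

```python
import numpy as np

# Toy graph: 5 nodes on a path, plus self-loops (the usual GCN trick)
A = np.zeros((5, 5))
for i in range(4):
    A[i, i + 1] = A[i + 1, i] = 1.0
A_tilde = A + np.eye(5)
d = A_tilde.sum(axis=1)
A_hat = A_tilde / np.sqrt(np.outer(d, d))  # D^{-1/2} (A+I) D^{-1/2}

X = np.random.randn(5, 3)  # random node features
H = X.copy()
for _ in range(100):       # 100 "layers" of pure smoothing
    H = A_hat @ H

# Rows of H become proportional to sqrt(degree): per-node feature
# differences vanish, and only a global degree-scaled signal remains.
print(H / np.sqrt(d)[:, None])  # rows are (nearly) identical
```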
“…First, we theoretically show that with standard degree normalization, low-degree neighbors surprisingly have more influence on a node's representation than higher-degree neighbors. Second, we discuss the results showing the dependency of GNNs on links within graph communities [10,14], which are ubiquitous in real-world graphs. Based on that, we argue that adversarial edges should link nodes with longer paths between them.…”
Section: GNN GNN GNN (mentioning)
confidence: 99%
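The first claim in this quote follows directly from the symmetric normalization used in GCNs: an edge (u, v) receives weight 1/sqrt(d_u * d_v), so among a node's neighbors, the low-degree ones carry larger weights. A minimal numerical illustration; the function name is hypothetical:

```python
import numpy as np

def neighbor_weights(deg_u, neighbor_degs):
    """Symmetric-normalization edge weights 1/sqrt(d_u * d_v)."""
    return 1.0 / np.sqrt(deg_u * np.asarray(neighbor_degs, dtype=float))

# A node of degree 4 with neighbors of degree 2, 4, 16, and 100:
print(neighbor_weights(4, [2, 4, 16, 100]))
# -> [0.354 0.25  0.125 0.05 ]; the degree-2 neighbor dominates
```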
“…Communities are densely connected subgraphs, and are very common in empirical networks, e.g., social networks and co-author networks. The GNN update rule (Equation 1) is a special form of Laplacian smoothing [14]. Crucially, GNNs assume that nodes within the same community tend to share the same label and have similar features [10,14].…”
Section: Impact of Degree and Distance (mentioning)
confidence: 99%
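The Laplacian-smoothing connection this quote cites is the central observation of the indexed paper: with smoothing strength gamma = 1 and symmetric normalization, the smoothing step y = x - gamma * L_sym x coincides with the GCN propagation A_hat x. A small numerical check, assuming a toy graph and variable names of my own:

```python
import numpy as np

# Laplacian smoothing with gamma = 1 on the self-loop graph equals
# the GCN propagation matrix: I - L_sym = D^{-1/2} (A+I) D^{-1/2}.
A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
A_tilde = A + np.eye(4)                     # add self-loops
D_inv_sqrt = np.diag(1.0 / np.sqrt(A_tilde.sum(axis=1)))
A_hat = D_inv_sqrt @ A_tilde @ D_inv_sqrt   # GCN propagation matrix
L_sym = np.eye(4) - A_hat                   # normalized graph Laplacian

X = np.random.randn(4, 2)
gcn_step = A_hat @ X                        # one GCN propagation
smoothing_step = X - 1.0 * (L_sym @ X)      # y = x - gamma * L x, gamma = 1
assert np.allclose(gcn_step, smoothing_step)
```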