2020
DOI: 10.1609/aaai.v34i03.5673

Going Deep: Graph Convolutional Ladder-Shape Networks

Abstract: Neighborhood aggregation algorithms like spectral graph convolutional networks (GCNs) formulate graph convolutions as a symmetric Laplacian smoothing operation to aggregate the feature information of one node with that of its neighbors. While they have achieved great success in semi-supervised node classification on graphs, current approaches suffer from the over-smoothing problem when the depth of the neural networks increases, which always leads to a noticeable degradation of performance. To solve this probl…
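The abstract is truncated above; as background for the symmetric Laplacian smoothing it refers to, the following is a minimal sketch of a single GCN layer under the standard propagation rule H' = ReLU(D^-1/2 (A + I) D^-1/2 H W). The toy graph, feature dimensions, and weight shapes are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def gcn_layer(A, H, W):
    """One GCN layer as symmetric Laplacian smoothing:
    H' = ReLU(D^{-1/2} (A + I) D^{-1/2} H W)."""
    A_hat = A + np.eye(A.shape[0])            # add self-loops
    d_inv_sqrt = np.diag(1.0 / np.sqrt(A_hat.sum(axis=1)))  # D^{-1/2}
    A_norm = d_inv_sqrt @ A_hat @ d_inv_sqrt  # symmetric normalization
    return np.maximum(A_norm @ H @ W, 0.0)    # smooth, transform, ReLU

# Toy example: 4-node path graph, 3-d features, 2-d output
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
H = np.random.rand(4, 3)
W = np.random.rand(3, 2)
print(gcn_layer(A, H, W).shape)  # (4, 2)
```

Stacking many such layers applies this smoothing repeatedly, driving neighboring node features toward each other, which is the over-smoothing effect the paper targets.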

Cited by 19 publications (5 citation statements)
References 22 publications (18 reference statements)
“…However, these models lead to an over-smoothing problem in community detection. To reduce this negative impact, Graph Convolutional Ladder-shape Networks (GCLN) [75] was designed as a new GCN architecture for unsupervised community detection (k-means), based on the U-Net architecture from the CNN field. A contracting path and an expanding path are built symmetrically in GCLN.…”
Section: B. GCN-based Community Detection
Mentioning, confidence: 99%
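The statement above describes GCLN as a U-Net-style ladder: a contracting path of GCN layers, a mirrored expanding path, and symmetric connections between the two. The sketch below is only a rough illustration of that shape under assumed details (additive fusion of the skipped features, mirrored layer widths, and the same toy gcn_layer repeated here so the snippet runs on its own); it is not the authors' implementation.

```python
import numpy as np

def gcn_layer(A, H, W):
    """Single GCN layer: symmetric Laplacian smoothing, linear map, ReLU."""
    A_hat = A + np.eye(A.shape[0])
    d_inv_sqrt = np.diag(1.0 / np.sqrt(A_hat.sum(axis=1)))
    return np.maximum(d_inv_sqrt @ A_hat @ d_inv_sqrt @ H @ W, 0.0)

def ladder_gcn(A, H, Ws_down, Ws_up):
    """U-Net-style ladder of GCN layers: a contracting path, an expanding path,
    and symmetric skip connections (fused here by addition, an assumption)."""
    skips = []
    for W in Ws_down:                         # contracting path
        skips.append(H)                       # remember the input to each down layer
        H = gcn_layer(A, H, W)
    for W, S in zip(Ws_up, reversed(skips)):  # expanding path, mirrored widths
        H = gcn_layer(A, H, W) + S            # fuse the symmetric skip connection
    return H

# Toy usage: widths 3 -> 8 -> 16 on the way down, 16 -> 8 -> 3 on the way up
rng = np.random.default_rng(0)
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
H = rng.random((4, 3))
Ws_down = [rng.random((3, 8)), rng.random((8, 16))]
Ws_up = [rng.random((16, 8)), rng.random((8, 3))]
Z = ladder_gcn(A, H, Ws_down, Ws_up)  # node embeddings, e.g. input to k-means
print(Z.shape)                        # (4, 3)
```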
“…The proposed strategies try to keep the number of layers relatively small (up to three; some examples are [48,49,57,60]). The ladder-shaped architecture proposed in [65] is another idea to overcome this issue.…”
Section: Performance Degrades as Graph Convolutional Layers Are Added
Mentioning, confidence: 99%
“…To realize convolution on graphs, there are two types of GCNs. Spectral-based GCNs [7,13,23,20] build on the convolution theorem and perform convolution by transforming graph signals into the spectral domain via the graph Fourier transform. Spatial-based GCNs [18,19,39,31,40,44] update node representations by aggregating messages from their neighbor nodes, much like applying a convolutional kernel to a local image patch.…”
Section: Related Work
Mentioning, confidence: 99%
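To make the spectral/spatial contrast in the statement above concrete, here is a small, hedged sketch: the spectral view filters a node-feature signal in the graph Fourier basis (the eigenvectors of the normalized Laplacian), while the spatial view simply averages the features of one-hop neighbors. The graph, features, and the exponential low-pass filter are toy assumptions for illustration, not constructions from any of the cited papers.

```python
import numpy as np

# Toy graph: 4-node path graph with 2-d node features
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
X = np.random.default_rng(0).random((4, 2))

# Spectral view: filter the signal in the graph Fourier domain.
deg = A.sum(axis=1)
D_inv_sqrt = np.diag(1.0 / np.sqrt(deg))
L = np.eye(4) - D_inv_sqrt @ A @ D_inv_sqrt  # symmetric normalized Laplacian
eigvals, U = np.linalg.eigh(L)               # columns of U form the graph Fourier basis
X_hat = U.T @ X                              # graph Fourier transform of the signal
g = np.exp(-eigvals)                         # an illustrative low-pass spectral filter
X_spectral = U @ (g[:, None] * X_hat)        # filter, then inverse transform

# Spatial view: aggregate messages from one-hop neighbors.
X_spatial = (A @ X) / deg[:, None]           # mean of neighbor features per node

print(X_spectral.shape, X_spatial.shape)     # (4, 2) (4, 2)
```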