2022
DOI: 10.1007/s10489-022-04222-8

Multi-constraints in deep graph convolutional networks with initial residual

Cited by 11 publications (18 citation statements)
References 27 publications
“…APPNP [31], JK [32], Geom‐GCN [16], SimP‐GCN [45], and CPGNN [18] aim to improve the feature propagation scheme within layers of the model. More recently, researchers have proposed making GNN models deeper [27, 29, 30]. However, deeper models suffer from over‐smoothing: after stacking many GNN layers, node features become indistinguishable from one another and model performance drops.…”
Section: Related Work
confidence: 99%
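The over‐smoothing behaviour this statement describes is easy to reproduce numerically: with random‐walk normalization, repeated propagation P^k H drives every node toward the same feature vector. A minimal sketch, assuming a toy 4‐node path graph and random features (both are illustrative assumptions, not taken from the cited papers):

```python
import numpy as np

# Toy undirected path graph on 4 nodes, with self-loops added (A + I).
# The graph and the features below are illustrative assumptions only.
A = np.array([[1, 1, 0, 0],
              [1, 1, 1, 0],
              [0, 1, 1, 1],
              [0, 0, 1, 1]], dtype=float)

# Random-walk normalization: P = D^{-1} A (each row sums to 1).
P = A / A.sum(axis=1, keepdims=True)

H = np.random.default_rng(0).normal(size=(4, 2))  # initial node features

for k in (1, 2, 8, 32):
    Hk = np.linalg.matrix_power(P, k) @ H
    # Per-feature spread across nodes shrinks toward 0 as depth grows:
    # after many propagation steps all node features coincide.
    print(f"k={k:2d}  spread={np.ptp(Hk, axis=0)}")
```

The shrinking spread is exactly the failure mode the citing papers refer to: stacking plain propagation layers collapses the node representations.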
“…DropEdge [46] proposes dropping a certain number of edges to slow the convergence of over‐smoothing and relieve information loss. GCNII [27] uses residual connections and identity mapping in GNN layers to enable deeper networks. RevGNN [29] uses deep reversible architectures, and [30] uses noise regularisation to train deep GNN models.…”
Section: Related Work
confidence: 99%
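For context, the GCNII layer this statement refers to combines two mechanisms: an initial residual, which mixes the input representation H^(0) back in at every layer, and an identity mapping, which shrinks each layer's weight matrix toward the identity. Chen et al. give the update H^(l+1) = sigma(((1 - alpha) P_hat H^(l) + alpha H^(0)) ((1 - beta_l) I + beta_l W^(l))) with beta_l = ln(lambda / l + 1). A minimal numpy sketch; the function name and the hyperparameter values in the comments are assumptions for illustration:

```python
import numpy as np

def gcnii_layer(H, H0, P_hat, W, alpha, beta):
    """One GCNII-style layer (sketch of the update in Chen et al., 2020).

    H      -- current node representations, shape (n, d)
    H0     -- initial representations H^(0), for the initial residual
    P_hat  -- normalized adjacency matrix with self-loops, shape (n, n)
    alpha  -- initial-residual strength (the paper typically uses ~0.1)
    beta   -- identity-mapping strength, beta_l = ln(lambda / l + 1)
    """
    support = (1 - alpha) * (P_hat @ H) + alpha * H0              # initial residual
    out = support @ ((1 - beta) * np.eye(W.shape[0]) + beta * W)  # identity mapping
    return np.maximum(out, 0.0)                                   # ReLU
```

Because beta_l decays with depth l, deep layers stay close to an identity transformation, which is what lets GCNII stack many layers without the feature collapse illustrated above.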