2021
DOI: 10.1007/s10489-021-02617-7
Simplified multilayer graph convolutional networks with dropout

Cited by 17 publications (6 citation statements)
References 31 publications
“…Yang et al. (2021b) employed a simplified multilayer GCN in which redundant computation was removed by dropping the nonlinearities and merging the weight matrices between graph convolutional layers. The method matched the running speed of simple graph convolution (SGC) and outperformed both GCN and SGC on five downstream tasks.…”
Section: Results Analysis Per Datasets
confidence: 99%
“…Yang et al. (2021b) explored multilayer GCNs to address their complexity, redundant calculation, and overfitting problems. A simplified multilayer GCN with dropout, which extends shallow GCNs, was applied to scientific texts.…”
Section: Semi-supervised Learning for Text Classification
confidence: 99%
“…During each step of model training, this layer randomly sets input units to zero at a set frequency (the dropout rate). Inputs that are not set to zero are scaled up by 1/(1 − rate), keeping the expected sum of all inputs unchanged [14]. In our CNN architecture, we used two dropout layers with a dropout rate of 25%.…”
Section: Dropout Layer
confidence: 99%
“…With the dropout layer, the aim is for individual neurons to carry less information about one another. Thanks to the dropout layer, neurons are less affected by each other's weight changes [19].…”
Section: CNN Architecture Layers and LSTM Network
confidence: 99%