2020 · Preprint
DOI: 10.48550/arxiv.2009.02027
Rethinking Graph Regularization for Graph Neural Networks

Abstract: The graph Laplacian regularization term is commonly used in semi-supervised node classification to provide graph structure information to a model f(X). However, with the recent popularity of graph neural networks (GNNs), directly encoding the graph structure A into the model, i.e., f(A, X), has become the more common approach. While we show that graph Laplacian regularization f(X)^⊤ Δ f(X) brings little-to-no benefit to existing GNNs, we propose a simple but non-trivial variant of graph Laplacian regularization, called P-reg […]
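To make the regularizer in the abstract concrete, here is a minimal PyTorch sketch (the framework choice and names such as f_out and edge_index are illustrative assumptions, not from the paper). It uses the identity tr(F^⊤ Δ F) = Σ_{(i,j)∈E} ‖F[i] − F[j]‖² for the graph Laplacian Δ of an unweighted graph.

```python
import torch

def laplacian_reg(f_out: torch.Tensor, edge_index: torch.Tensor) -> torch.Tensor:
    """Graph Laplacian regularizer tr(f(X)^T Δ f(X)).

    `f_out` is the n x d model output; `edge_index` is a 2 x |E|
    edge list with each undirected edge listed once.
    """
    src, dst = edge_index
    diff = f_out[src] - f_out[dst]   # difference across each edge
    return (diff ** 2).sum()         # sum of squared edge differences
```

If the edge list stores both directions of each undirected edge, this double-counts every edge and should be scaled by 0.5.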

Cited by 6 publications (8 citation statements) · References 23 publications
Citation types: 0 supporting, 8 mentioning, 0 contrasting
“…It softens the one-hot hard targets y_c = 1, y_i = 0 ∀ i ≠ c into y_i^LS = (1 − α) y_i + α/C, where c is the correct label and C is the number of classes. P-reg [44] is proposed as a variant of global graph Laplacian regularization to improve GNNs with label information. It is defined as…”
Section: Results (mentioning, confidence: 99%)
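As a concrete illustration of the label smoothing formula quoted above, here is a short Python sketch (the helper name smooth_labels is hypothetical, not from the cited work):

```python
import torch
import torch.nn.functional as F

def smooth_labels(targets: torch.Tensor, num_classes: int, alpha: float) -> torch.Tensor:
    """Soften one-hot targets: y_i^LS = (1 - alpha) * y_i + alpha / C."""
    one_hot = F.one_hot(targets, num_classes).float()
    return (1.0 - alpha) * one_hot + alpha / num_classes

# e.g. smooth_labels(torch.tensor([2]), num_classes=5, alpha=0.1)
# -> [[0.02, 0.02, 0.92, 0.02, 0.02]]
```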
“…Graph Laplacian regularization [2, 51] helps to learn the node embeddings given the original graph. Han et al. [44] propose…”
Section: Learning Graph Structure and Spreading Labels (mentioning, confidence: 99%)
“…Jiang et al. [94] explore graph construction based on GCN. Yang et al. [95] combine classic graph regularization methods with GCN. Abu et al. [96] present N-GCN, which marries random walks with GCN; a follow-up work with similar ideas, GIL [97], is proposed as well.…”
Section: Generalized Aggregation Operation (mentioning, confidence: 99%)
“…Some have suggested constraining the parameter space of a GNN so that the trained GNN becomes a non-expansive map, thus producing the fixed point [17, 36]. Others have proposed applying an additional GNN layer to the embedded graph and penalizing the difference between that layer's output and the embedded graph, guiding the GNN to find the fixed points [41]. It has been shown that regularizing a GNN to find its fixed points improves its predictive performance [36, 41].…”
Section: Related Work (mentioning, confidence: 99%)
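The propagation-based penalty described in this last statement (an extra propagation layer whose output is compared against the embedding itself) can be sketched as below; the row-normalized adjacency and squared-error distance are assumptions for illustration, not necessarily the exact formulation of the cited work.

```python
import torch

def propagation_penalty(z: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
    """Penalize the gap between node embeddings Z and one extra
    propagation step A_hat @ Z, nudging Z toward a fixed point of
    the propagation: ||A_hat Z - Z||^2 averaged over nodes."""
    deg = adj.sum(dim=1, keepdim=True).clamp(min=1.0)  # node degrees
    a_hat = adj / deg                                   # row-normalized adjacency
    z_prop = a_hat @ z                                  # one extra propagation step
    return ((z_prop - z) ** 2).sum() / z.size(0)
```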