Proceedings of the Twenty-Eighth International Joint Conference on Artificial Intelligence 2019
DOI: 10.24963/ijcai.2019/267
Graph Convolutional Networks using Heat Kernel for Semi-supervised Learning

Abstract: Graph convolutional networks have gained remarkable success in semi-supervised learning on graph-structured data. The key to graph-based semi-supervised learning is capturing the smoothness of labels or features over nodes exerted by the graph structure. Previous methods, both spectral and spatial, are devoted to defining graph convolution as a weighted average over neighboring nodes, and then learn graph convolution kernels to leverage the smoothness to improve the performance of graph-based semi-supervised learnin…


Cited by 104 publications (84 citation statements)
References 2 publications
“…k is tuned over {1, 2, 5, 10} through cross-validation, and the same k value is adopted across all experiments on the same dataset. Although GCN has been widely utilized in unsupervised [59][60][61][62] and semi-supervised 58,[63][64][65] learning, in this paper, we further extend the use of GCN for supervised classification tasks. For training data $X_{tr} \in \mathbb{R}^{n_{tr} \times d}$, the corresponding adjacency matrix…”
Section: GCN For Omic-specific Learning
confidence: 99%
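The quoted snippet applies GCN propagation to supervised classification over training features. As a minimal sketch (not the cited paper's actual code; the toy graph and variable names are illustrative), the symmetrically normalized adjacency used in a standard GCN layer can be built as:

```python
import numpy as np

def normalized_adjacency(A):
    # A_hat = D^{-1/2} (A + I) D^{-1/2}: add self-loops, then apply
    # symmetric degree normalization -- the propagation matrix of a
    # standard GCN layer.
    A_tilde = A + np.eye(A.shape[0])
    d = A_tilde.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    return D_inv_sqrt @ A_tilde @ D_inv_sqrt

# Toy 3-node path graph; X_tr stands in for the n_tr x d training features.
A = np.array([[0., 1., 0.],
              [1., 0., 1.],
              [0., 1., 0.]])
X_tr = np.arange(12.0).reshape(3, 4)
H = normalized_adjacency(A) @ X_tr  # one neighborhood-averaging step
```

Each propagation step smooths features over one-hop neighborhoods, which is the smoothness assumption the abstract refers to.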
“…We propose a novel graph convolution model based on heat kernel to enhance low-frequency signals and suppress high-frequency signals, which can capture the smoothness of node features or labels sufficiently [13]. The heat kernel is defined as…”
Section: Graph Convolution Using Heat Kernel
confidence: 99%
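The quoted definition is truncated. As a hedged sketch, the graph heat kernel is commonly written $K = e^{-sL}$ for the normalized Laplacian $L$ with scaling parameter $s$, and can be computed via eigendecomposition:

```python
import numpy as np

def heat_kernel(A, s=1.0):
    # Normalized Laplacian L = I - D^{-1/2} A D^{-1/2}; its eigenvalues
    # (the graph frequencies) lie in [0, 2].
    d = A.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    L = np.eye(A.shape[0]) - D_inv_sqrt @ A @ D_inv_sqrt
    lam, U = np.linalg.eigh(L)
    # Heat kernel K = U exp(-s * Lambda) U^T: low frequencies (small
    # lambda) pass through, high frequencies are damped exponentially.
    return U @ np.diag(np.exp(-s * lam)) @ U.T
```

Larger $s$ widens the diffusion range, which matches the low-pass behavior described in the snippet: low-frequency (smooth) signals are enhanced while high-frequency ones are suppressed.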
“…Figure 1, produced with the Graph Signal Processing toolbox [14], illustrates that the range of heat diffusion, controlled by the scaling parameter s, becomes larger as s increases. Different from the graph convolution methods in Section 2.2, which constrain neighboring nodes via the shortest-path distance, graph convolution based on the heat kernel determines the neighborhood in a continuous manner, which can sufficiently utilize high-order neighbor nodes and discard some irrelevant low-order neighbor nodes [13]. Compared with the linear frequency response function $g(\lambda_q) = 1 - \frac{1}{2}\lambda_q$ proposed in AGC [7], we leverage the exponential frequency function $g(\lambda_q) = 1 + e^{-s\lambda_q}$ based on the heat kernel, which highlights low-frequency signals exponentially to make the graph smoother.…”
Section: Graph Convolution Using Heat Kernel
confidence: 99%
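The two frequency responses quoted above can be compared numerically. This sketch simply evaluates both filters over the normalized-Laplacian spectrum $[0, 2]$ (the variable names and the choice $s = 1$ are illustrative):

```python
import numpy as np

lam = np.linspace(0.0, 2.0, 101)   # graph frequencies of a normalized Laplacian
g_linear = 1.0 - 0.5 * lam         # AGC's linear low-pass response
s = 1.0
g_heat = 1.0 + np.exp(-s * lam)    # exponential response from the snippet

# Both responses decrease with frequency, but the exponential one stays
# strictly positive and amplifies low frequencies (g > 1 near lambda = 0),
# while the linear one vanishes at lambda = 2.
```

This illustrates the snippet's claim: the exponential filter boosts low-frequency (smooth) components rather than merely attenuating high-frequency ones linearly.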
“…(ii) DeepM4L not only guarantees the label consistency between the bag and instance levels; it also pursues the label consistency of bags across views by fusing multiview bag-label score tensors. (iii) Experimental results on benchmark datasets show that DeepM4L outperforms the representative M3L solutions (M3Lcmf [3], M2IL [20] and ICM2L [21]), data fusion solutions based on matrix factorization (MFDF [8], SelDFMF [10] and M4L-JMF [15]), and network embedding methods (metapath2vec [22] and GraphHeat [23]) for diverse M4L tasks.…”
Section: Introduction
confidence: 99%