2021
DOI: 10.48550/arxiv.2105.09111
Preprint

Self-supervised Heterogeneous Graph Neural Network with Co-contrastive Learning

Abstract: Heterogeneous graph neural networks (HGNNs), as an emerging technique, have shown a superior capacity for dealing with heterogeneous information networks (HINs). However, most HGNNs follow a semi-supervised learning paradigm, which notably limits their use in practice since labels are usually scarce in real applications. Recently, contrastive learning, a self-supervised method, has become one of the most exciting learning paradigms and shows great potential when no labels are available. In this paper, we study the problem…

Cited by 9 publications (19 citation statements)
References 29 publications (31 reference statements)
“…HeCo [32]: A self-supervised heterogeneous graph neural network. The node representation of the heterogeneous network is learned from two perspectives, namely the network architecture perspective and the metapath perspective, which fully capture the information in the heterogeneous network.…”
Section: Methods
confidence: 99%
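To make the two-view description above concrete, here is a minimal, hypothetical sketch of producing one embedding per target node from each view: a schema-style view that aggregates typed neighbors and a meta-path view that propagates over meta-path adjacencies. The class name, the mean aggregations, and the shared projection are illustrative assumptions, not HeCo's actual architecture (which, for instance, uses attention to weight neighbors and meta-paths).

```python
import torch
import torch.nn as nn

class TwoViewEncoder(nn.Module):
    # Illustrative sketch (not HeCo's released code): one embedding per
    # target node from the schema-style view, one from the meta-path view.
    def __init__(self, in_dim, hid_dim):
        super().__init__()
        self.proj = nn.Linear(in_dim, hid_dim)

    def schema_view(self, x, neighbor_feats):
        # neighbor_feats: list of (N, in_dim) tensors, one per neighbor type,
        # e.g. author/subject features already mean-pooled per paper (assumed).
        h = self.proj(x)
        for nf in neighbor_feats:
            h = h + self.proj(nf)
        return torch.relu(h / (1 + len(neighbor_feats)))

    def metapath_view(self, x, metapath_adjs):
        # metapath_adjs: list of (N, N) row-normalized meta-path adjacencies
        # (e.g. PAP, PSP); average the propagated embeddings across paths.
        hs = [torch.relu(a @ self.proj(x)) for a in metapath_adjs]
        return torch.stack(hs).mean(dim=0)
```

A contrastive objective between the two resulting embeddings (see the InfoNCE sketch at the end of this section) then pulls the two views of the same node together.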
“…For example, in citation networks like ACM, there are three types of nodes (author, paper, and subject) and two types of edges (representing author-write-paper and paper-belong-subject relations). To cluster nodes of a specific type (e.g., papers), it is common practice to construct various meta-paths between two papers, such as paper-author-paper (describing two papers written by the same author) and paper-subject-paper (meaning that two papers belong to the same subject) [57,70]. Therefore, heterogeneous graphs can be transformed into multi-relational graphs having homogeneous nodes (papers) and different types of edges (meta-paths).…”
Section: Four Types of Graph Structures
confidence: 99%
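The meta-path construction described above boils down to products of incidence matrices: two papers are PAP-connected if they share an author, and PSP-connected if they share a subject. The toy matrices below are made-up ACM-style data; only the recipe itself comes from the text.

```python
import numpy as np

# Hypothetical toy ACM-style graph: 4 papers, 3 authors, 2 subjects.
# A_pa[i, j] = 1 iff author j wrote paper i;
# A_ps[i, k] = 1 iff paper i belongs to subject k.
A_pa = np.array([[1, 0, 0],
                 [1, 1, 0],
                 [0, 1, 0],
                 [0, 0, 1]])
A_ps = np.array([[1, 0],
                 [1, 0],
                 [0, 1],
                 [0, 1]])

# Two papers are linked along a meta-path iff they share at least one
# intermediate node, i.e. the corresponding matrix product is nonzero.
PAP = (A_pa @ A_pa.T > 0).astype(int)  # paper-author-paper
np.fill_diagonal(PAP, 0)               # drop trivial self-loops

PSP = (A_ps @ A_ps.T > 0).astype(int)  # paper-subject-paper
np.fill_diagonal(PSP, 0)

print("PAP:\n", PAP)
print("PSP:\n", PSP)
```

Row-normalizing these matrices yields the propagation operators assumed by the meta-path view sketch above.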
“…Contrastive Learning methods [34,44,49] learn node representations by contrasting positive pairs against negative pairs. DGI [34] first adopts Infomax [23] in graph representation learning, and focuses on contrasting the local node embeddings with global graph embeddings.…”
Section: Contrastive Learning
confidence: 99%
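As a hedged illustration of DGI's local-global contrast: shuffle node features to build corrupted negatives, pool the real node embeddings into a global summary vector, and train a bilinear discriminator to score local-global pairs. The stand-in encoder, dense row-normalized adjacency, and mean readout are simplifying choices for the sketch, not necessarily DGI's exact configuration.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class GCNLayer(nn.Module):
    # Minimal stand-in encoder; `adj` is assumed dense and row-normalized.
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.lin = nn.Linear(in_dim, out_dim)

    def forward(self, x, adj):
        return torch.relu(adj @ self.lin(x))

class DGIHead(nn.Module):
    # Sketch of the Infomax objective: contrast local node embeddings
    # against a global summary (readout) of the uncorrupted graph.
    def __init__(self, encoder, dim):
        super().__init__()
        self.encoder = encoder
        self.w = nn.Parameter(torch.empty(dim, dim))  # bilinear discriminator
        nn.init.xavier_uniform_(self.w)

    def forward(self, x, adj):
        h_pos = self.encoder(x, adj)                              # real nodes
        h_neg = self.encoder(x[torch.randperm(x.size(0))], adj)   # corrupted
        s = torch.sigmoid(h_pos.mean(dim=0))                      # global summary
        logits = torch.cat([h_pos @ self.w @ s, h_neg @ self.w @ s])
        labels = torch.cat([torch.ones(x.size(0)), torch.zeros(x.size(0))])
        return F.binary_cross_entropy_with_logits(logits, labels)
```

After training, the encoder's node embeddings are reused for downstream tasks such as node classification.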
“…Similarly, MVGRL [12] learns node- and graph-level representations from two structural graph views, first-order neighbors and a graph diffusion, and contrasts encoded embeddings between the two views. More recently, HeCo [44] proposes to learn node representations from a network schema view and a meta-path view, and performs contrastive learning between them. In the traditional collaborative filtering (CF) based recommendation domain, SGL [49] conducts contrastive learning between the original graph and a corrupted graph of user-item interactions.…”
Section: Contrastive Learning
confidence: 99%
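The common pattern across these methods is contrasting two views of the same node. The sketch below pairs an SGL-style edge-dropout augmentation with an InfoNCE loss in which node i's embeddings under the two views form the positive pair and all other nodes serve as negatives; the function names, dropout rate, and temperature are illustrative assumptions.

```python
import torch
import torch.nn.functional as F

def edge_dropout(edge_index, p=0.2):
    # SGL-style augmentation sketch: drop a random fraction p of edges
    # (edge_index is a (2, E) tensor) to form a corrupted graph view.
    keep = torch.rand(edge_index.size(1)) >= p
    return edge_index[:, keep]

def info_nce(z1, z2, tau=0.5):
    # Cross-view InfoNCE: row i of the (N, N) similarity matrix should
    # put its largest mass on column i, the same node in the other view.
    z1, z2 = F.normalize(z1, dim=1), F.normalize(z2, dim=1)
    sim = z1 @ z2.t() / tau
    return F.cross_entropy(sim, torch.arange(z1.size(0)))
```

In a HeCo-style setup, z1 and z2 would instead come from the network schema and meta-path encoders rather than from two corrupted graphs.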