2022 IEEE International Conference on Data Mining (ICDM)
DOI: 10.1109/icdm54844.2022.00090

Unifying Graph Contrastive Learning with Flexible Contextual Scopes

Abstract: Graph contrastive learning (GCL) has recently emerged as an effective learning paradigm to alleviate the reliance on labelling information for graph representation learning. The core of GCL is to maximise the mutual information between the representation of a node and its contextual representation (i.e., the corresponding instance with similar semantic information) summarised from the contextual scope (e.g., the whole graph or 1-hop neighbourhood). This scheme distils valuable self-supervision signals for GCL t…
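The objective the abstract describes, maximising mutual information between a node's representation and a contextual summary drawn from some scope, is commonly instantiated as an InfoNCE-style loss. The PyTorch sketch below illustrates only that general recipe; the function name, the noisy-copy context, and the temperature value are illustrative assumptions, not the paper's actual method.

```python
# Minimal sketch (assumed InfoNCE form, not the paper's exact objective):
# maximise agreement between each node's representation and a contextual
# summary drawn from some scope (e.g., k-hop neighbourhood or whole graph).
import torch
import torch.nn.functional as F

def contextual_infonce(node_emb: torch.Tensor,
                       context_emb: torch.Tensor,
                       temperature: float = 0.5) -> torch.Tensor:
    """node_emb, context_emb: [num_nodes, dim]; row i of context_emb is
    the contextual representation paired with node i."""
    z1 = F.normalize(node_emb, dim=-1)
    z2 = F.normalize(context_emb, dim=-1)
    logits = z1 @ z2.t() / temperature   # similarity of every (node, context) pair
    labels = torch.arange(z1.size(0))    # positives lie on the diagonal
    return F.cross_entropy(logits, labels)

# Toy usage: a noisy copy stands in for a real contextual summary.
node_emb = torch.randn(8, 16, requires_grad=True)
context_emb = node_emb.detach() + 0.1 * torch.randn(8, 16)
loss = contextual_infonce(node_emb, context_emb)
loss.backward()
```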

Cited by 10 publications (15 citation statements). References 30 publications.
“…Baselines. We compare GraphMAE2 with state-of-the-art self-supervised graph learning methods, including contrastive methods GRACE [57], BGRL [39], CCA-SSG [54], and GGD [55], as well as a generative method, GraphMAE [18]. Other methods are not compared because they are not scalable to large graphs, e.g., MVGRL [14], or their source code has not been released, e.g., InfoGCL [48].…”
Section: Evaluating on Large-Scale Datasets
mentioning confidence: 99%
“…Contrastive methods. Contrastive learning is an important way to learn representations in a self-supervised manner and has been applied successfully in graph learning [14, 29, 35, 37, 42, 51, 55]. DGI [42] and InfoGraph [37] adopt local-global mutual information maximization to learn node-level and graph-level representations.…”
Section: Graph Self-Supervised Learning
mentioning confidence: 99%
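The local-global mutual information maximisation attributed to DGI [42] and InfoGraph [37] in the statement above is usually realised with a bilinear discriminator that scores (node embedding, graph summary) pairs against corrupted negatives. The following is a minimal PyTorch illustration of that general recipe only; the placeholder linear encoder and the shuffling-based corruption are simplifying assumptions, not the cited models.

```python
# Compact sketch of a DGI-style local-global objective: real node
# embeddings are positives, embeddings of shuffled features are
# negatives, and both are scored against a mean-readout graph summary.
import torch
import torch.nn as nn

class LocalGlobalDiscriminator(nn.Module):
    def __init__(self, dim: int):
        super().__init__()
        self.bilinear = nn.Bilinear(dim, dim, 1)

    def forward(self, h: torch.Tensor, s: torch.Tensor) -> torch.Tensor:
        # h: [n, dim] node embeddings; s: [dim] graph summary vector
        return self.bilinear(h, s.expand_as(h)).squeeze(-1)

n, dim = 8, 16
encoder = nn.Linear(dim, dim)             # stand-in for a GNN encoder
x = torch.randn(n, dim)
h_pos = encoder(x)                        # embeddings of the real graph
h_neg = encoder(x[torch.randperm(n)])     # corruption: shuffled features
summary = torch.sigmoid(h_pos.mean(dim=0))  # readout over all nodes

disc = LocalGlobalDiscriminator(dim)
logits = torch.cat([disc(h_pos, summary), disc(h_neg, summary)])
labels = torch.cat([torch.ones(n), torch.zeros(n)])
loss = nn.functional.binary_cross_entropy_with_logits(logits, labels)
loss.backward()
```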
“…f_N is the initialised node representation and consists of an entity embedding and a level encoding. Previous works initialised representations randomly, which seriously limits effectiveness [49]. Moreover, some additional nodes appear only a few times during training, making it hard to learn good embeddings for them.…”
Section: Dynamic Graph
mentioning confidence: 99%
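As an illustration of the composition the quoted statement describes (an initial node representation built from an entity embedding plus a level encoding, with pretrained rather than random entity vectors), here is a hypothetical PyTorch sketch; the class, parameter names, and sizes are invented for this example.

```python
# Hypothetical sketch: f_N = entity embedding + level encoding, with an
# option to copy pretrained entity vectors instead of random init.
from typing import Optional
import torch
import torch.nn as nn

class NodeInit(nn.Module):
    def __init__(self, num_entities: int, num_levels: int, dim: int,
                 pretrained: Optional[torch.Tensor] = None):
        super().__init__()
        self.entity = nn.Embedding(num_entities, dim)
        if pretrained is not None:
            # Use pretrained entity vectors, addressing the limitation
            # of random initialisation the quoted statement raises.
            with torch.no_grad():
                self.entity.weight.copy_(pretrained)
        self.level = nn.Embedding(num_levels, dim)

    def forward(self, entity_ids: torch.Tensor,
                level_ids: torch.Tensor) -> torch.Tensor:
        return self.entity(entity_ids) + self.level(level_ids)

# Usage with hypothetical sizes.
init = NodeInit(num_entities=100, num_levels=4, dim=16)
f_N = init(torch.tensor([3, 7]), torch.tensor([0, 1]))  # shape [2, 16]
```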
“…The improvement comes from two aspects. First, it provides well-pretrained parameters; second, its tokenizer encodes each token (medical terminology) with a well-pretrained embedding, which prevents the Graph Transformer from propagating similar hidden states [49].…”
Section: Choice of Parameter Initialization
mentioning confidence: 99%