2020 IEEE International Conference on Data Mining (ICDM)
DOI: 10.1109/icdm50108.2020.00031
Sub-Graph Contrast for Scalable Self-Supervised Graph Representation Learning

Cited by 91 publications (56 citation statements). References 12 publications.
“…To this end, we propose to augment those short sequences with pseudo-prior items. Intuitively, we append a fabricated sub-sequence of items at the beginning of short sequences, which provides additional context [3,29]. To generate these pseudo-prior items, we pre-train a transformer from a reverse (i.e., right-to-left) direction of the original sequence to predict the prior item.…”
(mentioning; confidence: 99%)
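The excerpt above describes fabricating "prior" items for short sequences with a transformer pre-trained on reversed (right-to-left) sequences. Below is a minimal Python sketch of that augmentation step only, assuming a pre-trained reverse model is already available; `reverse_model`, `min_len`, and `num_pseudo` are hypothetical names and values, not from the cited paper:

```python
from typing import Callable, List

def augment_with_pseudo_priors(
    seq: List[int],
    reverse_model: Callable[[List[int]], int],  # assumed pre-trained reverse transformer
    min_len: int = 10,
    num_pseudo: int = 3,
) -> List[int]:
    """Prepend fabricated 'prior' items to a short interaction sequence.

    The reverse model was (by assumption) pre-trained on right-to-left
    sequences, so its next-item prediction on the reversed sequence is
    the item that would have come *before* the original sequence.
    """
    if len(seq) >= min_len:
        return seq  # only short sequences need augmentation

    reversed_seq = seq[::-1]  # right-to-left view, as used in pre-training
    for _ in range(num_pseudo):
        reversed_seq.append(reverse_model(reversed_seq))  # predict the preceding item

    return reversed_seq[::-1]  # undo the reversal: pseudo items land at the front

# Toy usage with a stub model that always predicts item 0.
print(augment_with_pseudo_priors([5, 7, 9], lambda prefix: 0))
# -> [0, 0, 0, 5, 7, 9]
```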
“…After 20 runs of training, we provide the mean classification accuracy on the test dataset of our approach, and we reuse the metrics from Kipf & Welling [1] for the performance of DeepWalk, as well as Label Propagation (LP) [9] and Planetoid [10]. We also reuse the metrics already reported in Jiao & Xiong [12] [11]. For all three datasets, we set the output dimension to 200 for our method.…”
Section: Node Classification (mentioning; confidence: 99%)
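For context, the protocol quoted above (mean test accuracy over 20 training runs with a 200-dimensional output) can be sketched as follows; `train_and_eval` is a hypothetical stand-in for one complete train-and-test cycle:

```python
import statistics
from typing import Callable, Tuple

def mean_test_accuracy(
    train_and_eval: Callable[[int], float],  # hypothetical: one run -> test accuracy
    runs: int = 20,
    output_dim: int = 200,
) -> Tuple[float, float]:
    """Repeat training `runs` times; return mean and std of test accuracy."""
    accs = [train_and_eval(output_dim) for _ in range(runs)]
    return statistics.mean(accs), statistics.stdev(accs)

# Stub evaluator standing in for a real training run.
mean_acc, std_acc = mean_test_accuracy(lambda dim: 0.80)
```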
“…One type can be used for self-supervised learning. SUBG-CON [11] exploits the strong correlation between the central graph node and its sampled subgraphs to capture regional structure information. SUBG-CON is a self-supervised representation learning method based on subgraph contrast.…”
Section: Introduction (mentioning; confidence: 99%)
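As a rough illustration of the subgraph-contrast idea this excerpt describes, here is a minimal PyTorch sketch, not the authors' exact SUBG-CON implementation: each central node embedding is scored against the pooled summary of its own sampled subgraph (positive) and a shuffled subgraph summary (negative) under a margin loss. The GNN encoder and subgraph sampler are omitted, and the margin value is an assumption:

```python
import torch
import torch.nn.functional as F

def subgraph_contrast_loss(
    center: torch.Tensor,   # (B, d) embeddings of central nodes
    summary: torch.Tensor,  # (B, d) pooled embeddings of their context subgraphs
    margin: float = 0.75,   # assumed value; tune per dataset
) -> torch.Tensor:
    # Negatives: pair each central node with the summary of another subgraph
    # by shuffling the batch (one simple negative-sampling choice).
    neg_summary = summary[torch.randperm(summary.size(0))]

    pos = torch.sigmoid((center * summary).sum(dim=-1))      # own subgraph
    neg = torch.sigmoid((center * neg_summary).sum(dim=-1))  # mismatched subgraph

    # Margin loss: a node should agree with its own subgraph summary by
    # at least `margin` more than with a mismatched one.
    return F.relu(neg - pos + margin).mean()

# Toy usage with random embeddings in place of GNN encoder outputs.
B, d = 32, 200
center = torch.randn(B, d, requires_grad=True)
summary = torch.randn(B, d, requires_grad=True)
subgraph_contrast_loss(center, summary).backward()
```

The appeal of this formulation, as the excerpt notes, is that training touches only small sampled subgraphs rather than the full graph, which is what makes the approach scalable.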