Proceedings of the 28th ACM SIGKDD Conference on Knowledge Discovery and Data Mining 2022
DOI: 10.1145/3534678.3539426
KPGT: Knowledge-Guided Pre-training of Graph Transformer for Molecular Property Prediction

Cited by 27 publications (19 citation statements)
References 19 publications
“…It was found that initializing the parameters of TranSiGen using perturbational profiles by gene knockdown leads to better performance compared to random initialization. Additionally, the pre-training representation using Knowledge-guided Pretraining of Graph Transformer (KPGT) 17 further enhances the performance of inferring DEGs, surpassing the molecular fingerprint ECFP4 (Supplementary Table 1).…”
Section: Results
confidence: 99%
“…Considering that the current number of compounds with experimentally measured gene expression profiles is still limited compared to the vast chemical space, TranSiGen utilized the pre-trained molecular representation KPGT 17 for compounds. KPGT is a novel self-supervised learning framework for molecular graph representation.…”
Section: Methods
confidence: 99%
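The statement above contrasts KPGT's learned graph representation with the fixed circular fingerprint ECFP4. As a rough illustration of what a circular (Morgan-style) fingerprint computes, here is a stdlib-only toy sketch on a hand-coded ethanol graph; the atom list, bond table, and hashing scheme are simplified assumptions for illustration, not RDKit's actual ECFP4 algorithm.

```python
import hashlib

# Hypothetical toy molecular graph for ethanol (CCO): atom symbols and
# an adjacency list of bonds. Real fingerprints also encode charge,
# degree, ring membership, etc.; this sketch uses element symbols only.
atoms = ["C", "C", "O"]
bonds = {0: [1], 1: [0, 2], 2: [1]}

def circular_fingerprint(atoms, bonds, radius=2, n_bits=1024):
    """Hash each atom's neighborhood, up to `radius` hops, into bit indices."""
    bits = set()
    ids = list(atoms)  # round-0 identifiers: the bare atom symbols
    for _ in range(radius + 1):
        # Fold each current identifier into the fixed-size bit space.
        for ident in ids:
            h = int(hashlib.md5(ident.encode()).hexdigest(), 16)
            bits.add(h % n_bits)
        # Grow each identifier by appending its sorted neighbor identifiers,
        # so the next round describes a one-hop-larger environment.
        ids = [
            ids[i] + "(" + ",".join(sorted(ids[j] for j in bonds[i])) + ")"
            for i in range(len(atoms))
        ]
    return bits

fp = circular_fingerprint(atoms, bonds)
```

Unlike such hand-crafted hashing, a pre-trained encoder like KPGT learns the mapping from molecular graph to vector, which is the advantage the cited comparison reports.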
“…Last but not least, self‐supervised learning has gained interest in predicting properties of organic molecules, [76] drugs, [77] proteins, [77a,78] and polymers [79] . Although use in electrochemical material discovery is just beginning since pre‐trained models can be trained with free unlabeled data and reused for various chemistry‐related tasks, it could help in the future design of useful materials.…”
Section: Discussion
confidence: 99%
“…Based on the prominent results of ActFound on all settings we have evaluated, we hypothesize that it can be further improved by incorporating other pre-trained methods. We plan to incorporate pre-trained methods, such as KPGT, 48 GROVER 49 and MOLFORMER 50 to further improve ActFound.…”
Section: Discussion
confidence: 99%