2022
DOI: 10.1016/j.patcog.2022.108659
Molecular substructure graph attention network for molecular property identification in drug discovery

Cited by 11 publications (2 citation statements)
References 27 publications
“…Self-supervised models include Mol-CLR [36], GraphCL [37], HierMRL [38], GraphLoG [39], GraphMVP [32], GraphMAE [40], KEMPNN [41], MolPMoFiT [42], MolBERT [43], FP-BERT [27], and SMILES Transformer [15]. Supervised learning models include D-MPNN [35], DimeNet [44], AttentionFP [45], DLF-MFF [46], and MSSGAT [47].…”
Section: Performance Comparison With Baselines
Confidence: 99%
“…Furthermore, a Schottky barrier (SB) at the source/drain was used in the design of TFETs to eliminate doping effects in the S/D region, giving TFETs better detection performance than FETs. In addition, the introduction of high-k oxide [ 109 ] or charged plasma in TFETs [ 110 ], and the introduction of cavities on the source side for sensing biological molecules via dielectric modulation, help to enhance the drain current.…”
Section: Challenges and Opportunities
Confidence: 99%