2021
DOI: 10.1016/j.cag.2021.06.010
SHREC 2021: Retrieval and classification of protein surfaces equipped with physical and chemical properties

Cited by 12 publications (9 citation statements)
References 20 publications
“…In this deep learning method, our group exploits the availability of protein class labels from Ref. [35] to optimize the representation of protein surfaces without any additional properties. Particularly, we designed a message-passing graph convolutional neural network (MPGCNN) with the Edge Convolution (EdgeConv) paradigm [57] for the protein classification objective.…”
Section: Graph-based Learning Methods For Surface-based Protein Domai...
Mentioning confidence: 99%
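The EdgeConv paradigm mentioned in this citation statement updates each node by aggregating, over its neighbors, an MLP applied to the node feature and the neighbor-minus-node difference. A minimal NumPy sketch of one such layer (the feature sizes, weights, single-linear-plus-ReLU "MLP", and toy graph are illustrative assumptions, not the authors' actual MPGCNN):

```python
import numpy as np

def edge_conv(x, neighbors, W, b):
    """One EdgeConv layer: for each node i, take the element-wise max,
    over neighbors j, of an MLP applied to [x_i, x_j - x_i].
    x: (n, d) node features; neighbors: list of neighbor-index lists;
    W: (2d, d_out) weights and b: (d_out,) bias of a single linear+ReLU
    standing in for the MLP."""
    out = np.zeros((x.shape[0], W.shape[1]))
    for i in range(x.shape[0]):
        msgs = []
        for j in neighbors[i]:
            e = np.concatenate([x[i], x[j] - x[i]])   # edge feature
            msgs.append(np.maximum(e @ W + b, 0.0))   # linear + ReLU
        out[i] = np.max(msgs, axis=0)                 # max aggregation
    return out

# Toy usage: a 4-node surface graph with 3-dimensional features
rng = np.random.default_rng(0)
x = rng.normal(size=(4, 3))
neighbors = [[1, 2], [0, 3], [0, 3], [1, 2]]
W = rng.normal(size=(6, 8))
b = np.zeros(8)
h = edge_conv(x, neighbors, W, b)
print(h.shape)  # (4, 8)
```

The max aggregation makes the layer invariant to neighbor ordering, which is why EdgeConv suits irregular surface meshes.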
“…The main novelty of the SHREC'21 track is arguably the availability of protein surfaces with electrostatic values, which has been shown to improve the retrieval performance of protein surfaces [11,35]. This additional feature might therefore allow structurally related proteins to be better distinguished by their surface properties and improve the methods' performance.…”
Section: Comparison To Previous SHREC Datasets On Proteins
Mentioning confidence: 99%
“…We used ADAM for parameter optimization with a binary cross entropy loss function. The learning rate was explored from 1e-3 to 7e-3 and from 0.1 to 0.7 in our previous work and set to 0.005 [15]. The accuracy of the networks was evaluated on the negative and positive sets generated from the 2,541 structures, which total 167,872 pairs.…”
Section: Deep Neural Network For Fold Classification
Mentioning confidence: 99%
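The binary cross entropy objective quoted here, applied to same-fold (y=1) versus different-fold (y=0) pairs, reduces to a standard formula; a minimal sketch (the variable names and example values are illustrative, not the cited work's data):

```python
import numpy as np

def bce_loss(p, y, eps=1e-7):
    """Binary cross entropy: y = 1 for same-fold pairs, y = 0 otherwise;
    p is the network's predicted probability of 'same fold'."""
    p = np.clip(p, eps, 1 - eps)  # avoid log(0)
    return -np.mean(y * np.log(p) + (1 - y) * np.log(1 - p))

p = np.array([0.9, 0.2, 0.8])  # predicted same-fold probabilities
y = np.array([1.0, 0.0, 1.0])  # ground-truth pair labels
print(round(bce_loss(p, y), 4))  # → 0.1839
```

An optimizer such as ADAM then minimizes this loss over the pair set; the quoted learning-rate sweep (settling on 0.005) is a hyperparameter search over that optimizer's step size.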
“…For this classification, we considered 1,101 folds in classes a (all α proteins), b (all β proteins), c (α/β proteins), d (α+β proteins), and g (small proteins) in the SCOPe database. The neural network takes the 3DZDs of two protein structures and outputs the probability that the two structures belong to the same SCOPe fold [15] (Supplementary Figure 1; see Methods). We trained two networks, one that uses 3DZDs computed from the full-atom protein surface and another that takes 3DZDs computed from the main-chain Cα, C, and N atoms [16].…”
Mentioning confidence: 99%
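A minimal sketch of the pairwise setup described above: two 3D Zernike descriptor (3DZD) vectors are concatenated and fed through a small feed-forward network that outputs a same-fold probability. The 121-dimensional descriptor length, layer sizes, and random weights are illustrative assumptions, not the cited network's architecture:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def same_fold_prob(zd_a, zd_b, W1, b1, W2, b2):
    """Feed two 3D Zernike descriptors through a tiny MLP and return
    the predicted probability that the structures share a SCOPe fold."""
    x = np.concatenate([zd_a, zd_b])   # pair as one input vector
    h = np.maximum(x @ W1 + b1, 0.0)   # hidden layer, ReLU
    return sigmoid(h @ W2 + b2)        # scalar probability in (0, 1)

rng = np.random.default_rng(1)
d = 121  # assumed 3DZD length (order-20 invariants)
zd_a, zd_b = rng.normal(size=d), rng.normal(size=d)
W1, b1 = rng.normal(size=(2 * d, 32)) * 0.05, np.zeros(32)
W2, b2 = rng.normal(size=32) * 0.05, 0.0
p = same_fold_prob(zd_a, zd_b, W1, b1, W2, b2)
print(0.0 < p < 1.0)  # True
```

The sigmoid output pairs naturally with the binary cross entropy loss mentioned in the same citing work, and the two-input design is what allows training on labeled same-fold/different-fold structure pairs.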