2023
DOI: 10.1101/2023.01.07.523121
Preprint
Protein language model embedded geometric graphs power inter-protein contact prediction

Abstract: Accurate prediction of contacting residue pairs between interacting proteins is very useful for the structural characterization of protein-protein interactions (PPIs). Although significant progress has been made in inter-protein contact prediction recently, there is still considerable room for improving prediction accuracy. Here we present a new deep learning method, referred to as PLMGraph-Inter, for inter-protein contact prediction. Specifically, we employ rotationally and translationally invariant geometric graph…
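The abstract emphasizes that the geometric graphs are rotationally and translationally invariant. A minimal sketch of this property, using inter-residue pairwise distances as the invariant feature (illustrative only; the feature set actually used by PLMGraph-Inter is richer and is not reproduced here):

```python
import numpy as np

def pairwise_distances(coords):
    """Pairwise Euclidean distances between residues.

    coords: (L, 3) array of per-residue coordinates (e.g. C-alpha atoms).
    Returns an (L, L) distance matrix, which is unchanged by any rigid
    rotation or translation of the structure.
    """
    diff = coords[:, None, :] - coords[None, :, :]
    return np.linalg.norm(diff, axis=-1)

rng = np.random.default_rng(0)
coords = rng.normal(size=(5, 3))

# Apply a random rigid transform: orthogonal matrix from QR, plus a shift.
Q, _ = np.linalg.qr(rng.normal(size=(3, 3)))
moved = coords @ Q.T + np.array([1.0, -2.0, 3.0])

# The distance matrix is identical before and after the transform.
assert np.allclose(pairwise_distances(coords), pairwise_distances(moved))
```

Because the network only ever sees such invariant quantities, its predictions do not depend on how the input complex happens to be oriented or positioned in space.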

Cited by 4 publications (2 citation statements) | References 70 publications
“…After the transformative progress brought by deep learning to protein structure prediction [2][3][4][5], predicting protein complex structure and ligand binding sites is fast advancing with AFM and related methods, but also with other deep learning models based on structural representations [73][74][75][76]. Combining the latter [77,78] and, more generally, structural information [79] with the power of sequence-based language models is starting to bring even further progress.…”
Section: Discussion
confidence: 99%
“…This particular architecture was later combined with a generic transformer to create ESM-IF1 [33] which produces embeddings that consist of 512-dimensional vectors for each residue of a protein structure. The ESM-IF1 embeddings have been used for epitope prediction [34] and protein-protein interaction (PPI) prediction [35]. Successes such as these suggest that the embeddings contain task-relevant information relating to the protein function and biochemical activity.…”
Section: Introduction
confidence: 99%
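The citation above notes that ESM-IF1 yields a 512-dimensional vector per residue, which downstream tasks such as PPI prediction must reduce to a fixed-size input. A common (though not the only) way to do this is to pool over the residue axis; the sketch below is a generic illustration with a made-up embedding array, not the pipeline of any cited method:

```python
import numpy as np

def pool_embeddings(per_residue, method="mean"):
    """Collapse an (L, D) per-residue embedding to a single D-dim vector.

    per_residue: (L, D) array, one D-dimensional vector per residue.
    Mean pooling is a simple length-independent summary; other choices
    (max pooling, attention pooling) are also used in practice.
    """
    if method == "mean":
        return per_residue.mean(axis=0)
    raise ValueError(f"unknown pooling method: {method}")

# Hypothetical protein of 120 residues with 512-dim embeddings,
# matching the dimensionality reported for ESM-IF1.
emb = np.full((120, 512), 0.5)
protein_vec = pool_embeddings(emb)
assert protein_vec.shape == (512,)
```

Pooling makes proteins of different lengths comparable, at the cost of discarding per-residue detail, which is why residue-level tasks such as epitope prediction typically consume the (L, 512) embeddings directly instead.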