2024
DOI: 10.1101/2024.04.15.589672
Preprint

ProSST: Protein Language Modeling with Quantized Structure and Disentangled Attention

Mingchen Li,
Pan Tan,
Xinzhu Ma
et al.

Abstract: Protein language models have exhibited remarkable representational capabilities in various downstream tasks, notably in the prediction of protein functions. Despite their success, these models traditionally grapple with a critical shortcoming: the absence of explicit protein structure information, which is pivotal for elucidating the relationship between protein sequences and their functionality. Addressing this gap, we introduce DeProt, a Transformer-based protein language model designed to incorporate protei…
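The title names two mechanisms: quantized structure tokens and disentangled attention. The paper's exact formulation is not reproduced in this record, so the following is only a minimal illustrative sketch, assuming a DeBERTa-style decomposition in which the attention score is a sum of cross terms between sequence (content) states and quantized-structure states; all names, dimensions, and the structure-codebook size are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

L, d = 6, 16                 # sequence length, embedding dim (illustrative)
V_seq, V_struct = 20, 2048   # amino-acid vocab; assumed structure-token codebook size

# Hypothetical inputs: residue tokens and per-residue quantized structure tokens
seq_tok = rng.integers(0, V_seq, size=L)
struct_tok = rng.integers(0, V_struct, size=L)

E_seq = rng.standard_normal((V_seq, d)) * 0.1
E_struct = rng.standard_normal((V_struct, d)) * 0.1

H = E_seq[seq_tok]        # sequence (content) states, shape (L, d)
S = E_struct[struct_tok]  # structure states, shape (L, d)

def proj(x, key):
    # A fixed random linear projection standing in for a learned weight matrix.
    W = np.random.default_rng(key).standard_normal((d, d)) / np.sqrt(d)
    return x @ W

Qh, Kh = proj(H, 1), proj(H, 2)  # queries/keys from sequence states
Qs, Ks = proj(S, 3), proj(S, 4)  # queries/keys from structure states

# Disentangled score: sequence-to-sequence, sequence-to-structure,
# and structure-to-sequence terms summed before the softmax.
scores = (Qh @ Kh.T
          + Qh @ Ks.T
          + Qs @ Kh.T) / np.sqrt(3 * d)

# Row-wise softmax over keys
A = np.exp(scores - scores.max(axis=-1, keepdims=True))
A /= A.sum(axis=-1, keepdims=True)

out = A @ proj(H, 5)      # attended values drawn from sequence states
print(out.shape)          # (6, 16)
```

The point of the decomposition is that sequence content and quantized structure contribute separate, interpretable terms to each attention score rather than being summed into a single input embedding.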

Cited by 1 publication
References 45 publications