2023
DOI: 10.48550/arxiv.2302.04725
Preprint

Lightweight Transformers for Clinical Natural Language Processing

Abstract: Specialised pre-trained language models are becoming more common in NLP, since they can potentially outperform models trained on generic texts. BioBERT (Lee et al., 2020) and BioClinicalBERT (Alsentzer et al., 2019) are two examples of such models that have shown promise in medical NLP tasks. Many of these models are overparametrised and resource-intensive, but thanks to techniques like Knowledge Distillation (KD), it is possible to create smaller versions that perform almost as well as their larger counterparts.
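For context on the Knowledge Distillation (KD) technique the abstract refers to: the sketch below is a minimal PyTorch rendition of the standard soft-target distillation loss (Hinton et al., 2015), not this paper's own implementation; the temperature `T`, mixing weight `alpha`, and function name are illustrative assumptions.

```python
# Minimal sketch of a standard knowledge-distillation loss (Hinton et al., 2015).
# Illustrative only -- not code from this paper; T and alpha are assumed values.
import torch
import torch.nn.functional as F

def distillation_loss(student_logits: torch.Tensor,
                      teacher_logits: torch.Tensor,
                      labels: torch.Tensor,
                      T: float = 2.0,
                      alpha: float = 0.5) -> torch.Tensor:
    # Soft-target term: KL divergence between the temperature-softened
    # teacher and student distributions. kl_div expects log-probabilities
    # as input and probabilities as target; the T**2 factor keeps gradient
    # magnitudes comparable across temperatures.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T ** 2)
    # Hard-label term: ordinary cross-entropy against the gold labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1 - alpha) * hard
```

In the clinical setting the paper targets, the teacher's logits would come from a large model (e.g. BioClinicalBERT) and the student would be a compact transformer trained to minimise this blended objective.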

Cited by 0 publications
References 39 publications (52 reference statements)