2022 · DOI: 10.1016/j.health.2022.100078
A review on Natural Language Processing Models for COVID-19 research

Cited by 16 publications (9 citation statements) · References 62 publications
“…They found that the monolingual GREEK-BERT model they trained outperforms the multilingual M-BERT and XLM-R models. In their research, Hall et al [42] conducted an extensive review of NLP models and their applications in the context of COVID-19 research. Their focus was primarily on transformer-based biomedical pretrained language models (T-BPLMs) and sentiment analysis related to COVID-19 vaccination.…”
Section: Review Of Related Work
confidence: 99%
“…A considerable amount of information can be analyzed by a human investigator. However, the sheer volume of data produced by the various healthcare and scientific research units is beyond manual analysis (Hall, Chang & Jayne, 2022). In such situations, the role of NLP applications is rapid and extensive.…”
Section: NLP For Electronic Health Records
confidence: 99%
“…The transformer encoder generates a sequence of hidden states from the input sequence, which is subsequently passed to the decoder [18]. The decoder then generates the output sequence by attending to both the encoder states and its own prior outputs.…”
Section: Transformer Encoder
confidence: 99%
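
To make the encoder-decoder flow in this excerpt concrete, here is a minimal sketch using PyTorch's standard transformer modules; the layer counts, dimensions, vocabulary size, and random token inputs are illustrative assumptions, not details from the cited paper.

```python
# Minimal sketch of the encoder-decoder flow described in the excerpt.
# Sizes, vocabulary, and random inputs are illustrative assumptions.
import torch
import torch.nn as nn

d_model, nhead, vocab = 128, 4, 1000

embed = nn.Embedding(vocab, d_model)
encoder = nn.TransformerEncoder(
    nn.TransformerEncoderLayer(d_model, nhead, batch_first=True),
    num_layers=2,
)
decoder = nn.TransformerDecoder(
    nn.TransformerDecoderLayer(d_model, nhead, batch_first=True),
    num_layers=2,
)
to_vocab = nn.Linear(d_model, vocab)

src = torch.randint(0, vocab, (1, 10))  # input token ids
tgt = torch.randint(0, vocab, (1, 7))   # tokens generated so far

# Encoder: input sequence -> sequence of hidden states ("memory").
memory = encoder(embed(src))

# Causal mask so each decoder position sees only its prior outputs.
causal = nn.Transformer.generate_square_subsequent_mask(tgt.size(1))

# Decoder attends to the encoder states (cross-attention) and to its
# own earlier outputs (masked self-attention), as the excerpt says.
hidden = decoder(embed(tgt), memory, tgt_mask=causal)
logits = to_vocab(hidden)  # (1, 7, vocab): scores for the next token
```

At inference time this step would be repeated, appending the highest-scoring token to the target sequence until an end-of-sequence token is produced.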