2024
DOI: 10.1007/s10462-023-10677-z

Transformers in health: a systematic review on architectures for longitudinal data analysis

Clauirton A. Siebra, Mascha Kurpicz-Briki, Katarzyna Wac

Abstract: Transformers are state-of-the-art technology to support diverse Natural Language Processing (NLP) tasks, such as language translation and word/sentence predictions. The main advantage of transformers is their ability to obtain high accuracies when processing long sequences since they avoid the vanishing gradient problem and use the attention mechanism to maintain the focus on the information that matters. These features are fostering the use of transformers in other domains beyond NLP. This paper employs a sys…
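For readers unfamiliar with the mechanism the abstract refers to, the sketch below (illustrative only, not taken from the reviewed paper) shows scaled dot-product self-attention in NumPy: every position attends directly to every other position, which is why transformers sidestep the vanishing-gradient issue that limits recurrent models on long sequences. The function name and the toy longitudinal input are assumptions for illustration.

```python
# Illustrative sketch only (not from the reviewed paper): scaled dot-product
# attention, the mechanism the abstract credits for handling long sequences.
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Return attention output and weights for query/key/value matrices."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)            # similarity of each query to each key
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax over keys
    return weights @ V, weights

# Toy example: 4 time steps (e.g., longitudinal health measurements), dimension 8
rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8))
out, attn = scaled_dot_product_attention(x, x, x)   # self-attention
print(out.shape, attn.shape)                        # (4, 8) (4, 4)
```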
