2023
DOI: 10.3390/info14040242

Transformers in the Real World: A Survey on NLP Applications

Abstract: The field of Natural Language Processing (NLP) has undergone a significant transformation with the introduction of Transformers. Since the first introduction of this technology in 2017, the use of transformers has become widespread and has had a profound impact on the field of NLP. In this survey, we review the open-access and real-world applications of transformers in NLP, specifically focusing on those where text is the primary modality. Our goal is to provide a comprehensive overview of the current state-of-…

Cited by 36 publications (12 citation statements)
References 21 publications
“…Self-attention, a groundbreaking mechanism for deep learning, has ushered in transformative advancements in NLP and CV [35]. In NLP, large language models like BERT and GPT-4, built on self-attention, have excelled in language tasks due to their ability to capture context and dependencies in text [36]. In CV, the vision transformer architecture and its variants leverage self-attention to process images by dividing them into patches and applying this mechanism to them [37].…”
Section: Discussion
confidence: 99%
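For context, the self-attention mechanism referenced in the statement above can be sketched in a few lines. The following is a minimal NumPy illustration of scaled dot-product self-attention (single head, no masking); all names and dimensions are illustrative, not drawn from the cited works.

```python
import numpy as np

def self_attention(X, Wq, Wk, Wv):
    """Scaled dot-product self-attention over one sequence.

    X          : (seq_len, d_model) token embeddings
    Wq, Wk, Wv : (d_model, d_k) learned projection matrices (illustrative)
    """
    Q, K, V = X @ Wq, X @ Wk, X @ Wv            # project tokens to queries/keys/values
    scores = Q @ K.T / np.sqrt(K.shape[-1])     # pairwise similarity, scaled by sqrt(d_k)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # row-wise softmax
    return weights @ V                          # each token is a weighted mix of all values

# Toy usage: 4 tokens, model width 8
rng = np.random.default_rng(0)
X = rng.normal(size=(4, 8))
Wq, Wk, Wv = (rng.normal(size=(8, 8)) for _ in range(3))
print(self_attention(X, Wq, Wk, Wv).shape)  # -> (4, 8)
```

This captures why self-attention models context and dependencies: every output row mixes information from all positions in the sequence, with the mixing weights learned from the data.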
“…The advantage of the abstractive approach is that it can remove words that are considered unimportant, while the extractive approach builds summaries from phrases present in the input data [34]. With the Transformer-based Text-to-Text Transfer Transformer (T5) model, many text summaries are produced with an abstractive approach, as done by Patwardhan et al [35], Cheng and Yu [36], and Mars [37], which goes through several stages such as tokenization, formation of input-output data, pretraining and fine-tuning, encoder-decoder transformation, and text generation.…”
Section: Related Work
confidence: 99%
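The T5 pipeline stages listed in that statement (tokenization, encoder-decoder transformation, text generation) map directly onto a few library calls. Below is a minimal sketch using the Hugging Face transformers library; the t5-small checkpoint, input text, and generation settings are illustrative assumptions, not the configurations used in the cited studies.

```python
# Minimal abstractive summarization with T5 (assumes:
# pip install transformers sentencepiece torch).
from transformers import AutoTokenizer, T5ForConditionalGeneration

tokenizer = AutoTokenizer.from_pretrained("t5-small")
model = T5ForConditionalGeneration.from_pretrained("t5-small")

article = ("Transformers have become the dominant architecture in NLP, "
           "powering translation, summarization, and question answering.")

# T5 is text-to-text: the task is selected with a prefix on the input.
inputs = tokenizer("summarize: " + article, return_tensors="pt", truncation=True)
summary_ids = model.generate(**inputs, max_new_tokens=40, num_beams=4)
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```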
“…Therefore, the rest of the case studies are concerned with machine learning and a deep learning-based efficient approach for fake news detection by means of various state-of-the-art approaches. Deep learning has contributed a lot recently in many fields such as pattern analysis and artificial intelligence [31][32][33], with important applications in fake news detection model development [34][35][36][37][38]. However, deep learning has two major disadvantages: the first disadvantage is the overfitting problem most of the time and the second one is that it takes a lot of time to model the underlying data.…”
Section: Machine Learning For Fake News Detection 4.1 Overview
confidence: 99%
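As a concrete illustration of the trade-off noted in that statement, a classical machine-learning baseline trains in seconds and is less prone to overfitting than a deep model on a small corpus. The sketch below (scikit-learn, TF-IDF plus logistic regression) is an assumed illustrative baseline with toy placeholder texts and labels, not the method of any cited paper.

```python
# Illustrative classical-ML fake-news baseline (assumes: pip install scikit-learn).
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

texts = ["scientists confirm water is wet",
         "celebrity secretly replaced by clone, sources say",
         "central bank raises interest rates by 0.25 points",
         "miracle pill cures all diseases overnight"]
labels = [0, 1, 0, 1]  # 0 = real, 1 = fake (toy labels)

clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
clf.fit(texts, labels)
print(clf.predict(["new pill reverses aging overnight, doctors stunned"]))
```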