2024
DOI: 10.1109/access.2024.3389497

GPT (Generative Pre-Trained Transformer)— A Comprehensive Review on Enabling Technologies, Potential Applications, Emerging Challenges, and Future Directions

Gokul Yenduri,
M. Ramalingam,
G. Chemmalar Selvi
et al.

Abstract: The Generative Pre-trained Transformer (GPT) represents a notable breakthrough in the domain of natural language processing, propelling us toward the development of machines that can understand and communicate using language in a manner that closely resembles that of humans. GPT is based on the transformer architecture, a deep neural network designed for natural language processing tasks. Due to their impressive performance on natural language processing tasks and ability to converse effectively, GPT …

Cited by 21 publications (5 citation statements)
References 178 publications
“…Students can engage in interactive learning, explore their curiosity, and develop their language skills. Prompt engineering encourages discussions, provides problem-solving scenarios, and enhances the development of metacognitive skills (Yenduri et al., 2023). It also provides personalized feedback and assessment and fosters collaborative and interactive learning environments.…”
Section: Discussion
Citation type: mentioning (confidence: 99%)
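The educational uses described in this statement hinge on how the prompt is framed. Below is a minimal sketch of a problem-solving prompt template in Python, assuming a generic chat-style GPT endpoint; the send_to_model() call and the template wording are hypothetical illustrations, not anything from the cited works.

```python
# A minimal sketch of a problem-solving prompt template, assuming a generic
# chat-style GPT endpoint; send_to_model() below is hypothetical.
SOCRATIC_TUTOR = (
    "You are a patient tutor. Do not give the final answer. "
    "Ask one guiding question at a time and wait for the student's reply."
)

def build_prompt(topic: str, student_question: str) -> list[dict]:
    """Assemble a chat-format prompt that nudges the student toward problem solving."""
    return [
        {"role": "system", "content": SOCRATIC_TUTOR},
        {"role": "user", "content": f"Topic: {topic}\n{student_question}"},
    ]

messages = build_prompt("fractions", "Why is 1/2 larger than 1/3?")
# reply = send_to_model(messages)  # hypothetical call to a GPT-style chat API
print(messages)
```

The system message is what steers the model toward guided discussion rather than direct answers, which is the mechanism behind the metacognitive benefits the statement describes.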
“…predicting the next word based on previous words. Thus, the model can learn natural language representations that can be fine-tuned for specific subtasks [24]. Features of the GPT model versions are given in Table 2.…”
Section: Text Generation Models
Citation type: mentioning (confidence: 99%)
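The next-word objective this statement describes can be observed directly in a causal language model's output. Below is a minimal sketch, assuming the Hugging Face transformers library and the public gpt2 checkpoint; this is an illustration, not code from the reviewed paper.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Assumption: the public gpt2 checkpoint is available from the Hugging Face hub.
tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")
model.eval()

# The model scores every vocabulary item as a candidate *next* token,
# conditioned only on the previous tokens (the causal/autoregressive setup).
inputs = tokenizer("The transformer architecture was introduced in", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits          # shape: (1, seq_len, vocab_size)

next_token_id = int(logits[0, -1].argmax())  # greedy choice of the next token
print(tokenizer.decode(next_token_id))
```

Fine-tuning for a downstream subtask, as the statement notes, reuses these same pre-trained representations with task-specific training data or an added head.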
“…BERT models, on the other hand, are designed to understand the deep contextual relationships between words in a sentence. This bidirectional training makes BERT models ideal for tasks like question answering, sentiment analysis, and natural language understanding [74,75].…”
Section: Foundations of ML for Chatbots
Citation type: mentioning (confidence: 99%)
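In contrast to GPT's left-to-right objective, BERT's masked-token objective conditions on context from both sides of a gap. Below is a minimal sketch, assuming the Hugging Face transformers library and the public bert-base-uncased checkpoint; again an illustration, not the cited works' code.

```python
from transformers import pipeline

# Assumption: the public bert-base-uncased checkpoint is available from the hub.
fill = pipeline("fill-mask", model="bert-base-uncased")

# BERT sees the tokens on both sides of [MASK], so the whole sentence
# informs the prediction (the bidirectionality the statement describes).
for pred in fill("The movie was [MASK], I enjoyed every minute."):
    print(f"{pred['token_str']:>12}  score={pred['score']:.3f}")
```

Because each token representation is built from both directions, the same encoder transfers well to question answering and sentiment analysis with only a small task-specific head on top.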