2024
DOI: 10.3390/info15020099
Generative Pre-Trained Transformer (GPT) in Research: A Systematic Review on Data Augmentation

Fahim Sufi

Abstract: GPT (Generative Pre-trained Transformer) represents advanced language models that have significantly reshaped the academic writing landscape. These sophisticated language models offer invaluable support throughout all phases of research work, facilitating idea generation, enhancing drafting processes, and overcoming challenges like writer’s block. Their capabilities extend beyond conventional applications, contributing to critical analysis, data augmentation, and research design, thereby elevating the efficien…

Cited by 13 publications (3 citation statements)
References 73 publications
“…Casey and Michal K. (2023) is a review study on GPT models. Also, Sufi (2024) reviewed the application of GPTs to research, covering data augmentation and synthetic data generation. Several efforts are ongoing to improve the performance of transformer models in terms of efficiency, robustness, interpretability, and adaptability (OpenAI, 2018).…”
Section: Recent Advancements and Future Directions
confidence: 99%
“…This meticulous review and analysis facilitated the development of a novel classification scheme, categorizing the aforementioned 53 publications into 10 distinct categories. Additionally, it is noteworthy that beyond the creation of this innovative classification scheme for existing literature concerning the utilization of transformer technology in social-media-based disaster analytics, this study presents a methodological and systematic review of the subject of employing contemporary AI-based tools such as Litmaps [16,17].…”
Section: Introduction
confidence: 99%
“…The recent success of Generative Pre-Trained Transformer (GPT) models in few-shot tasks [8][9][10], pre-trained on vast datasets, has inspired the development of the Segment Anything Model (SAM), a model trained extensively on data to encode and decode features for segmentation, exhibiting remarkable few-shot and even zero-shot capabilities [11]. SAM-Med2D [12] bridges the gap between SAM's proficiency in natural images and its application in medical 2D image analysis.…”
Section: Introduction
confidence: 99%