2022
DOI: 10.59200/iconic.2022.016
The Analysis of a GPT-based Sepedi Text Generation Model

Abstract: Text generation is a component of natural language processing that uses computational linguistics techniques to produce text that cannot be distinguished from human-written text. This study aims to develop and analyse a Generative Pre-Trained Transformer 2 (GPT-2) language model to generate Sepedi phrases. Sepedi is an under-resourced language that is written disjunctively; its orthographic representation presents challenges, and its resources are limited. The GPT-2…
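The abstract describes a GPT-2 model that generates phrases autoregressively, i.e. one token at a time, each conditioned on the tokens before it. The snippet below is a minimal sketch of that sampling loop; the bigram frequency table and placeholder English corpus are illustrative assumptions standing in for the paper's actual Sepedi data and GPT-2 model, which condition on the full left context rather than a single previous token.

```python
import random

def generate(model, start, length, rng):
    """Autoregressive sampling: each new token is drawn from a
    distribution conditioned on the previous token (a bigram
    stand-in for GPT-2's full left-to-right context)."""
    out = [start]
    for _ in range(length - 1):
        choices = model.get(out[-1])
        if not choices:  # no continuation observed; stop early
            break
        tokens, weights = zip(*choices.items())
        out.append(rng.choices(tokens, weights=weights)[0])
    return " ".join(out)

# Toy training corpus (placeholder English text, not real Sepedi data).
corpus = "the model generates text the model learns text patterns".split()

# Count bigram frequencies: model[prev][next] = count.
model = {}
for prev, nxt in zip(corpus, corpus[1:]):
    model.setdefault(prev, {})
    model[prev][nxt] = model[prev].get(nxt, 0) + 1

print(generate(model, "the", 5, random.Random(0)))
```

A trained GPT-2 replaces the bigram table with a transformer that scores every vocabulary item given the whole prefix, but the generation loop itself has the same shape.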

Cited by 0 publications
References 17 publications