Proceedings of the 2nd International Conference on Intelligent and Innovative Computing Applications 2020
DOI: 10.1145/3415088.3415114

The development of a Sepedi text generation model using long short-term memory

Cited by 2 publications (2 citation statements)
References 7 publications
“…However, they are robust when generating locally coherent text. Therefore, several papers such as [6]-[8], [71]-[76] discuss various deep learning RNN algorithms in light of this shortcoming. Improved versions of the RNN, namely LSTM and GRU, are implemented in an attempt to solve long-term dependencies in text.…”
Section: Machine Learning Algorithms for Text Generation
Confidence: 99%
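The statement above describes the standard LSTM-based approach to text generation without giving model details. As a minimal sketch of that approach, and not the authors' actual Sepedi model, the following PyTorch code trains a character-level LSTM language model and samples from it; the toy corpus, layer sizes, training schedule, and sampling loop are all placeholder assumptions chosen for illustration.

# Minimal character-level LSTM text generation sketch (illustrative only;
# not the Sepedi model from the cited paper). The toy corpus and all
# hyperparameters below are placeholders.
import torch
import torch.nn as nn

text = "ke a leboga "  # hypothetical toy corpus; a real model needs far more data
chars = sorted(set(text))
stoi = {c: i for i, c in enumerate(chars)}
itos = {i: c for c, i in stoi.items()}

class CharLSTM(nn.Module):
    def __init__(self, vocab_size, embed_dim=16, hidden_dim=64):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.head = nn.Linear(hidden_dim, vocab_size)

    def forward(self, x, state=None):
        emb = self.embed(x)                 # (batch, seq, embed_dim)
        out, state = self.lstm(emb, state)  # gated state carries long-range context
        return self.head(out), state        # logits over the next character

model = CharLSTM(len(chars))
optimizer = torch.optim.Adam(model.parameters(), lr=1e-2)
loss_fn = nn.CrossEntropyLoss()

# Train to predict each next character from the preceding ones.
ids = torch.tensor([stoi[c] for c in text]).unsqueeze(0)
x, y = ids[:, :-1], ids[:, 1:]
for _ in range(200):
    logits, _ = model(x)
    loss = loss_fn(logits.reshape(-1, len(chars)), y.reshape(-1))
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

# Sample new text one character at a time, feeding each sample back in.
idx = torch.tensor([[stoi["k"]]])
state, out = None, "k"
for _ in range(20):
    logits, state = model(idx, state)
    idx = torch.multinomial(torch.softmax(logits[:, -1], dim=-1), 1)
    out += itos[idx.item()]
print(out)

Because the recurrent state is passed around opaquely here, swapping nn.LSTM for nn.GRU is a drop-in change, which is the GRU variant the statement mentions as the other common remedy for long-term dependencies.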
“…Different text generation models have been developed for many well-resourced languages (Islam et al, 2019). There are few models for most under-resourced languages such as Sepedi (Moila & Modipa, 2020). A model that generates text that cannot be distinguished from human-generated text would be useful for text corpus creation.…”
Section: Introduction
Confidence: 99%