2023
DOI: 10.1017/s1351324923000475

Neural Arabic singular-to-plural conversion using a pretrained Character-BERT and a fused transformer

Azzam Radman,
Mohammed Atros,
Rehab Duwairi

Abstract: Morphological re-inflection generation is one of the most challenging tasks in the natural language processing (NLP) domain, especially for morphologically rich, low-resource languages such as Arabic. In this research, we investigate the ability of transformer-based models on the singular-to-plural Arabic noun conversion task. We start by pretraining a Character-BERT model on a masked language modeling task using 1,134,950 Arabic words, and then adopt the fusion technique to transfer the knowledge gained by …
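The abstract mentions pretraining a Character-BERT on masked language modeling. As a minimal sketch of the character-level masking step (the paper's actual masking scheme, mask probability, and tokenizer are not given here, so the function below is a hypothetical illustration):

```python
import random

def mask_characters(word, mask_token="[MASK]", mask_prob=0.15, seed=0):
    """Randomly replace characters of a word with a mask token, as in
    character-level masked language modeling. A minimal sketch: the
    paper's real masking ratio and special tokens are assumptions."""
    rng = random.Random(seed)
    chars = list(word)
    labels = [None] * len(chars)  # characters the model must reconstruct
    for i, ch in enumerate(chars):
        if rng.random() < mask_prob:
            labels[i] = ch          # remember the true character
            chars[i] = mask_token   # hide it from the model
    return chars, labels

# Example on a transliterated Arabic plural form ("kutub" = books)
tokens, labels = mask_characters("kutub", mask_prob=0.5, seed=1)
```

The model is then trained to predict the characters recorded in `labels` at the masked positions, learning sub-word morphological patterns in the process.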

Cited by 0 publications.
References: 38 publications (39 reference statements).