2020
DOI: 10.1109/access.2020.2973319
Transfer Learning for Arabic Named Entity Recognition With Deep Neural Networks

Abstract: The vast amount of unstructured data spread on a daily basis raises the need for developing effective information retrieval and extraction methods. Named Entity Recognition is a challenging classification task for structuring data into pre-defined labels, and is even more complicated when applied to the Arabic language due to its special traits and complex nature. This article presents a novel Deep Learning approach for Standard Arabic Named Entity Recognition that proved its out-performance when being co…

Cited by 41 publications (20 citation statements)
References 39 publications
“…Ali and Tan [64] used a seq2seq model with a BLSTM encoder and decoder for MSA NER; their model outperformed the BGRU-CRF model of [61] and the BLSTM-CRF model of [57]. A recent study [65] applied transfer learning with deep neural networks to build a Pooled-GRU model for MSA NER. Their model outperformed the BLSTM-CRF model proposed by [66].…”
Section: Related Work
confidence: 99%
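The "Pooled-GRU" architecture mentioned in the citation above can be sketched minimally as a GRU whose per-token hidden states are summarised by concatenating the last state with mean- and max-pooled states. Everything here (dimensions, random initialisation, function names) is an illustrative assumption, not the cited authors' implementation:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_forward(xs, d_hidden, rng):
    """Run a single-layer GRU over a sequence of input vectors.
    Parameters are randomly initialised (no training) for illustration."""
    d_in = xs.shape[1]
    Wz, Wr, Wn = (rng.standard_normal((d_hidden, d_in)) * 0.1 for _ in range(3))
    Uz, Ur, Un = (rng.standard_normal((d_hidden, d_hidden)) * 0.1 for _ in range(3))
    h = np.zeros(d_hidden)
    states = []
    for x in xs:
        z = sigmoid(Wz @ x + Uz @ h)          # update gate
        r = sigmoid(Wr @ x + Ur @ h)          # reset gate
        n = np.tanh(Wn @ x + Un @ (r * h))    # candidate state
        h = (1 - z) * h + z * n
        states.append(h)
    return np.stack(states)                   # (seq_len, d_hidden)

def pooled_gru_features(xs, d_hidden, rng):
    """'Pooled' readout: concatenate the last hidden state with
    mean-pooled and max-pooled states over the whole sequence."""
    H = gru_forward(xs, d_hidden, rng)
    return np.concatenate([H[-1], H.mean(axis=0), H.max(axis=0)])

rng = np.random.default_rng(0)
seq = rng.standard_normal((7, 16))            # 7 tokens, 16-dim embeddings
feats = pooled_gru_features(seq, d_hidden=32, rng=rng)
print(feats.shape)                            # (96,) = 3 * 32
```

The pooled feature vector would then feed a classification layer; pooling gives the classifier a view of the whole sequence rather than only the final state.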
“…Language models and transfer learning have recently contributed significantly to this field. Transfer learning improves the performance of CR applications, as seen in [25,26], while requiring a smaller labelled dataset than training from scratch. Furthermore, contextualised representations help recognise and differentiate concepts based on their context, thus increasing the accuracy of the model's predictions.…”
Section: Concept Recognition For Space Systems
confidence: 99%
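The transfer-learning benefit described in the citation above (small labelled set, pretrained features reused) can be illustrated with a toy sketch: a frozen encoder stands in for a pretrained model, and only a lightweight classifier head is trained. The encoder, task, and hyperparameters are all hypothetical:

```python
import numpy as np

rng = np.random.default_rng(1)

# Stand-in for a frozen pretrained encoder: a fixed random projection
# followed by a nonlinearity. In real transfer learning this would be
# a pretrained network whose weights are not updated.
W_frozen = rng.standard_normal((64, 16)) * 0.1
def encode(x):
    return np.tanh(W_frozen @ x)

# Small labelled dataset: a toy linearly-separable task.
X = rng.standard_normal((40, 16))
y = (X[:, 0] > 0).astype(float)

# Train only the small head (logistic regression) on frozen features.
w, b, lr = np.zeros(64), 0.0, 0.5
for _ in range(300):
    for xi, yi in zip(X, y):
        h = encode(xi)
        p = 1.0 / (1.0 + np.exp(-(w @ h + b)))
        g = p - yi                   # gradient of the logistic loss
        w -= lr * g * h
        b -= lr * g

preds = np.array([(w @ encode(xi) + b) > 0 for xi in X])
acc = np.mean(preds == (y > 0.5))
```

Because only the head's 65 parameters are learned, 40 labelled examples suffice, which is the practical point the cited passage makes about transfer learning.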
“…More modern neural network architectures based on various combinations of convolutional and recurrent networks show the best results on many problems [31][32][33][34]. These models are notably versatile because they can be applied to multiple languages with a unified network architecture.…”
Section: Improving the Means of Monitoring Open Sources
confidence: 99%