2023
DOI: 10.3390/electronics12030569

Robust Chinese Named Entity Recognition Based on Fusion Graph Embedding

Abstract: Named entity recognition (NER) is a fundamental task in natural language processing. Current mainstream NER methods are based on deep neural network models, and the inherent vulnerability of these networks causes recognition accuracy to drop significantly when the input contains adversarial text. To improve the robustness of named entity recognition under adversarial conditions, this paper proposes a Chinese named entity reco…

Cited by 5 publications (1 citation statement) · References 23 publications
“…This notably improves various metrics for natural language processing (NLP) tasks. In the field of NER, significant achievements have been made by methods based on fine-tuning pretrained models, like BERT [19,20], BERT-CRF [21,22], and BERT-BiLSTM-CRF [23]. Pretrained models can learn contextual features of text on the basis of training on large-sample data, enabling fine-tuning in situations with limited annotated data and subsequently learning contextual features for downstream tasks.…”
Section: Introduction
confidence: 99%
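The citation statement above refers to BERT-CRF style taggers, in which a CRF layer on top of the encoder decodes the best tag sequence via Viterbi search. As a rough illustration of that decoding step only (the scores and transition matrix below are toy values I made up, not anything from the paper), a minimal pure-Python sketch might look like:

```python
# Minimal sketch of CRF Viterbi decoding as used in BERT-CRF style NER taggers.
# All numbers are illustrative toy values, not learned parameters.

def viterbi_decode(emissions, transitions):
    """Return the highest-scoring tag sequence.

    emissions: per-token lists of tag scores (e.g. encoder logits).
    transitions: transitions[i][j] = score of moving from tag i to tag j.
    """
    num_tags = len(emissions[0])
    score = list(emissions[0])          # best score ending in each tag so far
    backpointers = []
    for emit in emissions[1:]:
        new_score, bp = [], []
        for j in range(num_tags):
            best_prev, best_val = max(
                ((i, score[i] + transitions[i][j]) for i in range(num_tags)),
                key=lambda t: t[1],
            )
            new_score.append(best_val + emit[j])
            bp.append(best_prev)
        score = new_score
        backpointers.append(bp)
    # Trace the best path backwards from the best final tag.
    best_tag = max(range(num_tags), key=lambda j: score[j])
    path = [best_tag]
    for bp in reversed(backpointers):
        best_tag = bp[best_tag]
        path.append(best_tag)
    path.reverse()
    return path

# Toy example: 3 tags (O=0, B-PER=1, I-PER=2) over 3 tokens.
emissions = [[2.0, 1.0, 0.0],    # token 1 favours O
             [0.0, 3.0, 0.0],    # token 2 favours B-PER
             [0.0, 0.0, 3.0]]    # token 3 favours I-PER
transitions = [[0.5, 0.5, -5.0], # O -> I-PER strongly penalised
               [0.0, -1.0, 1.0],
               [0.0, 0.0, 1.0]]
print(viterbi_decode(emissions, transitions))  # -> [0, 1, 2]
```

The transition penalty (O → I-PER) is the point of the CRF layer: it forbids tag sequences that per-token softmax classification alone could emit.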