2021
DOI: 10.1088/1742-6596/1848/1/012101

A BERT based Chinese Named Entity Recognition method on ASEAN News

Abstract: As the first step in building a knowledge graph to record information about the ASEAN countries, we aim to conduct Named-Entity Recognition (NER) on Chinese news about the ASEAN countries. We employ a bi-directional gated recurrent unit (BiGRU) in place of the LSTM architecture to improve the model's effectiveness and its ability to handle polysemous words. The state-of-the-art word embedding model, BERT, is also included to generate high-quality word vectors for the NER task. In addition, we propose a similarity-b…
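The paper's own code is not reproduced on this page. As a generic illustration of the output such a sequence labeller produces, the sketch below decodes character-level BIO tags into entity spans in plain Python; the tag scheme and the example tokens are assumptions for illustration, not taken from the paper.

```python
def decode_bio(tokens, tags):
    """Collect (entity_text, entity_type) spans from a BIO-tagged
    character sequence, as produced by a Chinese NER model."""
    entities, current, etype = [], [], None
    for tok, tag in zip(tokens, tags):
        if tag.startswith("B-"):
            if current:  # close any open entity before starting a new one
                entities.append(("".join(current), etype))
            current, etype = [tok], tag[2:]
        elif tag.startswith("I-") and current and tag[2:] == etype:
            current.append(tok)  # continue the open entity
        else:  # "O" tag or an inconsistent "I-" tag closes the entity
            if current:
                entities.append(("".join(current), etype))
            current, etype = [], None
    if current:  # flush an entity that runs to the end of the sentence
        entities.append(("".join(current), etype))
    return entities

# Hypothetical example sentence, tagged at the character level.
tokens = list("新加坡总理李显龙")
tags = ["B-LOC", "I-LOC", "I-LOC", "O", "O", "B-PER", "I-PER", "I-PER"]
ents = decode_bio(tokens, tags)
# → [('新加坡', 'LOC'), ('李显龙', 'PER')]
```

Character-level (rather than word-level) tagging is the common choice for Chinese NER, since it sidesteps word-segmentation errors.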

Cited by 8 publications (5 citation statements); References 1 publication
“…The machine learning approaches, such as hidden Markov models (HMM), conditional random fields (CRF) and support vector machines (SVM), rely heavily on hand-crafted features (Wang et al., 2015; Yu et al., 2018), while the deep neural network approaches, such as convolutional neural networks (CNN) and recurrent neural networks (RNN), rely mainly on word or character embeddings and can extract features automatically (Dong et al., 2016; Habibi et al., 2017; Liu et al., 2019a, b). In addition, the pre-trained Bidirectional Encoder Representations from Transformers (BERT) model has also been applied to entity recognition (Zhuang et al., 2021).…”
Section: Knowledge Elements and Extraction Process
confidence: 99%
“…, 2019a, b). In addition, pre-trained Bidirectional Encoder Representations from Transformers (BERT) model has also been applied to entity recognition (Zhuang et al. , 2021).…”
Section: Related Work
confidence: 99%
“…Figure 1 shows that the BERT model uses MLM (masked language model) and next-sentence prediction to capture sentiment representations at the word and sentence levels, respectively [17]. The main structure of the BERT model is a stack of Transformer blocks, and the Transformer's text encoding is based on the attention mechanism [18]. Therefore, different weights can be assigned to each word according to the relationships between the words and sentences [19].…”
Section: Recurrent Neural Network
confidence: 99%
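The attention mechanism mentioned in the citation above can be sketched in plain Python. This is the standard scaled dot-product formulation from the Transformer literature, shown here only as an illustration; it is not the cited paper's implementation, and the toy vectors are made up.

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of scores.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def attention(queries, keys, values):
    """Scaled dot-product attention: each query scores every key,
    the scores are softmax-normalised into weights, and the weights
    mix the value vectors — this is how each word is weighted by its
    relationship to the others."""
    d_k = len(keys[0])
    outputs = []
    for q in queries:
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d_k)
                  for k in keys]
        weights = softmax(scores)
        outputs.append([sum(w * v[j] for w, v in zip(weights, values))
                        for j in range(len(values[0]))])
    return outputs

# Toy example: one 2-d query attending over two key/value pairs.
out = attention([[1.0, 0.0]],
                [[1.0, 0.0], [0.0, 1.0]],
                [[1.0, 2.0], [3.0, 4.0]])
```

The output for each query is a weighted average of the value vectors, with more weight on the value whose key matches the query best.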
“…The emergence of question and answer (Q&A) systems presents an opportunity to overcome this issue. Named entity recognition (NER) is now widely used in the military ( Wang et al., 2018 ; Lu et al., 2020 ; Baigang and Yi, 2023 ; Li et al., 2023 ), entertainment and culture ( Molina-Villegas et al., 2021 ; Zhuang et al., 2021 ; Fu et al., 2022 ; Huang et al., 2022 ), cybersecurity ( Georgescu et al., 2019 ; Simran et al., 2020 ; Chen et al., 2021 ; Ma et al., 2021 ), and medicine ( Ji et al., 2019 ; Li et al., 2020 ; Wang et al., 2020 ; Liu et al., 2022 ). However, the application of NER in the agricultural sector is still in the early stages of development ( Wang et al., 2022 ; Yu et al., 2022a ; Qian et al., 2023 ).…”
Section: Introduction
confidence: 99%