2023
DOI: 10.1016/j.eswa.2023.121103
Named entity recognition of citrus pests and diseases based on the BERT-BiLSTM-CRF model

Cited by 21 publications (6 citation statements)
References 13 publications
“…Chang et al [38] concatenated CRF with pretrained BERT models for NER tasks. Liu et al [39] proposed the BERT-BiLSTM-CRF recognition method and applied it to research on citrus pests and diseases. Gan et al [40] proposed the BERT-Transformer-BiLSTM-CRF model to handle the challenges posed by pronouns and polysemous words in Chinese NER.…”
Section: Methods Based On Deep Learning (mentioning, confidence: 99%)
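These citation statements all describe the same stack: a pretrained BERT encoder producing contextual token embeddings, a bidirectional LSTM over those embeddings, and a CRF layer decoding the final tag sequence. Below is a minimal sketch of that architecture, assuming PyTorch, Hugging Face transformers, and the pytorch-crf package; the checkpoint name, hidden size, and tag count are illustrative placeholders, not values taken from the cited paper.

import torch.nn as nn
from transformers import AutoModel
from torchcrf import CRF  # pip install pytorch-crf

class BertBiLstmCrf(nn.Module):
    def __init__(self, num_tags, bert_name="bert-base-chinese", lstm_hidden=256):
        super().__init__()
        self.bert = AutoModel.from_pretrained(bert_name)     # contextual token embeddings
        self.bilstm = nn.LSTM(self.bert.config.hidden_size, lstm_hidden,
                              batch_first=True, bidirectional=True)
        self.to_tags = nn.Linear(2 * lstm_hidden, num_tags)  # per-token emission scores
        self.crf = CRF(num_tags, batch_first=True)           # learned tag-transition scores

    def forward(self, input_ids, attention_mask, tags=None):
        h = self.bert(input_ids, attention_mask=attention_mask).last_hidden_state
        h, _ = self.bilstm(h)
        emissions = self.to_tags(h)
        mask = attention_mask.bool()
        if tags is not None:
            # Training: negative log-likelihood of the gold tag sequence under the CRF.
            return -self.crf(emissions, tags, mask=mask, reduction="mean")
        # Inference: Viterbi decoding returns the best tag sequence per sentence.
        return self.crf.decode(emissions, mask=mask)

The CRF layer is what distinguishes this stack from a plain softmax tagger: it scores whole tag sequences, so transitions that are invalid under a BIO-style scheme are penalized during training and avoided at decoding time.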
“…As an illustration, Kameko et al [35] proposed a named entity recognition model combining BERT-CRF and multi-task learning to perform factual analysis of Japanese events. Liu et al [36] proposed a model that uses BERT (Bidirectional Encoder Representations from Transformers) combined with BiLSTM and CRF to extract specified entity classes from unstructured citrus pest and disease data. Chen et al [37] introduced the BERT-BiLSTM-CRF model for extracting named entities from power equipment fault texts.…”
Section: Knowledge Graph Construction (mentioning, confidence: 99%)
“…Second, based on the OneRel model, the global dependency semantics and syntactic graph embedding representation of sentences are incorporated in the initial vector generation stage. Additionally, Bidirectional Encoder Representations from Transformers (BERT) and Bi-directional Long Short-Term Memory (Bi-LSTM) networks were used to extract semantic information from the original text [34,35]. This stage improves relation extraction by learning syntactic structure features that span the sentence.…”
Section: Event Extraction Model Design (mentioning, confidence: 99%)
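As a companion to the sketch above, the encoding step this last statement mentions (BERT followed by a Bi-LSTM to extract semantic features for a relation-extraction head) can be isolated on its own. The sketch below uses the same assumed libraries; the dependency-semantics and syntactic graph embeddings from the quoted passage are not reproduced, and all names are illustrative.

import torch.nn as nn
from transformers import AutoModel, AutoTokenizer

class SemanticEncoder(nn.Module):
    # BERT + Bi-LSTM: token-level semantic features for a downstream
    # extraction head (graph embedding inputs omitted in this sketch).
    def __init__(self, bert_name="bert-base-chinese", lstm_hidden=384):
        super().__init__()
        self.bert = AutoModel.from_pretrained(bert_name)
        self.bilstm = nn.LSTM(self.bert.config.hidden_size, lstm_hidden,
                              batch_first=True, bidirectional=True)

    def forward(self, input_ids, attention_mask):
        h = self.bert(input_ids, attention_mask=attention_mask).last_hidden_state
        h, _ = self.bilstm(h)  # shape: (batch, seq_len, 2 * lstm_hidden)
        return h               # initial vectors for the extraction stage

# Usage: encode one sentence into per-token feature vectors.
tok = AutoTokenizer.from_pretrained("bert-base-chinese")
batch = tok(["Citrus greening disease is spread by psyllids."], return_tensors="pt")
features = SemanticEncoder()(batch["input_ids"], batch["attention_mask"])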