2021 7th International Symposium on System and Software Reliability (ISSSR)
DOI: 10.1109/isssr53171.2021.00029
Chinese Named Entity Recognition based on BERT-Transformer-BiLSTM-CRF Model

Cited by 5 publications (5 citation statements). References 13 publications.
“…(6) BERT-Transformer-BiLSTM-CRF [23]: The BERT pre-trained model generates character vectors, the Transformer encoding area constructs contextual long-range semantic features of text, the BiLSTM neural network extracts semantic features, and the CRF inference layer classifies different entities.…”
Section: Results and Analysis: A Comparative Experimental Design
confidence: 99%
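The CRF inference layer named in the citation statement above picks the single best tag sequence by Viterbi decoding over per-token emission scores (from the BiLSTM) plus tag-transition scores. A minimal pure-Python sketch of that decoding step; the tag set and all scores below are hypothetical illustrations, not values from the paper:

```python
# Viterbi decoding for a linear-chain CRF inference layer, as used on top
# of BiLSTM emission scores in BERT-Transformer-BiLSTM-CRF style models.
# All scores here are made-up examples, not values from the cited paper.

def viterbi_decode(emissions, transitions):
    """emissions: list of per-token dicts {tag: score}.
    transitions: dict {(prev_tag, tag): score}; missing pairs score 0.
    Returns the highest-scoring tag sequence."""
    tags = list(emissions[0].keys())
    # scores[t] = best score of any path ending in tag t at the current step
    scores = {t: emissions[0][t] for t in tags}
    backptr = []
    for emit in emissions[1:]:
        new_scores, ptr = {}, {}
        for t in tags:
            prev, s = max(
                ((p, scores[p] + transitions.get((p, t), 0.0)) for p in tags),
                key=lambda x: x[1],
            )
            new_scores[t] = s + emit[t]
            ptr[t] = prev
        backptr.append(ptr)
        scores = new_scores
    # Trace the best path backwards from the best final tag.
    best = max(scores, key=scores.get)
    path = [best]
    for ptr in reversed(backptr):
        path.append(ptr[path[-1]])
    return list(reversed(path))


# Hypothetical 3-token sentence with BIO tags; the transition penalty on
# (O, I) encodes the constraint that I cannot follow O.
emissions = [
    {"B": 2.0, "I": 0.0, "O": 1.0},
    {"B": 0.0, "I": 2.0, "O": 1.5},
    {"B": 0.0, "I": 0.5, "O": 2.0},
]
transitions = {("O", "I"): -10.0, ("B", "I"): 1.0, ("I", "I"): 1.0}
print(viterbi_decode(emissions, transitions))  # → ['B', 'I', 'O']
```

In the full model, the transition table is a learned parameter and the emission scores come from the BiLSTM's projection layer; decoding itself is unchanged.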
“…Even though the current performance in data-constrained settings is not ideal, there is much work to be carried out towards better model performance via model refinement, optimisation, and augmentation, e.g. the more complex structures and system combinations used by Wang and Su (2022); Gan et al (2021); Yan et al (2019).…”
Section: Discussion
confidence: 99%
“…To facilitate this research, we adopt the off-the-shelf open-source Bioformer 5 from HuggingFace (Wolf et al, 2019), which is one of the latest PLMs in the biomedical domain and reported comparable and even better scores than BioBERT and PubMed- and Wang, 2019; Gan et al, 2021; Zheng et al, 2021), which applied Transformer and CRF combined models for spoken language understanding, Chinese NER, and power meter NER 6 . However, none of the above-mentioned work has applied TransformerCRF models to clinical-domain text mining, which is the focus of this paper.…”
Section: Related Work
confidence: 99%
“…Liu et al [39] proposed the BERT-BiLSTM-CRF recognition method and applied it to research on citrus pests and diseases. Gan et al [40] proposed the BERT-Transformer-BiLSTM-CRF model to handle the challenges posed by pronouns and polysemous words in Chinese NER. Li et al [41] addressed the issue of excessive parameters and long training times in BERT by introducing the BERT-IDCNN-CRF model.…”
Section: Methods Based On Deep Learning
confidence: 99%