2019 IEEE International Conference on Bioinformatics and Biomedicine (BIBM)
DOI: 10.1109/bibm47256.2019.8983370

Fine-tuning BERT for Joint Entity and Relation Extraction in Chinese Medical Text

Abstract: Entity and relation extraction is a necessary step in structuring medical text. However, the feature-extraction ability of the bidirectional long short-term memory network in existing models does not achieve the best effect. At the same time, language models have achieved excellent results in more and more natural language processing tasks. In this paper, we present a focused attention model for the joint entity and relation extraction task. Our model integrates the well-known BERT language model into joint…
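The abstract describes a joint architecture in which a single BERT encoder is shared between the entity and relation tasks. A minimal sketch of that shared-encoder idea, using random NumPy weights in place of a real pre-trained BERT (all names, dimensions, and the two task heads here are illustrative assumptions, not the paper's implementation):

```python
import numpy as np

rng = np.random.default_rng(0)

HIDDEN = 32        # encoder hidden size (toy value; BERT-base uses 768)
N_ENTITY_TAGS = 5  # e.g. BIO tags for entity mentions
N_RELATIONS = 3    # e.g. relation types between entity pairs

def shared_encoder(token_ids):
    # Stand-in for a shared BERT encoder: maps token ids to contextual vectors.
    return rng.standard_normal((len(token_ids), HIDDEN))

# Task-specific heads on top of the SAME encoder output.
W_ner = rng.standard_normal((HIDDEN, N_ENTITY_TAGS))
W_rel = rng.standard_normal((2 * HIDDEN, N_RELATIONS))

def joint_forward(token_ids, head_idx, tail_idx):
    h = shared_encoder(token_ids)                   # (seq_len, HIDDEN)
    ner_logits = h @ W_ner                          # per-token entity tag scores
    pair = np.concatenate([h[head_idx], h[tail_idx]])
    rel_logits = pair @ W_rel                       # relation scores for one entity pair
    return ner_logits, rel_logits

ner_logits, rel_logits = joint_forward([101, 2345, 6789, 102], head_idx=1, tail_idx=2)
print(ner_logits.shape, rel_logits.shape)
```

Because both heads read the same contextual representation, gradients from the relation task can also improve the entity task, which is the point of joint training.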

Cited by 70 publications (46 citation statements)
References 18 publications
“…Machine learning can, therefore, be applied to named entity recognition to help with automatic tagging [27]. Recent works applying BERT to the NER task [28][29][30] consider BERT-NER in an open domain, on the Portuguese-language HAREM I corpus, and on Chinese medical text, respectively. In our work, we focus on a tourism data set.…”
Section: Deep Learning and BERT
confidence: 99%
“…They also compared feature-based and fine-tuning-based strategies. Xue et al. [49] fine-tuned the BERT model to focus on the words relevant to the NER and relation extraction tasks in medical texts. For both tasks, a shared parameter layer was employed.…”
Section: Contextualized Embeddings Based Models
confidence: 99%
“…On the other hand, many recent state-of-the-art architectures used a CRF layer after a contextual language model [31], [45], [49], [51]. Both Li [31] and Yuan et al. [51] used a BERT-CRF model for sequence labelling tasks.…”
Section: F. Conditional Random Field
confidence: 99%
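The BERT-CRF pattern mentioned above places a CRF on top of the encoder's per-token emission scores, so tag transitions are scored jointly and decoding picks the globally best tag sequence via the Viterbi algorithm. A hedged sketch of that decoding step (toy scores and tag sets, illustrative only, not any cited system's code):

```python
import numpy as np

def viterbi_decode(emissions, transitions):
    """Return the highest-scoring tag sequence.

    emissions:   (seq_len, n_tags) per-token scores from the encoder
    transitions: (n_tags, n_tags)  score of moving from tag i to tag j
    """
    seq_len, n_tags = emissions.shape
    score = emissions[0].copy()                # best score ending in each tag
    backptr = np.zeros((seq_len, n_tags), dtype=int)
    for t in range(1, seq_len):
        # score of every (prev_tag -> cur_tag) extension
        cand = score[:, None] + transitions + emissions[t][None, :]
        backptr[t] = cand.argmax(axis=0)
        score = cand.max(axis=0)
    # follow back-pointers from the best final tag
    best = [int(score.argmax())]
    for t in range(seq_len - 1, 0, -1):
        best.append(int(backptr[t][best[-1]]))
    return best[::-1]

# Toy example: 3 tokens, 2 tags; transitions strongly discourage tag 0 -> tag 0,
# so the CRF avoids the greedy per-token choice [0, 0, 0].
emissions = np.array([[2.0, 1.0], [2.0, 1.9], [2.0, 1.0]])
transitions = np.array([[-5.0, 0.0], [0.0, 0.0]])
print(viterbi_decode(emissions, transitions))  # → [0, 1, 0]
```

This is why a CRF layer helps sequence labelling: invalid or unlikely tag transitions (e.g. I-tag after O in a BIO scheme) are penalized at the sequence level rather than per token.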
“…The joint model has been extensively researched in various NLP tasks, for example, joint entity and relation extraction [37], joint event extraction and visualization [38], joint event detection and summarization [5, 39], joint event detection and prediction [23], and joint parsing and named entity recognition [40]. The key to joint models is designing shared features to capture mutual information between the integrated tasks.…”
Section: Related Work
confidence: 99%