2020
DOI: 10.3390/app10186429
DeNERT-KG: Named Entity and Relation Extraction Model Using DQN, Knowledge Graph, and BERT

Abstract: Along with studies on artificial intelligence technology, research is also being carried out actively in the field of natural language processing to understand and process people’s language, in other words, natural language. For computers to learn on their own, the skill of understanding natural language is very important. There are a wide variety of tasks involved in the field of natural language processing, but we would like to focus on the named entity recognition and relation extraction task, which is con…

Cited by 21 publications (6 citation statements)
References 23 publications
“…Bidirectional encoder representation from transformers (BERT) is a pretrained language model [43]. Among these transformers, the algorithm framework can capture the bidirectional relationship in words and sentences [44].…”
Section: Methods
confidence: 99%
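As a toy illustration of the bidirectional property described in the statement above (a hypothetical sketch, not the cited authors' code): a left-to-right language model conditions only on preceding tokens, while a bidirectional encoder such as BERT conditions on both sides of a masked position.

```python
def left_to_right_context(tokens, i):
    """Context visible to a unidirectional (left-to-right) model at position i."""
    return tokens[:i]

def bidirectional_context(tokens, i):
    """Context visible to a bidirectional encoder at position i
    (the token itself is masked out, as in masked language modeling)."""
    return tokens[:i] + tokens[i + 1:]

sentence = ["the", "bank", "raised", "interest", "rates"]
# Predicting "raised": a unidirectional model sees only ["the", "bank"];
# a bidirectional one also sees ["interest", "rates"], which helps
# disambiguate words like "bank".
```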
“…Bidirectional encoder representation from transformers (BERT) is a pretrained language model 43 . Among these transformers, the algorithm framework can capture the bidirectional relationship in words and sentences 44 .…”
Section: Methodsmentioning
confidence: 99%
“…We tested state-of-the-art language models like KG-BERT [60] and KGLM [61] that have better MR metrics compared to the graph-embedding models and are generalizable to unseen entities or relations [62]. For example, we obtained an MR of 191 on the validation set by fine-tuning the KG-BERT architecture with the BioBERT as a pre-trained backbone instead of the BERT, which is a significant improvement over the RotatE MR of 1,139.…”
Section: Discussion
confidence: 99%
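For context, the mean rank (MR) metric quoted in the statement above is the average 1-based rank of the correct candidate under the model's scores, so lower is better. A minimal sketch (function and variable names are hypothetical, not tied to the KG-BERT code):

```python
def mean_rank(score_lists, gold_indices):
    """Average 1-based rank of the gold candidate; lower is better.

    score_lists  -- one list of candidate scores per query (higher = better)
    gold_indices -- index of the correct candidate for each query
    """
    ranks = []
    for scores, gold in zip(score_lists, gold_indices):
        # Sort candidate indices by descending score, then locate the gold one.
        order = sorted(range(len(scores)), key=lambda i: -scores[i])
        ranks.append(order.index(gold) + 1)
    return sum(ranks) / len(ranks)

# Two queries: gold ranked 1st in the first, 2nd in the second -> MR = 1.5
mr = mean_rank([[0.9, 0.1, 0.3], [0.2, 0.7, 0.5]], [0, 2])
```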
“…Therefore, many deep learning-based biomedical named entity recognition techniques have emerged [6], mainly including some mainstream models, such as the BiLSTM-CRF model [7] and the BiLSTM-CNN-CRF model [8], etc., and advanced models improved on the basis of mainstream models, such as the TaughtNet model [9], the DeNERT-KG model [10], and models that incorporate machine reading comprehension [11,12]. With the development of pre-trained language models, large-scale language models pre-trained on biomedical corpora and fine-tuned on the BioNER dataset [13,14] have great potential to improve biomedical named entity recognition, and can be improved as the availability of training data increases.…”
Section: Introduction
confidence: 99%
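The BiLSTM-CRF family mentioned in the statement above decodes the best tag sequence with the Viterbi algorithm over per-token emission scores and tag-to-tag transition scores. A self-contained sketch of that decoding step with toy scores (not any cited model's actual code):

```python
def viterbi_decode(emissions, transitions):
    """Return the highest-scoring tag sequence.

    emissions   -- emissions[t][j]: score of tag j at position t
    transitions -- transitions[i][j]: score of moving from tag i to tag j
    """
    n_tags = len(emissions[0])
    dp = list(emissions[0])          # best score of any path ending in each tag
    backptr = []                     # back pointers for path recovery
    for em in emissions[1:]:
        new_dp, ptrs = [], []
        for j in range(n_tags):
            # Best previous tag to transition into tag j.
            best_i = max(range(n_tags), key=lambda i: dp[i] + transitions[i][j])
            new_dp.append(dp[best_i] + transitions[best_i][j] + em[j])
            ptrs.append(best_i)
        dp = new_dp
        backptr.append(ptrs)
    # Follow back pointers from the best final tag.
    best_last = max(range(n_tags), key=lambda t: dp[t])
    path = [best_last]
    for ptrs in reversed(backptr):
        path.append(ptrs[path[-1]])
    path.reverse()
    return path

# With zero transition scores the decoder just follows the emissions:
# viterbi_decode([[1, 0], [0, 1], [1, 0]], [[0, 0], [0, 0]]) -> [0, 1, 0]
```

In a real BiLSTM-CRF, the emission scores come from the BiLSTM and the transition matrix is learned jointly with it; the decoding logic is the same.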