2022
DOI: 10.1186/s12911-022-01977-5
BertSRC: transformer-based semantic relation classification

Abstract: The relationship between biomedical entities is complex, and many of them have not yet been identified. For many biomedical research areas, including drug discovery, it is of paramount importance to identify the relationships that have already been established through a comprehensive literature survey. However, manually searching the literature is difficult as the volume of biomedical publications continues to increase. Therefore, the relation classification task, which automatically mines meaningful relati…

Cited by 6 publications (3 citation statements)
References 41 publications
“…BERT-GMAN [11] proposed a relation extraction model based on a BERT-gated multi-window attention network, which achieved good results on the SemEval-2010 Task 8 dataset. BertSRC [42] proposed a BERT-based model for extracting entity relations in the medical field and achieved good results on a domain-specific dataset. AugFake-BERT [43] proposed a BERT-based data augmentation model.…”
Section: Methods (mentioning, confidence: 99%)
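The statement above describes these BERT-based relation classifiers only at a high level. The sketch below illustrates the general pattern they share (entity-marker tokens plus a sequence classification head), assuming the Hugging Face transformers API; the checkpoint, marker tokens, and label set are illustrative assumptions, not BertSRC's actual configuration.

```python
# Minimal sketch of BERT-based relation classification with entity markers.
# Checkpoint, marker tokens, labels, and example sentence are assumptions.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

MODEL_NAME = "bert-base-uncased"  # a biomedical checkpoint could be swapped in
RELATION_LABELS = ["no_relation", "treats", "causes"]  # hypothetical label set

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForSequenceClassification.from_pretrained(
    MODEL_NAME, num_labels=len(RELATION_LABELS)
)

# Wrap the two entities in marker tokens so the encoder can locate them.
markers = ["[E1]", "[/E1]", "[E2]", "[/E2]"]
tokenizer.add_special_tokens({"additional_special_tokens": markers})
model.resize_token_embeddings(len(tokenizer))

sentence = "[E1] Aspirin [/E1] is commonly used to treat [E2] headache [/E2] ."
inputs = tokenizer(sentence, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # shape: (1, num_labels)

predicted = RELATION_LABELS[logits.argmax(dim=-1).item()]
print(predicted)  # meaningless before fine-tuning; training loop omitted here
```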
“…These DNN models learn higher-order, abstract feature representations from sentences. With the emergence of DNNs, models that employ neural architectures such as convolutional neural networks (CNNs; Liu et al., 2013; dos Santos et al., 2015), recurrent neural networks (RNNs; Zhang and Wang, 2015; Vu et al., 2016), graph convolutional networks (GCNs; Zhu et al., 2019), attention-based neural networks (Wang et al., 2016; Xiao and Liu, 2016), and transformer-based language models (Vaswani et al., 2017; Lee et al., 2022) have been utilized for RE tasks. Like traditional ML-based models, DNN-based models learn features from data.…”
Section: Relation Extraction (mentioning, confidence: 99%)
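To make "learned feature representations" concrete, the sketch below pulls the contextual embeddings a pretrained transformer produces for a sentence; these are the kind of features DNN-based RE models build classifiers on. The checkpoint and the [CLS]-pooling choice are illustrative assumptions, not tied to any cited model.

```python
# Sketch: contextual sentence features from a pretrained transformer.
# Checkpoint and pooling strategy are illustrative assumptions.
import torch
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
encoder = AutoModel.from_pretrained("bert-base-uncased")

inputs = tokenizer("EGFR mutations are associated with lung cancer.",
                   return_tensors="pt")
with torch.no_grad():
    outputs = encoder(**inputs)

# Per-token contextual embeddings: (batch, seq_len, hidden_size)
token_features = outputs.last_hidden_state
# A common sentence-level feature: the [CLS] token's embedding.
sentence_feature = token_features[:, 0, :]
print(token_features.shape, sentence_feature.shape)
```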
“…Two types of BERT models, BERT-Base and BERT-Large, are available [38]. Some articles covering various related modifications of BERT can be found in [39–42].…”
Section: Preliminary (mentioning, confidence: 99%)
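For reference, the two standard sizes differ in depth and width: BERT-Base has 12 layers with hidden size 768 (about 110M parameters), while BERT-Large has 24 layers with hidden size 1024 (about 340M parameters). A minimal sketch comparing them via the Hugging Face transformers configs; the comparison script itself is illustrative.

```python
# Sketch: comparing the two standard BERT sizes referenced above.
# Expected: bert-base ~110M params (12 layers, hidden 768),
#           bert-large ~340M params (24 layers, hidden 1024).
from transformers import AutoConfig, AutoModel

for name in ["bert-base-uncased", "bert-large-uncased"]:
    config = AutoConfig.from_pretrained(name)
    model = AutoModel.from_pretrained(name)
    n_params = sum(p.numel() for p in model.parameters())
    print(f"{name}: {config.num_hidden_layers} layers, "
          f"hidden size {config.hidden_size}, {n_params / 1e6:.0f}M params")
```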