2022
DOI: 10.3390/app12136361

Extraction of Joint Entity and Relationships with Soft Pruning and GlobalPointer

Abstract: In recent years, scholars have paid increasing attention to joint entity and relation extraction. However, the most difficult aspect of joint extraction is extracting overlapping triples. To address this problem, we propose a joint extraction model based on Soft Pruning and GlobalPointer, SGNet for short. First, the BERT pretraining model is used to obtain a text word-vector representation with contextual information, and then the local and non-local information of the word vector is obtained…

Cited by 7 publications (10 citation statements). References 43 publications.
“…BERT-GlobalPointer: In pointer networks designed for named entity recognition, two modules are usually used to identify the head and the tail of an entity respectively, which leads to inconsistency between training and prediction. GlobalPointer [21] treats both ends as a whole to deal with such inconsistencies. Therefore, GlobalPointer has a more global view.…”
Section: Methods
confidence: 99%
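The span-level view described in the quote above can be sketched as a toy in numpy. This is an illustrative simplification, not the paper's exact formulation (GlobalPointer as published also uses rotary position embeddings); the projection matrices `Wq`, `Wk` and the decoding threshold here are assumptions for demonstration.

```python
import numpy as np

def global_pointer_scores(h, Wq, Wk):
    # h: (seq_len, d) contextual token embeddings (e.g. from BERT).
    # Instead of two separate head/tail classifiers, score every
    # (start, end) pair jointly, so training and prediction stay consistent.
    q = h @ Wq            # start-side representations
    k = h @ Wk            # end-side representations
    scores = q @ k.T      # scores[i, j] = joint score of the span i..j
    valid = np.triu(np.ones_like(scores, dtype=bool))  # require start <= end
    return np.where(valid, scores, -np.inf)

def decode_entities(scores, threshold=0.0):
    # An entity is any span whose joint score exceeds the threshold.
    starts, ends = np.where(scores > threshold)
    return list(zip(starts.tolist(), ends.tolist()))

# Toy usage with identity projections:
h = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
spans = decode_entities(global_pointer_scores(h, np.eye(2), np.eye(2)),
                        threshold=0.5)
```

Because each candidate span gets one score, the head and tail are never predicted independently, which is the "global view" the citing authors point to.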
“…There are two mainstream deep learning-based EE approaches: (1) pipeline-based approaches [3,10,21,22,24,28,39,45], which first perform ETI and then identify arguments based on the results of ETI; and (2) joint-based approaches [16,25,29,31,44,47,49], which treat EE as a structure extraction task and predict the event type and the corresponding arguments at the same time.…”
Section: Related Work
confidence: 99%
“…A heterogeneous graph attention network is constructed with three types of nodes (word nodes, relation nodes, and subject nodes) to learn and enhance the embedded representations used for relation extraction, combined with a gating mechanism to control the updating of the nodes. The experimental results demonstrate that the ERHGA can effectively solve the relational triple extraction problem and outperforms all baselines [1][2][3][8][9][10][11][12][13][14][15].…”
Section: Introduction
confidence: 95%
“…RMAN [14] applied two multi-head attention layers to add extra semantic information. A Gaussian graph generator and an attention-guiding layer were utilized to address the lack of information, and the GlobalPointer decoder was used by Liang et al. [15] to tackle the problematic extraction of overlapping triples.…”
Section: ERE-Related Work
confidence: 99%