2023
DOI: 10.1007/s13278-023-01095-8
Relation extraction: advancements through deep learning and entity-related features

Abstract: Capturing semantics and structure surrounding the target entity pair is crucial for relation extraction. The task is challenging due to the limited semantic elements and structural features of the target entity pair within a sentence. To tackle this problem, this paper introduces an approach that fuses entity-related features under convolutional neural networks and graph convolution neural networks. Our approach combines the unit features of the target entity pair to generate corresponding fusion features and …
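The fusion idea described in the abstract can be sketched roughly as follows. All names, dimensions, and the pooling choice below are illustrative assumptions, not details taken from the paper: per-token encodings from a CNN branch (local context) and a GCN branch (dependency structure) are combined into a single representation for the target entity pair.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical per-token encodings (dimensions are assumptions):
# a CNN branch capturing local n-gram context and a GCN branch
# capturing graph structure over the sentence.
seq_len, cnn_dim, gcn_dim = 12, 64, 64
cnn_out = rng.standard_normal((seq_len, cnn_dim))
gcn_out = rng.standard_normal((seq_len, gcn_dim))

head_idx, tail_idx = 2, 9  # positions of the target entity pair

def entity_unit_features(enc, idx):
    """Unit feature for one entity: its own encoding concatenated with
    a max-pooled sentence summary (one plausible reading of the
    abstract's 'unit features'; the exact definition is not given)."""
    return np.concatenate([enc[idx], enc.max(axis=0)])

# Fuse both branches for both entities into one relation representation,
# which would then feed a relation classifier.
fused = np.concatenate([
    entity_unit_features(cnn_out, head_idx),
    entity_unit_features(cnn_out, tail_idx),
    entity_unit_features(gcn_out, head_idx),
    entity_unit_features(gcn_out, tail_idx),
])
print(fused.shape)  # (512,)
```

Concatenation is only one possible fusion operator; the paper's actual combination of unit features may differ.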

Cited by 7 publications (2 citation statements)
References 30 publications (16 reference statements)
“…This gives them strong generalization ability, adaptability, and scalability. However, training or fine-tuning them for downstream applications such as RE requires labeled data (Zhao et al., 2023). Since we have limited labeled data, supervised methods such as those mentioned above cannot be applied in our study.…”
Section: Related Work
confidence: 99%
“…With the emergence of DL, models that employ neural architectures such as convolutional neural networks (CNNs) (Liu et al., 2013; dos Santos et al., 2015), recurrent neural networks (RNNs) (Vu et al., 2016; Zhang and Wang, 2015), graph convolutional networks (GCNs) (Zhu et al., 2019), attention-based neural networks (Wang et al., 2016; Xiao and Liu, 2016), and transformer-based language models (Vaswani et al., 2017; Le Guillarme and Thuiller, 2022) have been utilized for RE tasks. However, similar to traditional ML models, training or fine-tuning DL models for downstream applications such as RE also requires labeled data (Zhao et al., 2023).…”
Section: Related Work
confidence: 99%