Findings of the Association for Computational Linguistics: ACL 2022 2022
DOI: 10.18653/v1/2022.findings-acl.62
A Simple yet Effective Relation Information Guided Approach for Few-Shot Relation Extraction

Abstract: Few-Shot Relation Extraction aims at predicting the relation for a pair of entities in a sentence by training with a few labelled examples in each relation. Some recent works have introduced relation information (i.e., relation labels or descriptions) to assist model learning based on Prototype Network. However, most of them constrain the prototypes of each relation class implicitly with relation information, generally through designing complex network structures, like generating hybrid features, combining wit…

Cited by 16 publications (8 citation statements). References 9 publications (9 reference statements).
“…This approach optimized the measurement between sentences and abstracted the core characteristics of relation classes by inferring prototypes, thereby further enhancing the performance of relation extraction. Liu et al. [29] introduced a straightforward yet powerful approach incorporating relation information into the prototypical network. The fundamental concept is to incorporate relation representations through a direct addition operation rather than designing intricate structures.…”
Section: Related Work
confidence: 99%
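The direct-addition idea described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the embedding dimensions, the episode shape, and the function names (`build_prototypes`, `classify`) are hypothetical, and random vectors stand in for encoder outputs.

```python
import numpy as np

def build_prototypes(support_emb, relation_emb):
    """Class prototypes as the mean of support-sentence embeddings,
    with the relation embedding added directly (no extra fusion network).

    support_emb:  (n_way, k_shot, dim) embeddings of support sentences
    relation_emb: (n_way, dim) embeddings of relation labels/descriptions
    """
    prototypes = support_emb.mean(axis=1)   # (n_way, dim)
    return prototypes + relation_emb        # direct addition

def classify(query_emb, prototypes):
    """Assign each query to its nearest prototype by Euclidean distance."""
    dists = np.linalg.norm(
        query_emb[:, None, :] - prototypes[None, :, :], axis=-1
    )                                       # (n_query, n_way)
    return dists.argmin(axis=1)

# Toy 5-way 3-shot episode with random stand-in embeddings.
rng = np.random.default_rng(0)
support = rng.normal(size=(5, 3, 16))
relations = rng.normal(size=(5, 16))
queries = rng.normal(size=(10, 16))

protos = build_prototypes(support, relations)
preds = classify(queries, protos)
print(preds.shape)  # (10,)
```

The point of the sketch is the single `+` in `build_prototypes`: the relation representation constrains the prototype explicitly, with no hybrid-feature generator or auxiliary network in between.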
“…(9) DAPL [53]: Utilizes the shortest dependency path information between entities in the prototype network. (10) SimpleFSRE [18]: Concatenates the two representations of relational information and directly incorporates them into the prototype representation. (11) CP [48]: Utilizes the entity-masking contrastive pre-training framework by randomly masking entity mentions.…”
Section: Baselines
confidence: 99%
“…HCRP [17] employs relation-prototype contrastive learning to better leverage relation information and obtain diverse, discriminative prototype representations. SimpleFSRE [18] improves model performance by concatenating two representations of the relation information and directly incorporating them into the prototype representation. PRM [19] combines a gating mechanism to utilize relation description information, determining the degree to which the prototype and the relation information are preserved or updated.…”
Section: Introduction
confidence: 99%
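The gating mechanism attributed to PRM above can be illustrated with a small sketch. This is an assumed formulation, not PRM's actual network: the gate parameters `W_g`, `b_g` and the function name `gated_update` are hypothetical, and a sigmoid gate over the concatenated inputs stands in for whatever learned gate the paper uses.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gated_update(prototype, relation_emb, W_g, b_g):
    """Fuse a prototype with a relation-description embedding via a gate.

    The gate g is computed from the concatenated inputs; it controls how
    much of the prototype is preserved vs. updated from the relation info.
    """
    g = sigmoid(np.concatenate([prototype, relation_emb]) @ W_g + b_g)
    return g * prototype + (1.0 - g) * relation_emb

# Toy vectors standing in for encoder outputs.
rng = np.random.default_rng(0)
dim = 8
W_g = rng.normal(size=(2 * dim, dim))
b_g = np.zeros(dim)
proto = rng.normal(size=dim)
rel = rng.normal(size=dim)

fused = gated_update(proto, rel, W_g, b_g)
print(fused.shape)  # (8,)
```

Because g lies in (0, 1), each fused coordinate is a convex combination of the prototype and relation embedding, which is what "degree of preservation vs. update" means concretely.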
“…Many approaches (Qu et al., 2020) incorporate external knowledge to improve performance given the scarcity of training data. Another line of FSRE research (Gao et al., 2019a; Liu et al., 2022b) relies solely on the input text and the provided relation description information, without incorporating external knowledge. Most previous methods adopt complicated neural network designs or introduce external knowledge, which can be labor-intensive in realistic scenarios.…”
Section: Related Work
confidence: 99%