2020
DOI: 10.1609/aaai.v34i05.6281

Neural Snowball for Few-Shot Relation Learning

Abstract: Knowledge graphs typically undergo open-ended growth of new relations. This is not handled well by relation extraction methods that focus on pre-defined relations with sufficient training data. To address new relations with few-shot instances, we propose a novel bootstrapping approach, Neural Snowball, which learns new relations by transferring semantic knowledge about existing relations. More specifically, we use Relational Siamese Networks (RSN) to learn the metric of relational similarity between instances based…
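The abstract describes RSN as learning a metric of relational similarity between pairs of instances. A minimal sketch of that metric-learning idea, with an invented weighted-difference scorer standing in for the paper's learned Siamese network (the function name, encodings, and weights here are all hypothetical illustrations, not the authors' implementation):

```python
import numpy as np

def relational_similarity(x, y, w):
    """Score in (0, 1) for whether two instance encodings x and y
    express the same relation; a stand-in for a learned Siamese head."""
    diff = np.abs(x - y)          # element-wise distance of the encodings
    z = -np.dot(w, diff)          # learned weights w penalize each dimension
    return 1.0 / (1.0 + np.exp(-z))

# Toy encodings: under the same weights, closer pairs score higher
x = np.array([0.2, 0.5, 0.1])
w = np.ones(3)
sim_same = relational_similarity(x, x, w)        # maximal score here
sim_diff = relational_similarity(x, x + 5.0, w)  # near zero
```

In the snowball setting, such a score would be thresholded to decide whether a candidate instance joins the growing set for a new relation.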

Cited by 51 publications (26 citation statements)
References 18 publications
“…Meta-learning-based methods [11,39,40] have shown tremendous success for few-shot learning in many tasks such as few-shot image generation [47], image classification [56], and domain adaptation [63]. Following the success of such approaches, few-shot learning in NLP has been investigated for tasks such as text classification [14,59,67], entity-relation extraction [13,34], and few-shot slot filling [12,21,33]. The authors in [33] exploited regular expressions for few-shot slot filling, Prototypical Networks were employed in [12], and the authors in [21] extended the CRF model by introducing collapsed dependency transitions to transfer label dependency patterns.…”
Section: Related Work
confidence: 99%
“…Due to the scarcity of labeled training data, few-shot learning faces the problem of over-fitting. Existing methods to overcome over-fitting include: (i) model-based methods that explore how to reduce the hypothesis space of the few-shot task (Triantafillou et al., 2017; Hu et al., 2018), (ii) data-based methods that try to augment the few-shot set with additional data (Benaim & Wolf, 2018; Gao et al., 2020b), and (iii) algorithm-based solutions that aim to improve strategies for searching for the best hypothesis. Recently, a new paradigm introducing prompts has achieved promising results for few-shot language learning, as shown by GPT-3 (Brown et al., 2020), PET (Schick & Schütze, 2020), and LM-BFF (Gao et al., 2020a).…”
Section: Few-shot Learning
confidence: 99%
“…The ProtoNet [199] is popular in metric-based LSL RE/C tasks [56,72,252], including designing multi-level attention schemes to highlight the crucial instances and features [72], adopting a large-margin ProtoNet with fine-grained sentence features [56], integrating external descriptions to enhance the original ProtoNet [247], and using a multi-level structure to encode the metric prototype through various stages [252]. Metric-based LSL has also been integrated with semi-supervised learning, exploring novel relations from the seed relation set with a relational metric and snowball strategy [73].…”
Section: III
confidence: 99%
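The ProtoNet approach discussed in the statement above reduces to a simple computation: average the support embeddings of each class into a prototype, then assign a query to its nearest prototype. A minimal sketch under that assumption, with invented relation labels and toy 2-d encodings (not from any cited paper):

```python
import numpy as np

def prototypes(support, labels):
    """Mean embedding per class -- the ProtoNet 'prototype'."""
    classes = sorted(set(labels))
    protos = np.stack([
        np.mean([s for s, l in zip(support, labels) if l == c], axis=0)
        for c in classes
    ])
    return classes, protos

def classify(query, support, labels):
    """Assign the query to the class of its nearest prototype
    (squared Euclidean distance, as in the original ProtoNet)."""
    classes, protos = prototypes(support, labels)
    dists = np.sum((protos - query) ** 2, axis=1)
    return classes[int(np.argmin(dists))]

# Toy 2-way, 2-shot episode with hypothetical relation labels
support = np.array([[0.0, 0.0], [0.0, 1.0], [5.0, 5.0], [5.0, 6.0]])
labels = ["born_in", "born_in", "works_for", "works_for"]
pred = classify(np.array([0.0, 0.4]), support, labels)  # -> "born_in"
```

The multi-level attention and large-margin variants cited above replace the plain mean and Euclidean distance with learned weightings, but keep this prototype-then-nearest structure.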