Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing, 2021.
DOI: 10.18653/v1/2021.acl-long.20
Refining Sample Embeddings with Relation Prototypes to Enhance Continual Relation Extraction

Abstract: Continual learning has gained increasing attention in recent years, thanks to its biological interpretation and efficiency in many real-world applications. As a typical task of continual learning, continual relation extraction (CRE) aims to extract relations between entities from texts, where the samples of different relations are delivered to the model continuously. Some previous works have shown that storing typical samples of old relations in memory can help the model keep a stable understanding of old relations…
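The title's central idea, refining sample embeddings with relation prototypes, can be sketched in a few lines. This is a minimal illustration under stated assumptions, not the paper's implementation: it assumes a prototype is the mean of a relation's memorized sample embeddings, and it stands in for the paper's refinement module with a simple interpolation toward the most similar prototype.

```python
import numpy as np

def relation_prototype(memorized: np.ndarray) -> np.ndarray:
    # Assumed prototype: the mean of the sample embeddings stored
    # in memory for one relation (shape: [n_memorized, dim]).
    return memorized.mean(axis=0)

def refine_embedding(sample: np.ndarray, prototypes: np.ndarray,
                     alpha: float = 0.5) -> np.ndarray:
    # Stand-in for the paper's refinement step: pull the sample
    # embedding toward its most similar prototype (dot-product
    # similarity). `alpha` is an assumed mixing weight.
    nearest = prototypes[np.argmax(prototypes @ sample)]
    return alpha * sample + (1 - alpha) * nearest
```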

Cited by 22 publications (40 citation statements). References 32 publications.
“…In order to mitigate catastrophic forgetting in continual relation extraction, episodic memory modules have been used in previous work (Wang et al., 2019; Han et al., 2020; Cui et al., 2021) to store a small number of samples from historical tasks. Inspired by Cui et al. (2021), we store several representative samples for each relation. Therefore, the episodic memory module for the observed relations in…”
Section: Problem Formulation
Confidence: 99%
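A minimal sketch of the episodic memory module this statement describes, assuming a fixed per-relation budget; the class and method names are illustrative, not from the cited papers.

```python
from collections import defaultdict

class EpisodicMemory:
    """Fixed-budget store of representative samples per observed
    relation, replayed when later tasks are trained (sketch)."""

    def __init__(self, samples_per_relation: int = 10):
        self.budget = samples_per_relation
        self.store = defaultdict(list)   # relation id -> samples

    def add(self, relation: int, samples: list) -> None:
        # Keep at most `budget` representatives for each relation.
        self.store[relation] = list(samples)[: self.budget]

    def replay_set(self) -> list:
        # Memorized samples of all observed relations, mixed into
        # training on a new task to mitigate forgetting.
        return [s for group in self.store.values() for s in group]
```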
“…In order for the model not to forget knowledge of old tasks when it learns a new one, some samples need to be stored in M_r. Inspired by Han et al. (2020) and Cui et al. (2021), we use k-means to cluster the samples of each relation, where the number of clusters equals the number of samples to be stored per class. Then, for each cluster, the representation closest to the cluster center is selected and stored in memory.…”
Section: Selecting Typical Samples for Memory
Confidence: 99%
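The quoted selection procedure maps directly to code. A minimal sketch using scikit-learn, assuming each relation's samples are already encoded as embedding vectors; `n_store` stands in for the per-relation memory budget.

```python
import numpy as np
from sklearn.cluster import KMeans

def select_typical_samples(embeddings: np.ndarray, n_store: int) -> list:
    """Cluster one relation's sample embeddings into `n_store`
    clusters and pick, for each cluster, the sample closest to
    its center; returns indices into `embeddings`."""
    kmeans = KMeans(n_clusters=n_store, n_init=10).fit(embeddings)
    selected = []
    for center in kmeans.cluster_centers_:
        dists = np.linalg.norm(embeddings - center, axis=1)
        selected.append(int(np.argmin(dists)))
    return selected
```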