2021
DOI: 10.1109/access.2021.3071787
Lifelong Language Learning With the Most Forgotten Knowledge

Abstract: Lifelong language learning enables a language model to accumulate knowledge throughout training on a stream of text data. Recent research on lifelong language learning is based on samples of previous tasks drawn from an episodic memory or a generative model. LAMOL, a representative generative-model-based lifelong language learning model, preserves previous information with generated pseudo-old samples, which is suboptimal. In this paper, we propose an improved version of LAMOL, MFK-LAMOL, which constructs generativ…
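The abstract is truncated, but the title and the citing work below indicate that MFK-LAMOL prioritizes the *most forgotten* pseudo-samples during generative replay. As a minimal sketch (not the paper's actual method; the scoring and function names are illustrative assumptions), forgetting can be measured as the increase in a pseudo-sample's loss after training on a new task, and replay can then keep the top-scoring samples:

```python
def forgetting_score(loss_before: float, loss_after: float) -> float:
    # Illustrative definition: forgetting = how much the model's loss on a
    # pseudo-sample increased after training on the new task.
    return loss_after - loss_before


def select_most_forgotten(pseudo_samples, losses_before, losses_after, k):
    # Rank generated pseudo-samples by forgetting score (largest first)
    # and keep the top-k for replay alongside the new task's data.
    scored = sorted(
        zip(pseudo_samples, losses_before, losses_after),
        key=lambda t: forgetting_score(t[1], t[2]),
        reverse=True,
    )
    return [sample for sample, _, _ in scored[:k]]
```

Under this sketch, samples whose loss barely moved are dropped from the replay buffer, so the replay budget is spent where catastrophic forgetting is actually occurring.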

Cited by 1 publication (1 citation statement)
References 14 publications
“…L2KD (Chuang et al., 2020) and DnR (Sun et al., 2020b) use knowledge distillation to extend LAMOL. MFK-LAMOL (Choi and Kang, 2021) makes replay more efficient by using more forgotten pseudo-samples in generative replay. Rational-LAMOL (Kanwatchara et al., 2021) uses critical freezing guided by supervised or unsupervised rationale.…”
Section: Related Work
Confidence: 99%
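The citation statement mentions that L2KD and DnR extend LAMOL via knowledge distillation. Those papers' exact objectives are not reproduced here; as a generic sketch of soft-target distillation (temperature value and function names are illustrative assumptions), the student is trained to match the teacher's softened output distribution:

```python
import math


def softmax(logits, temperature=1.0):
    # Softened distribution: higher temperature spreads probability mass.
    exps = [math.exp(z / temperature) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]


def distillation_loss(teacher_logits, student_logits, temperature=2.0):
    # Cross-entropy between the teacher's softened distribution (target)
    # and the student's softened distribution over the same classes.
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    return -sum(pi * math.log(qi) for pi, qi in zip(p, q))
```

When student and teacher agree exactly, the loss reduces to the teacher distribution's entropy, which is its minimum over student outputs.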