2019
DOI: 10.48550/arxiv.1909.03329
Preprint

LAMOL: LAnguage MOdeling for Lifelong Language Learning

Fan-Keng Sun, Cheng-Hao Ho, Hung-Yi Lee

Abstract: Most research on lifelong learning (LLL) applies to images or games, but not language. Here, we introduce LAMOL, a simple yet effective method for LLL based on language modeling. LAMOL replays pseudo samples of previous tasks while requiring no extra memory or model capacity. Specifically, LAMOL is a language model that learns to solve the task and to generate training samples at the same time. At the beginning of training a new task, the model generates some pseudo samples of previous tasks to train alongside the…
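The replay scheme in the abstract can be made concrete with a short sketch. Below is a minimal, illustrative Python rendering of LAMOL-style pseudo-sample replay, assuming a GPT-2-style causal LM from Hugging Face transformers. The token name GEN_TOKEN, the sample_ratio parameter, and the single-loss training loop are assumptions for illustration, not the authors' released code (the paper combines a task loss and an LM-generation loss; a single causal-LM loss is used here for brevity).

```python
# Minimal sketch of LAMOL-style pseudo-sample replay (illustrative,
# not the authors' implementation). One LM both solves tasks and
# generates replay data, so no real examples need to be stored.
import torch
from transformers import GPT2LMHeadModel, GPT2Tokenizer

GEN_TOKEN = "[GEN]"  # assumed special token that prompts generation

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
tokenizer.add_special_tokens({"additional_special_tokens": [GEN_TOKEN]})
tokenizer.pad_token = tokenizer.eos_token
model = GPT2LMHeadModel.from_pretrained("gpt2")
model.resize_token_embeddings(len(tokenizer))

def generate_pseudo_samples(n, max_len=128):
    """Sample n pseudo examples of previously learned tasks by letting
    the LM continue from the generation token."""
    prompt = tokenizer(GEN_TOKEN, return_tensors="pt").input_ids
    outputs = model.generate(
        prompt.repeat(n, 1),
        do_sample=True, top_k=20, max_length=max_len,
        pad_token_id=tokenizer.eos_token_id,
    )
    return [tokenizer.decode(o, skip_special_tokens=True) for o in outputs]

def train_task(task_texts, sample_ratio=0.2, epochs=1, first_task=False):
    """Train on one task; before training, replay pseudo samples of
    earlier tasks (none for the very first task)."""
    replay = [] if first_task else generate_pseudo_samples(
        int(sample_ratio * len(task_texts)))
    optim = torch.optim.AdamW(model.parameters(), lr=5e-5)
    model.train()
    for _ in range(epochs):
        for text in task_texts + replay:
            batch = tokenizer(text, return_tensors="pt", truncation=True)
            loss = model(**batch, labels=batch.input_ids).loss
            loss.backward()
            optim.step()
            optim.zero_grad()
```

Because the same weights answer tasks and generate the replay data, the method needs no separate memory buffer and no added model capacity, which is the property the abstract emphasizes.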


Cited by 14 publications (18 citation statements) · References 15 publications (18 reference statements)
“…Yan et al. (2021) propose a new two-stage learning method that uses a dynamically expandable representation for more effective incremental conceptual modelling. Among these methods, memory-based methods are the most effective in NLP tasks (Wang et al., 2019; Sun et al., 2019; de Masson d'Autume et al., 2019). Inspired by the success of memory-based methods in the field of NLP, we use the framework of memory replay to learn new relations that are constantly emerging.…”
Section: Continual Learning
Citation type: mentioning; confidence: 99%
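For contrast with LAMOL's generative replay, the memory-based methods cited in this statement keep a small buffer of real examples from earlier tasks and interleave them with new-task data. A minimal sketch of such an episodic memory follows; the buffer size and uniform sampling policy are illustrative assumptions, not the cited papers' exact designs.

```python
# Illustrative episodic-memory replay buffer (assumed design, for
# contrast with LAMOL, which generates pseudo samples instead of
# storing real ones).
import random

class EpisodicMemory:
    def __init__(self, per_task=50):
        self.per_task = per_task  # real examples kept per finished task
        self.buffer = []          # (task_id, example) pairs

    def store(self, task_id, examples):
        """Keep a random subset of a finished task's real examples."""
        kept = random.sample(examples, min(self.per_task, len(examples)))
        self.buffer.extend((task_id, ex) for ex in kept)

    def replay(self, n):
        """Draw n stored examples to mix into the current task's batches."""
        drawn = random.sample(self.buffer, min(n, len(self.buffer)))
        return [ex for _, ex in drawn]
```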
“…These methods for LL mostly focus on tasks of the same type (referred to as domains in this work). Recently, Sun et al. (2019) proposed LAMOL, a general framework designed for lifelong language learning (LLL), in which the model must continually learn from different domains as well as different types of NLP tasks.…”
Section: Lifelong Learning
Citation type: mentioning; confidence: 99%
“…More recent methods attempt to learn from different types of tasks. These include LAMOL (Sun et al., 2019) and its improvements (Chuang et al., 2020; Sun et al., 2020; Kanwatchara et al., 2021). Despite the effectiveness of these methods in LLL, several limitations remain.…”
Section: Introduction
Citation type: mentioning; confidence: 99%