Proceedings of the 29th ACM International Conference on Information & Knowledge Management 2020
DOI: 10.1145/3340531.3412047

Continual Domain Adaptation for Machine Reading Comprehension

Abstract: Machine reading comprehension (MRC) has become a core component in a variety of natural language processing (NLP) applications such as question answering and dialogue systems. A practical challenge is that an MRC model often needs to learn in non-stationary environments, in which the underlying data distribution changes over time. A typical scenario is domain drift, i.e., different domains of data arrive one after another, and the MRC model is required to adapt to each new domain while maintaining previous…

Cited by 7 publications (9 citation statements)
References 28 publications
“…Talmor and Berant (2019), Khashabi et al. (2020) and Lourie et al. (2021) improve generalization by training on multiple datasets. Su et al. (2020) introduce Adapters (Houlsby et al., 2019) to accommodate each domain. These methods require a considerable amount of annotated data to work.…”
Section: Related Work
confidence: 99%
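For context, the Adapters of Houlsby et al. (2019) are small bottleneck layers inserted into a frozen pretrained transformer, so only a handful of domain-specific parameters are trained per domain. A minimal PyTorch sketch of one such adapter block (class and dimension names are illustrative, not taken from the cited papers):

```python
import torch
import torch.nn as nn

class Adapter(nn.Module):
    """Bottleneck adapter in the style of Houlsby et al. (2019):
    down-project, non-linearity, up-project, residual connection.
    Only these parameters are trained for a new domain; the
    pretrained transformer weights stay frozen.
    (Illustrative sketch, not the cited paper's code.)"""

    def __init__(self, hidden_dim: int, bottleneck_dim: int = 64):
        super().__init__()
        self.down = nn.Linear(hidden_dim, bottleneck_dim)
        self.up = nn.Linear(bottleneck_dim, hidden_dim)
        self.act = nn.GELU()

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Residual connection keeps the frozen backbone's signal intact.
        return x + self.up(self.act(self.down(x)))
```

In a continual setting, one adapter set would be kept per domain and the matching one selected at inference time, avoiding retraining of the full model.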
“…Existing studies of continual MRC can mainly be divided into three categories. The first is model expansion techniques that design a domain-individual classifier for each incoming domain (Su et al., 2020). However, this is expensive and impractical in real-world settings.…”
Section: MRC
confidence: 99%
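A hedged sketch of what such model expansion could look like for extractive MRC: a shared encoder plus one span-prediction head per incoming domain. All names here (ExpandableMRCModel, add_domain) are hypothetical; the cited work's actual architecture may differ.

```python
import torch
import torch.nn as nn

class ExpandableMRCModel(nn.Module):
    """Illustrative model-expansion scheme: a shared encoder plus one
    span-prediction head per seen domain. Assumes the encoder maps
    input ids to hidden states of shape (batch, seq, hidden)."""

    def __init__(self, encoder: nn.Module, hidden_dim: int):
        super().__init__()
        self.encoder = encoder
        self.heads = nn.ModuleDict()  # one classifier head per domain
        self.hidden_dim = hidden_dim

    def add_domain(self, domain: str) -> None:
        # Expand the model with a fresh start/end span classifier
        # when a new domain arrives.
        self.heads[domain] = nn.Linear(self.hidden_dim, 2)

    def forward(self, input_ids: torch.Tensor, domain: str):
        hidden = self.encoder(input_ids)     # (batch, seq, hidden)
        logits = self.heads[domain](hidden)  # (batch, seq, 2)
        start_logits, end_logits = logits.unbind(dim=-1)
        return start_logits, end_logits
```

The per-domain heads are what make this approach grow linearly with the number of domains, which is the expense the citing authors object to.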
“…Few previous studies apply continual learning to MRC. Su et al. (2020) adapted the EWC method and enlarged the MRC architecture when a new domain arrives; they added a penalty regularization that restricts the change of important parameters to prevent forgetting.…”
Section: In MRC
confidence: 99%
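For reference, the EWC penalty (Kirkpatrick et al., 2017) anchors each parameter θ_i to its value θ*_i after training on the previous domain, weighted by an estimate F_i of its importance (the diagonal Fisher information): λ/2 · Σ_i F_i (θ_i − θ*_i)². A minimal sketch, assuming fisher and old_params are dictionaries keyed by parameter name (function and variable names are illustrative):

```python
import torch

def ewc_penalty(model, fisher, old_params, lam=1.0):
    """Elastic Weight Consolidation penalty: lam/2 * sum of
    F_i * (theta_i - theta*_i)^2 over parameters seen on the
    previous domain. Penalizing movement of high-Fisher
    (important) parameters is what mitigates forgetting."""
    penalty = torch.zeros((), device=next(model.parameters()).device)
    for name, param in model.named_parameters():
        if name in fisher:
            penalty = penalty + (fisher[name] * (param - old_params[name]) ** 2).sum()
    return 0.5 * lam * penalty

# During training on the new domain (hypothetical usage):
# loss = mrc_loss + ewc_penalty(model, fisher, old_params, lam=0.1)
```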