Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing
DOI: 10.18653/v1/2021.emnlp-main.436
ECONET: Effective Continual Pretraining of Language Models for Event Temporal Reasoning

Abstract: While pre-trained language models (PTLMs) have achieved noticeable success on many NLP tasks, they still struggle for tasks that require event temporal reasoning, which is essential for event-centric applications. We present a continual pre-training approach that equips PTLMs with targeted knowledge about event temporal relations. We design self-supervised learning objectives to recover masked-out event and temporal indicators and to discriminate sentences from their corrupted counterparts (where event or temp…
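The masked-indicator recovery objective mentioned in the abstract can be illustrated with a minimal sketch. This is not ECONET's actual implementation: the indicator list, the function name, and the whitespace tokenization are all simplifying assumptions made here for illustration (the paper uses a much larger lexicon and a PTLM tokenizer).

```python
import string

# Illustrative subset of temporal indicators; ECONET's actual lexicon
# is larger and curated, this small set is an assumption for the sketch.
TEMPORAL_INDICATORS = {"before", "after", "during", "while", "until", "since"}

def mask_temporal_indicators(sentence, mask_token="[MASK]"):
    """Corrupt a sentence by masking temporal indicator tokens.

    Returns the corrupted sentence and the masked-out target words,
    which a model would then be trained to recover.
    """
    masked, targets = [], []
    for tok in sentence.split():
        if tok.lower().strip(string.punctuation) in TEMPORAL_INDICATORS:
            targets.append(tok)
            masked.append(mask_token)
        else:
            masked.append(tok)
    return " ".join(masked), targets

masked, targets = mask_temporal_indicators(
    "The army retreated after the bridge was destroyed.")
# masked  -> "The army retreated [MASK] the bridge was destroyed."
# targets -> ["after"]
```

The sentence-discrimination objective from the abstract would pair such corrupted sentences (with an indicator swapped rather than masked) against the originals as a binary classification task.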


Cited by 20 publications (17 citation statements) | References 29 publications
“…Some methods focus on more specific aspects of events and their correlations. DEER (Han et al., 2020b) performs temporal and event masking predictions for temporal relations. Lin et al. (2021) propose to recover a temporally-disordered or event-missing sequence for temporal and causal relations.…”
Section: Related Work (mentioning)
confidence: 99%
“…As previously mentioned, the direction of an entity-trace-induced linkage is determined by the narrated order of text segments within contexts; however, in circumstances such as the fourth row in Table 1, the narrative order can be inconsistent with the actual temporal order of the associated events. To alleviate such inconsistency, we apply an event temporal relation prediction model (Han et al., 2021b) to fix the linkage directions (footnote 7: these do not include linkages decided by the keywords). The utilized model predicts…”
Section: Incorporating Temporal Relations (mentioning)
confidence: 99%
“…preconditions should occur prior to an action), the narrated order of events may not be consistent with their actual temporal order. We thus adopt a temporal relation resolution module (Han et al., 2021b) to alleviate such an issue.…”
Section: Introduction (mentioning)
confidence: 99%
“…Second, some other approaches mine event information in raw corpora instead of KGs. DEER (Han et al., 2020) performs continual pre-training via temporal and event masking predictions, i.e., new masking schemes for MLM, to focus on event temporal relations. Lin et al. (2020) leverage the BART structure and propose to recover a temporally-disordered or event-missing sequence to the original one via a denoising autoencoder.…”
Section: Commonsense-centric (mentioning)
confidence: 99%
“…Limitation. First, EventBERT is focused on correlation-based event reasoning, and is not general enough for every event correlation reasoning task (e.g., event temporal reasoning as in (Han et al., 2020; Lin et al., 2020)). Second, we evaluate EventBERT on deterministic tasks, e.g., multi-choice and Cloze-type question answering, due to their stable metrics and widely-available baselines.…”
Section: Case Study, Error Analysis, and Limitation (mentioning)
confidence: 99%