Proceedings of the 2017 Conference on Empirical Methods in Natural Language Processing
DOI: 10.18653/v1/d17-1006

Integrating Order Information and Event Relation for Script Event Prediction

Abstract: There has been a recent line of work automatically learning scripts from unstructured texts by modeling narrative event chains. While the dominant approach groups events using event pair relations, LSTMs have been used to encode full chains of narrative events. The latter has the advantage of learning long-range temporal orders, yet the former is more adaptive to partial orders. We propose a neural model that leverages the advantages of both methods, by using LSTM hidden states as features for event pair mo…
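The abstract's core idea — scoring event pairs on features produced by a chain encoder — can be illustrated with a toy sketch. This is not the paper's model: a plain tanh RNN stands in for the LSTM, events are random vectors, and a dot product stands in for the learned pair-relation scorer; all names and sizes are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 8  # toy embedding/hidden size (hypothetical)

context = rng.normal(size=(4, d))     # four context-event embeddings (random stand-ins)
candidates = rng.normal(size=(2, d))  # two candidate next-event embeddings

# Recurrent encoder weights; a plain tanh RNN stands in for the paper's LSTM.
Wx = rng.normal(size=(d, d)) * 0.1
Wh = rng.normal(size=(d, d)) * 0.1

def encode(events, h0=None):
    """Run the chain encoder, returning one hidden state per event."""
    h = np.zeros(d) if h0 is None else h0
    states = []
    for e in events:
        h = np.tanh(e @ Wx + h @ Wh)
        states.append(h)
    return np.stack(states)

ctx_states = encode(context)  # order-aware features, one per context event

def pair_score(h_a, h_b):
    """Event-pair relation score computed on hidden states (dot product here)."""
    return float(h_a @ h_b)

# Score each candidate against every context event's hidden state and sum,
# so pairwise relations are computed on features that saw the event order.
scores = []
for c in candidates:
    h_cand = encode(c[None, :], h0=ctx_states[-1])[0]
    scores.append(sum(pair_score(h, h_cand) for h in ctx_states))
best = int(np.argmax(scores))
```

The point of the sketch is the division of labor: the recurrent pass injects temporal order into the per-event features, while the pairwise scorer keeps the flexibility of pair-based matching.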

Cited by 66 publications (84 citation statements) · References 18 publications
“…Therefore our proposed method should unify both short-term sequential dependency (at both individual-level and union-level) and long-term sequential dependency. Inspired by [18]…”
Section: Multi-relational Dependency Modeling for Short-term Pattern
confidence: 99%
“…Recently a growing number of studies focus on event-centered commonsense reasoning, which mainly concentrates on two areas: script event prediction and story ending generation/choosing. Script event prediction concerns the temporal relationships between script events (Granroth-Wilding and Clark, 2016), and requires models to choose the correct subsequent triple-organized event among the candidates (Wang et al., 2017). Prior work mainly focused on modeling event pairs (Granroth-Wilding and Clark, 2016), event chains (Wang et al., 2017) and event graphs (Li et al., 2018) to predict the subsequent event.…”
Section: Event-centered Commonsense Reasoning
confidence: 99%
“…Script event prediction concerns the temporal relationships between script events (Granroth-Wilding and Clark, 2016), and requires models to choose the correct subsequent triple-organized event among the candidates (Wang et al., 2017). Prior work mainly focused on modeling event pairs (Granroth-Wilding and Clark, 2016), event chains (Wang et al., 2017) and event graphs (Li et al., 2018) to predict the subsequent event. Story ending generation focuses on generating plausible story endings, which requires models to understand the story context and keep generated endings logically consistent with it (Peng et al., 2017; Guan et al., 2019).…”
Section: Event-centered Commonsense Reasoning
confidence: 99%
“…The Multiple Choice version of the Narrative Cloze task (MCNC), proposed by Granroth-Wilding and Clark (2016) and Wang et al. (2017), aims to evaluate understanding of a script by predicting the next event given several context events. Given a chain of contextual events e_1, e_2, ..., e_{n-1}, the task is to select the next event from five event candidates, one of which is correct and the others are randomly sampled elsewhere in the corpus.…”
Section: Narrative Cloze
confidence: 99%
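The MCNC setup described above — one gold next event plus distractors sampled from elsewhere in the corpus — can be sketched in a few lines. This is a simplified illustration, not the original data pipeline; the toy (subject, verb, object) events and all names here are invented.

```python
import random

random.seed(0)

# Toy corpus of (subject, verb, object) events; all entries are invented.
corpus = [
    ("he", "order", "food"), ("she", "read", "book"),
    ("they", "board", "plane"), ("he", "pay", "bill"),
    ("she", "open", "door"), ("he", "eat", "food"),
]

def make_mcnc_instance(chain, gold_next, corpus, n_candidates=5):
    """Build one MCNC question: the gold next event shuffled among
    randomly sampled distractor events from the corpus."""
    distractors = random.sample(
        [e for e in corpus if e != gold_next], n_candidates - 1)
    candidates = distractors + [gold_next]
    random.shuffle(candidates)
    return chain, candidates, candidates.index(gold_next)

chain = [("he", "enter", "restaurant"), ("he", "order", "food")]
gold = ("he", "eat", "food")
context, cands, answer = make_mcnc_instance(chain, gold, corpus)
```

A model is then scored by how often it picks the index of the gold event out of the five candidates, which is why random guessing gives a 20% baseline on this task.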
“…Pichotta and Mooney (2016) introduced an LSTM-based language model for event prediction. Wang et al. (2017) used dynamic memory as attention in an LSTM for prediction. It is encouraging that by using event knowledge extracted from automatically identified narratives, we achieved the best event prediction performance, which is 2.2% higher than the best neural network model.…”
Section: Narrative Cloze
confidence: 99%