2016
DOI: 10.1609/aaai.v30i1.10344

What Happens Next? Event Prediction Using a Compositional Neural Network Model

Abstract: We address the problem of automatically acquiring knowledge of event sequences from text, with the aim of providing a predictive model for use in narrative generation systems. We present a neural network model that simultaneously learns embeddings for words describing events, a function to compose the embeddings into a representation of the event, and a coherence function to predict the strength of association between two events. We introduce a new development of the narrative cloze evaluation task, better sui…

Cited by 59 publications (40 citation statements) · References 9 publications
“…For comparison, we also report on an automatically generated MCNC task. Since we do not restrict the evaluation to narrative chains with a single protagonist, the numbers for the automatic MCNC task are lower than those reported in Granroth-Wilding and Clark (2016).…”
Section: Coherent Multiple Choice Narrative Cloze
confidence: 91%
“…This basic architecture was used to generate event representations for narrative cloze tasks (Modi and Titov 2013; Modi 2016; Granroth-Wilding and Clark 2016). We adapt this architecture for our task: given the input event, whose representation is to be composed, predict the neighboring context (other events or words in the sentence).…”
Section: E = W * tanh(H[s; p; o])
confidence: 99%
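The composition named in the section heading above can be sketched in a few lines of NumPy. This is a minimal illustration, not the authors' implementation: the dimensions, the random stand-in weights, and the single-layer sigmoid coherence scorer are all assumptions; in the paper these parameters are learned jointly from text.

```python
import numpy as np

rng = np.random.default_rng(0)
d, h, k = 50, 100, 60  # word-embedding, hidden, and event dims (illustrative)

# Random stand-ins for the jointly learned parameters.
H = rng.normal(scale=0.1, size=(h, 3 * d))  # argument-composition layer
W = rng.normal(scale=0.1, size=(k, h))      # projection to event space
C = rng.normal(scale=0.1, size=(2 * k,))    # coherence-scoring weights

def compose_event(s, p, o):
    """Compose subject, predicate, object embeddings: E = W * tanh(H [s; p; o])."""
    return W @ np.tanh(H @ np.concatenate([s, p, o]))

def coherence(e1, e2):
    """Strength of association between two composed events, squashed to (0, 1)."""
    return float(1.0 / (1.0 + np.exp(-(C @ np.concatenate([e1, e2])))))

s, p, o = (rng.normal(size=d) for _ in range(3))
e1 = compose_event(s, p, o)
e2 = compose_event(*(rng.normal(size=d) for _ in range(3)))
score = coherence(e1, e2)  # a value in (0, 1)
```

The key property the excerpt points at is that word embeddings, the composition function, and the coherence function are trained together, so the event space is shaped directly by what makes event pairs predictable.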
“…For example, Radinsky et al. (2012) extracted generalized causality relations between two events in the form of "x causes y" from past news and applied the templates to a present news event to predict the next possible event. Granroth-Wilding and Clark (2016) extracted event knowledge about typical sequences of events from text (Chambers and Jurafsky 2008) and learned the coherence score of two events using a compositional neural network. They aim to predict the strength of association between two events, which can be used to predict whether an event is likely to be the next event.…”
Section: Event Prediction
confidence: 99%
“…For example, Radinsky et al. (2012) extracted causality relations between two events and generalized them using an ontology for prediction. Granroth-Wilding and Clark (2016) extracted event chains (Chambers and Jurafsky 2008) from texts and learned the coherence score of two events using a compositional neural network. Manshadi et al. (2008) learned a probabilistic language model of the event sequences.…”
Section: Introduction
confidence: 99%
“…As the example above suggests, predicting "what happens next?", also known as the Narrative Cloze (NC) task (Chambers and Jurafsky 2008; Granroth-Wilding and Clark 2016), is the preferred way of evaluating such models. In this paper, we propose a generalization of this task, highlighting the importance of evaluating inferences over chains of future events.…”
Section: Introduction
confidence: 99%
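The multiple-choice narrative cloze (MCNC) evaluation referred to in these excerpts reduces to scoring each candidate next event against the context chain and picking the best. A minimal sketch, assuming some pairwise coherence function is available; the toy dot-product scorer and the averaging rule here are illustrative assumptions, not the evaluated systems' exact scoring:

```python
import numpy as np

def predict_next(context_events, candidates, coherence):
    """Pick the candidate next event with the highest average
    coherence against the events already in the chain."""
    scores = [
        np.mean([coherence(ctx, cand) for ctx in context_events])
        for cand in candidates
    ]
    return int(np.argmax(scores))

# Toy stand-in for a learned coherence function: a plain dot product.
dot = lambda a, b: float(a @ b)
context = [np.array([1.0, 0.0]), np.array([0.8, 0.2])]
candidates = [np.array([0.0, 1.0]), np.array([1.0, 0.0])]
best = predict_next(context, candidates, dot)  # index 1: most coherent candidate
```

Accuracy on held-out chains is then simply the fraction of cloze instances where the gold next event receives the top score.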