2021
DOI: 10.1609/aaai.v35i7.16733
Automated Storytelling via Causal, Commonsense Plot Ordering

Abstract: Automated story plot generation is the task of generating a coherent sequence of plot events. Causal relations between plot events are believed to increase the perception of story and plot coherence. In this work, we introduce the concept of soft causal relations as causal relations inferred from commonsense reasoning. We demonstrate C2PO, an approach to narrative generation that operationalizes this concept through Causal, Commonsense Plot Ordering. Using human-participant protocols, we evaluate our system ag…
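To make the abstract's notion of "soft causal relations" concrete, the sketch below shows one way commonsense-inferred links could be searched to connect two given plot points. This is a minimal illustration, not the authors' implementation: the event strings and the COMMONSENSE_EFFECTS table are purely hypothetical stand-ins for inferences that a system like C2PO would obtain from a commonsense model such as COMET.

```python
"""Minimal sketch of soft-causal plot ordering (illustrative only).

A toy lookup table stands in for a commonsense inference model; every
event name and the interpolate_plot helper are assumptions made for
this example, not part of the paper's actual system.
"""
from collections import deque

# Hypothetical "soft" causal effects: for each event, plausible next events.
COMMONSENSE_EFFECTS = {
    "hero hears of a stolen relic": ["hero decides to recover the relic"],
    "hero decides to recover the relic": ["hero travels to the thief's lair"],
    "hero travels to the thief's lair": ["hero confronts the thief"],
    "hero confronts the thief": ["hero recovers the relic"],
}


def interpolate_plot(start_event: str, goal_event: str, max_depth: int = 6):
    """Breadth-first search from start_event toward goal_event along
    soft causal links, returning one coherent chain of plot events."""
    frontier = deque([[start_event]])
    while frontier:
        path = frontier.popleft()
        current = path[-1]
        if current == goal_event:
            return path
        if len(path) >= max_depth:
            continue
        for effect in COMMONSENSE_EFFECTS.get(current, []):
            frontier.append(path + [effect])
    return None  # no soft causal chain found within the depth budget


if __name__ == "__main__":
    chain = interpolate_plot(
        "hero hears of a stolen relic", "hero recovers the relic"
    )
    for i, event in enumerate(chain or [], 1):
        print(f"{i}. {event}")
```

In the paper's setting the candidate successor (and predecessor) events come from commonsense inference rather than a hand-written table, which is what makes the resulting causal links "soft".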

Cited by 33 publications (17 citation statements)
References 25 publications
“…; Peng et al (2021) explore using external knowledge such as commonsense for story generation. Xu et al (2020); Ammanabrolu et al (2021) combine storyline planning and commonsense reasoning. We find that although there are studies which explore the use of GPT3 for story generation.…”
Section: Related Work (mentioning)
confidence: 99%
“…[25] extracted the representation from COMET to improve the commonsense reasoning of the model. [26] used a generation model fine-tuned on the commonsense knowledge base to complete the tasks.…”
Section: Commonsense Knowledge Integrated Generation (mentioning)
confidence: 99%
“…Each code corresponds to a fixed-length span, which does not always agree with real text structures and makes it hard to model specific semantic dependencies. Some studies tried to incorporate external knowledge or reasoning models to guide commonsense story generation (Guan et al 2020;Xu et al 2020;Ammanabrolu et al 2021), which may lack generalization to other domains such as news. Another line improved coherence by learning high-level representations of prefix sentences (Li, Luong, and Jurafsky 2015;Guan et al 2021), which does not emphasize the central role of entities.…”
Section: Related Work (mentioning)
confidence: 99%