Findings of the Association for Computational Linguistics: EMNLP 2020
DOI: 10.18653/v1/2020.findings-emnlp.325

Narrative Text Generation with a Latent Discrete Plan

Abstract: Past work on story generation has demonstrated the usefulness of conditioning on a generation plan to generate coherent stories. However, these approaches have used heuristics or off-the-shelf models to first tag training stories with the desired type of plan, and then train generation models in a supervised fashion. In this paper, we propose a deep latent variable model that first samples a sequence of anchor words, one per sentence in the story, as part of its generative process. During training, our model t…
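To make the generative process in the abstract concrete, the following is a minimal, hypothetical sketch (not the authors' code): sample one latent anchor word per sentence from a prior, then generate each sentence conditioned on its anchor and the story so far. The toy vocabulary, the uniform prior, and the stand-in sentence generator are all illustrative assumptions, not the paper's actual model.

```python
# Illustrative sketch of a "latent discrete plan" generative process (assumed, simplified).
import random

ANCHOR_VOCAB = ["forest", "storm", "reunion", "secret", "journey"]  # assumed toy vocabulary

def sample_anchor_plan(num_sentences: int) -> list[str]:
    """Sample the latent plan: one anchor word per sentence (uniform prior here)."""
    return [random.choice(ANCHOR_VOCAB) for _ in range(num_sentences)]

def generate_sentence(anchor: str, story_so_far: list[str]) -> str:
    """Stand-in for a learned conditional model p(sentence | anchor, previous sentences)."""
    return f"Sentence {len(story_so_far) + 1} about the {anchor}."

def generate_story(num_sentences: int = 5) -> tuple[list[str], list[str]]:
    plan = sample_anchor_plan(num_sentences)   # z ~ p(z): latent discrete plan
    story: list[str] = []
    for anchor in plan:                        # each sentence conditioned on its anchor
        story.append(generate_sentence(anchor, story))
    return plan, story

if __name__ == "__main__":
    plan, story = generate_story()
    print("Plan:", plan)
    print("\n".join(story))
```

In the actual model the plan is latent, so the anchors are not observed during training; this sketch only shows the sampling-then-generation structure of the generative story.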

Cited by 5 publications (1 citation statement)
References 27 publications (33 reference statements)

“…As our model involves plot generation and character modeling, it is related to work on plot planning (Riedl and Young, 2010; Li et al., 2013; Martin et al., 2018; Yao et al., 2019; Jhamtani and Berg-Kirkpatrick, 2020), character modeling (Clark et al., 2018), and the interplay between the two (Riedl and Young, 2010). Our work is different in that it explicitly requires performing inference on lengthy documents about characters…”
Section: Related Work
confidence: 99%