Findings of the Association for Computational Linguistics: ACL 2022
DOI: 10.18653/v1/2022.findings-acl.269

Event Transition Planning for Open-ended Text Generation

Abstract: Open-ended text generation tasks, such as dialogue generation and story completion, require models to generate a coherent continuation given limited preceding context. The open-ended nature of these tasks brings new challenges to today's neural auto-regressive text generators. Although these neural models are good at producing human-like text, it is difficult for them to arrange causalities and relations between given facts and possible ensuing events. To bridge this gap, we propose a novel two-stage method…
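The two-stage idea the abstract outlines can be illustrated with a toy sketch (a hypothetical illustration, not the authors' model): a first stage plans an abstract event-transition path, and a second stage realizes text conditioned on that plan. The transition table and function names below are assumptions for demonstration only.

```python
# Toy sketch of two-stage open-ended generation (hypothetical, not the
# paper's implementation): plan events first, then verbalize them.

EVENT_TRANSITIONS = {            # assumed toy event-transition table
    "meet a friend": "have a conversation",
    "have a conversation": "say goodbye",
}

def plan_events(start_event, steps):
    """Stage 1: plan an event-transition path from the given context."""
    path = [start_event]
    for _ in range(steps):
        nxt = EVENT_TRANSITIONS.get(path[-1])
        if nxt is None:
            break
        path.append(nxt)
    return path

def generate_continuation(context, event_path):
    """Stage 2: generate text conditioned on the planned events.

    A real system would condition an autoregressive LM on the plan;
    here we simply verbalize the events in order.
    """
    return context + " " + ", then ".join(event_path) + "."

plan = plan_events("meet a friend", 2)
print(plan)   # ['meet a friend', 'have a conversation', 'say goodbye']
print(generate_continuation("They went out.", plan))
```

Separating planning from surface realization is what lets the second stage stay locally fluent while the first stage enforces a globally coherent event order.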

Cited by 4 publications (4 citation statements)
References 19 publications
“…To handle the task of story ending generation, Huang et al. (2021) propose a context-aware multi-level graph convolutional network over dependency-analysis trees to capture dependencies and context clues more efficiently. Li et al. (2022b) propose a novel coarse-to-fine two-stage approach that generates subsequent events more explicitly in open-ended text generation. Another task related to script events is event process typing, which we focus on in this paper. Chen et al. (2020) propose a model that uses a pretrained language model to encode the entire process and the action labels into process feature vectors and action-label vectors, respectively, and then predicts the action whose feature vector is closest to the process as the action label of the process.…”
Section: Script Event Learning
Confidence: 99%
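The nearest-vector action typing described in the statement above can be sketched in a few lines (a toy illustration; the hand-made embeddings below stand in for the pretrained-LM encodings, and all names are assumptions):

```python
import math

# Hypothetical sketch of nearest-vector event process typing: the process
# and each candidate action label are encoded as vectors (toy embeddings
# here instead of a pretrained language model), and the predicted action
# is the label whose vector is closest to the process vector.

def cosine(u, v):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def predict_action(process_vec, label_vecs):
    """Return the action label most similar to the process encoding."""
    return max(label_vecs, key=lambda name: cosine(process_vec, label_vecs[name]))

labels = {
    "cook":   [0.9, 0.1, 0.0],
    "clean":  [0.1, 0.8, 0.2],
    "repair": [0.0, 0.2, 0.9],
}
print(predict_action([0.8, 0.2, 0.1], labels))  # cook
```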
“…Early approaches to automatic story generation relied on graph-based planning and hand-crafted rules to structure narratives (Meehan, 1977; Callaway and Lester, 2002; Riedl and Young, 2004; Li et al., 2013). More recent works generate stories by finetuning large-scale PLMs (See et al., 2019) to improve fluency, and by incorporating structured knowledge such as planned events (Fang et al., 2021; Li et al., 2022), summaries (Yao et al., 2019; Tan et al., 2021; Sun et al., 2020), and external knowledge (Guan et al., 2019; Xu et al., 2020b; Guan et al., 2020) to enhance coherence and consistency. Our story generation models are also finetuned on large-scale PLMs to generate text following the given summaries.…”
Section: Related Work
Confidence: 99%
“…These unrelated nodes are distracting and can even disturb the subsequent generation. To drop the redundant nodes in the interaction graph, we propose a Graph Pruning (GP) mechanism similar to the Dynamic Graph Pruning (DGP) [39] mechanism. GP employs a gate mechanism to decide the connection of each schema node s_i ∈ S = T ∪ C based on its relevance to the question node in the heterogeneous graph, thereby achieving graph pruning.…”
Section: Graph Pruning
Confidence: 99%
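The gate-based pruning in the statement above can be sketched as follows (a minimal toy in the spirit of the description; the dot-product relevance score, the threshold, and all names are assumptions, not the cited DGP implementation):

```python
import math

# Hypothetical sketch of gate-based graph pruning: each schema node
# receives a gate value from its relevance to the question node, and
# nodes whose gate falls below a threshold are dropped from the graph.

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def prune_nodes(question_vec, schema_nodes, threshold=0.5):
    """Keep schema nodes whose gated relevance to the question exceeds threshold."""
    kept = []
    for name, vec in schema_nodes.items():
        relevance = sum(q * v for q, v in zip(question_vec, vec))  # dot product
        if sigmoid(relevance) > threshold:
            kept.append(name)
    return kept

nodes = {
    "table:singer":  [1.0, 0.5],
    "col:name":      [0.8, 0.2],
    "table:stadium": [-1.0, -0.5],
}
print(prune_nodes([1.0, 1.0], nodes))  # ['table:singer', 'col:name']
```

In a trained model the gate would be a learned function of node and question encodings; the sigmoid-over-relevance form here just shows how a soft gate turns into a hard keep/drop decision.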