2019
DOI: 10.48550/arxiv.1912.01496
Preprint

Knowledge-Enriched Visual Storytelling

Abstract: Stories are diverse and highly personalized, resulting in a large possible output space for story generation. Existing end-to-end approaches produce monotonous stories because they are limited to the vocabulary and knowledge in a single training dataset. This paper introduces KG-Story, a three-stage framework that allows the story generation model to take advantage of external Knowledge Graphs to produce interesting stories. KG-Story distills a set of representative words from the input prompts, enriches the wo…

Cited by 4 publications (9 citation statements)
References 25 publications (9 reference statements)
“…This plan-and-write strategy (Yao et al. 2019) can substantially increase the diversity of the stories. During planning, to enrich the concepts that models can draw on, some researchers (Hsu et al. 2019a; Yang et al. 2019) introduce external commonsense knowledge bases such as OpenIE (Angeli, Premkumar, and Manning 2015), Visual Genome (Krishna et al. 2017), or ConceptNet into the VST task. Their results show that using an external knowledge base helps generate more informative sentences.…”
Section: Two-Stage Methods
confidence: 99%
“…Most previous works on VST construct end-to-end frameworks (Yang et al. 2019; Wang et al. 2018; Jung et al. 2020; Yu, Bansal, and Berg 2017). However, although these methods can produce legitimate stories with high scores on automatic metrics like BLEU (Papineni et al. 2002), the stories tend to be monotonous, with limited lexical diversity and knowledge (Hsu et al. 2019a). [state-of-the-art] The basketball game was intense. The opposing team was very competitive. The game was intense.…”
Section: Introduction
confidence: 99%
“…So we adapt a few typical works to fit the few-shot setting for comparison. Note that, although [10,12] achieve higher scores under the standard setting, we do not compare with them since they use extra resources such as "Pretrained BERT" and "Knowledge Graph". The descriptions of these models are as follows:…”
Section: Comparison With SOTA
confidence: 99%
“…Compared with it, our model can generate more informative and diverse stories. The stories tend to have limited lexical diversity and knowledge (Hsu et al. 2019a) (see the example in Figure 1). Recently, two-stage generation methods, also known as the plan-and-write strategy, have attracted much research attention in story generation tasks (Yao et al. 2019; Martin et al. 2017; Ammanabrolu et al. 2020).…”
Section: Introduction
confidence: 99%
“…Recently, two-stage generation methods, also known as the plan-and-write strategy, have attracted much research attention in story generation tasks (Yao et al. 2019; Martin et al. 2017; Ammanabrolu et al. 2020). When adapted to the VST task, Hsu et al. (2019a) show that this strategy can generate more diverse stories than end-to-end methods. However, their method directly generates concepts from the images using sequence-to-sequence models.…”
Section: Introduction
confidence: 99%