Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics 2020
DOI: 10.18653/v1/2020.acl-main.226

INSET: Sentence Infilling with INter-SEntential Transformer

Abstract: Missing sentence generation (or sentence infilling) fosters a wide range of applications in natural language generation, such as document auto-completion and meeting note expansion. This task asks the model to generate intermediate missing sentences that can syntactically and semantically bridge the surrounding context. Solving the sentence infilling task requires techniques in natural language processing ranging from understanding to discourse-level planning to generation. In this paper, we propose a framework…
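To make the task concrete: sentence infilling asks a model to generate the missing middle sentence given the surrounding left and right context. The abstract is truncated and does not describe INSET's implementation, so the following is only a minimal, hedged sketch of the generic infilling setup using an off-the-shelf BART model via Hugging Face transformers (a stand-in assumption, not INSET's own architecture); the example text and decoding settings are illustrative.

# Minimal sketch of sentence infilling with a pretrained BART model.
# NOTE: an illustrative stand-in, not the INSET method itself.
from transformers import BartForConditionalGeneration, BartTokenizer

tokenizer = BartTokenizer.from_pretrained("facebook/bart-large")
model = BartForConditionalGeneration.from_pretrained("facebook/bart-large")

# The <mask> token marks the missing sentence between the two context sentences.
context = ("The storm knocked out power across the county. <mask> "
           "By morning, crews had restored electricity to most homes.")
inputs = tokenizer(context, return_tensors="pt")
output_ids = model.generate(inputs["input_ids"], num_beams=5, max_length=60)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))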

Cited by 26 publications (23 citation statements). References 34 publications.
“…C ⊥⊥ T: Story generation (Fan et al., 2019), text infilling (Fedus et al., 2018; Huang et al., 2019), paragraph bridging (Kang et al., 2019), and our proposed PARCOM are very challenging tasks where the context and the target have no overlap (open-ended) but must still be coherently connected. Keskar et al. (2019) conditioned language models on topical words to control the target text.…”
Section: Related Work
confidence: 99%
“…However, they are given the topical content as input (content guidance), whereas our SSPlanner directly predicts plan words from the context (content prediction). Fedus et al. (2018) and Huang et al. (2019) developed various methods for the text infilling task. Most similar to our task, Kang et al. (2019) developed language models informed by discourse relations for the bridging task: given the first and last sentences, predict the intermediate sentences (bidirectional flow).…”
Section: Related Work
confidence: 99%
“…For example, inspired by skip-grams (Mikolov et al., 2013), one line of work proposed training a sequence-to-sequence model to generate the sentences before and after a given sentence and using its encoder to compute sentence representations. Inspired by masked language modeling in BERT, Zhang et al. (2019) and Huang et al. (2020) presented methods to learn contextualized sentence representations through the task of restoring a masked sentence from its context.…”
Section: Introduction
confidence: 99%
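For readers unfamiliar with the masked-sentence objective described in that statement, a small sketch of how such training pairs could be built follows; masked_sentence_pairs is a hypothetical helper for illustration, not code from the cited papers, and the naive whitespace joining stands in for a real sentence tokenizer.

# Hypothetical helper illustrating the masked-sentence training objective:
# each example masks one sentence of a document (the model's target) and
# keeps the remaining sentences as context. Not from any cited paper.
def masked_sentence_pairs(sentences, mask_token="<mask>"):
    pairs = []
    for i, target in enumerate(sentences):
        context = sentences[:i] + [mask_token] + sentences[i + 1:]
        pairs.append((" ".join(context), target))
    return pairs

doc = ["The storm hit at noon.", "Power failed across town.", "Crews worked all night."]
for context, target in masked_sentence_pairs(doc):
    print(context, "->", target)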