Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing
DOI: 10.18653/v1/d16-1128

Neural Text Generation from Structured Data with Application to the Biography Domain

Abstract: This paper introduces a neural model for concept-to-text generation that scales to large, rich domains. It generates biographical sentences from fact tables on a new dataset of biographies from Wikipedia. This set is an order of magnitude larger than existing resources with over 700k samples and a 400k vocabulary. Our model builds on conditional neural language models for text generation. To deal with the large vocabulary, we extend these models to mix a fixed vocabulary with copy actions that transfer sample-…
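The abstract's central mechanism is a single output distribution that mixes a fixed vocabulary with copy actions that pull words out of the input fact table. The snippet below is a minimal sketch of that idea under stated assumptions, not the paper's implementation: all names (`next_word_distribution`, `W_vocab`) are hypothetical, and plain dot products stand in for the trained conditional language model.

```python
# Minimal sketch (not the authors' code) of the "copy action" idea from the
# abstract: the next-word distribution is one softmax over the fixed
# vocabulary *plus* the tokens of the input fact table, so rare names can be
# copied from the table instead of living in the 400k vocabulary.
# All names, shapes, and the dot-product scoring are illustrative assumptions.
import numpy as np


def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()


def next_word_distribution(h, W_vocab, table_embeds):
    """Score fixed-vocabulary words and copyable table tokens jointly.

    h            -- decoder hidden state, shape (d,)
    W_vocab      -- output projection for the fixed vocabulary, shape (|V|, d)
    table_embeds -- one embedding per table token, shape (n_table_tokens, d)
    """
    gen_scores = W_vocab @ h        # one score per fixed-vocabulary word
    copy_scores = table_embeds @ h  # one score per copyable table token
    return softmax(np.concatenate([gen_scores, copy_scores]))


# Toy usage with random parameters.
rng = np.random.default_rng(0)
d = 8
vocab = ["<unk>", "was", "born", "in", "."]
table_tokens = ["Ada", "Lovelace", "London"]  # entity names come from the table

probs = next_word_distribution(
    rng.normal(size=d),                       # hidden state
    rng.normal(size=(len(vocab), d)),         # vocabulary projection
    rng.normal(size=(len(table_tokens), d)),  # table-token embeddings
)
actions = vocab + table_tokens                # copy actions extend the vocabulary
print(max(zip(probs, actions)))               # most likely next word/copy action
```

The payoff of such a joint softmax is that sample-specific words, such as a subject's name, never need to appear in the fixed vocabulary; they remain reachable as copy actions whenever they occur in the input table.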

Cited by 400 publications (609 citation statements) | References 39 publications

“…The first shared task at the workshop focused on document-level generation and translation. Many recent attempts at NLG have focused on sentence-level generation (Lebret et al., 2016; Gardent et al., 2017). However, real-world language generation applications tend to involve generating much larger amounts of text, such as dialogues or multi-sentence summaries.…”
Section: Shared Task: Document-Level (mentioning)
confidence: 99%
“…Targeting this newly emerging demand, some models have been proposed to respond by generating natural language replies on the fly, rather than by (re)ranking a fixed set of items or extracting passages from existing pages. Examples are conversational and dialog systems [7,34,54] or machine reading and question answering tasks where the model either infers the answer from unstructured data, like textual documents that do not necessarily feature the answer literally [21,22,46,56], or generates natural language given structured data, like data from knowledge graphs or from external memories [1,18,33,37,40].…”
Section: Objectives (mentioning)
confidence: 99%
“…Lebret et al. (2016) introduced a neural model for this concept-to-text generation and evaluated it on a large dataset of biographies from Wikipedia. Chisholm et al.…”
Section: Previous Work (mentioning)
confidence: 99%