Findings of the Association for Computational Linguistics: EMNLP 2021
DOI: 10.18653/v1/2021.findings-emnlp.347

Attend, Memorize and Generate: Towards Faithful Table-to-Text Generation in Few Shots

Abstract: Few-shot table-to-text generation is the task of composing fluent and faithful sentences that convey table content using limited data. Although many efforts have produced impressively fluent sentences by fine-tuning powerful pre-trained language models, the faithfulness of the generated content still needs improvement. To this end, this paper proposes a novel approach, Attend, Memorize and Generate (AMG), inspired by the text generation process of humans. In particular, AMG (1) attends over …
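As a rough illustration of the few-shot setting the abstract describes, below is a minimal sketch of table-to-text generation via a pre-trained language model over a linearized table. The linearization scheme, the t5-small checkpoint, and the example table are illustrative assumptions; this is not the paper's AMG architecture.

```python
# Minimal sketch: table-to-text generation with a pre-trained LM.
# Assumptions (not from the paper): the "attr : val |" linearization
# format, the t5-small checkpoint, and the toy example table.
from transformers import T5Tokenizer, T5ForConditionalGeneration

def linearize_table(pairs):
    """Flatten (attribute, value) pairs into a single source string."""
    return " ".join(f"{attr} : {val} |" for attr, val in pairs)

tokenizer = T5Tokenizer.from_pretrained("t5-small")
model = T5ForConditionalGeneration.from_pretrained("t5-small")

table = [("name", "Ada Lovelace"),
         ("occupation", "mathematician"),
         ("birth year", "1815")]
inputs = tokenizer(linearize_table(table), return_tensors="pt")

# With no fine-tuning this emits near-arbitrary text; after training on
# a handful of (table, sentence) pairs it would verbalize the table.
output_ids = model.generate(**inputs, max_new_tokens=40)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```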

Cited by 5 publications (4 citation statements)
References 40 publications
“…As language models have advanced (Devlin et al., 2019; Raffel et al., 2020; Zhao et al., 2023; Dong et al., 2023; Zhao et al., 2021), numerous efforts have emerged to enhance the performance of MRC-based NER. … incorporated different domain knowledge into the MRC-based NER task to improve model generalization ability.…”
Section: Related Work
confidence: 99%
“…However, the input text alone is often insufficient to support decent output, as it lacks commonsense, factual-event, and semantic information. Knowledge-grounded text generation, which incorporates external knowledge such as linguistic features (Liu et al., 2021c), knowledge graphs (Liu et al., 2021b; Li et al., 2021a), knowledge bases (Eric and Manning, 2017; Liu et al., 2022b), and textual knowledge (Liu et al., 2021a; Zhao et al., 2021), helps to generate more logical and informative answers.…”
Section: Related Work
confidence: 99%
“…Recently, few-shot data-to-text generation [20-23] has attracted increasing attention. Some approaches incorporate memory modules [26, 27], while others exploit data augmentation based on the existing training set [28, 29]. Kasner and Dusek [30] explored a zero-shot setting that uses handcrafted templates to transform triples into textual facts, and neural modules to plan and form the final text.…”
Section: Related Work, 2.1 Data-to-Text Generation
confidence: 99%
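To make the template idea in the excerpt above concrete, here is a minimal sketch of transforming (subject, relation, object) triples into textual facts via handcrafted templates. The specific template strings and the generic fallback are illustrative assumptions, not Kasner and Dusek's actual templates.

```python
# Minimal sketch: handcrafted templates turn each triple into a
# textual fact. The templates below are illustrative assumptions.
TEMPLATES = {
    "capital": "{subj}'s capital is {obj}.",
    "population": "{subj} has a population of {obj}.",
}

def triple_to_fact(subj, rel, obj):
    """Verbalize one (subject, relation, object) triple."""
    template = TEMPLATES.get(rel, "{subj} {rel} {obj}.")  # generic fallback
    return template.format(subj=subj, rel=rel, obj=obj)

facts = [triple_to_fact("France", "capital", "Paris"),
         triple_to_fact("France", "population", "67 million")]
# In the zero-shot pipeline, neural modules would then plan and fuse
# these facts into the final text.
print(" ".join(facts))
```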