2020
DOI: 10.48550/arxiv.2009.12677
Preprint
KG-BART: Knowledge Graph-Augmented BART for Generative Commonsense Reasoning

Abstract: Generative commonsense reasoning, which aims to empower machines to generate sentences with the capacity of reasoning over a set of concepts, is a critical bottleneck for text generation. Even state-of-the-art pre-trained language generation models struggle at this task and often produce implausible and anomalous sentences. One reason is that they rarely consider incorporating the knowledge graph, which can provide rich relational information among the commonsense concepts. To promote the ability of commonsen…

Cited by 9 publications (12 citation statements)
References 32 publications
“…Additionally, the pseudo labels of items provide additional rating information, which can alleviate the cold-start issue of the data. Intuitively, this technique works as a data augmentation method [26,28,59]. We sample those pseudo items and treat the difference between the rounded rating and the predicted rating as randomness, which enhances the robustness of the local model.…”
Section: Local Differential Privacy
confidence: 99%
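The augmentation this excerpt describes can be summarized in a short sketch. The code below is a hypothetical illustration, not the cited authors' implementation: it assumes a local recommender `model` exposing a `predict(item)` method that returns a real-valued rating, samples pseudo items, uses the rounded prediction as a pseudo label, and records the rounding gap as the injected randomness.

```python
import random

def augment_with_pseudo_items(model, unrated_items, sample_size=50):
    """Sample pseudo items; use rounded predicted ratings as pseudo labels.

    A minimal sketch of the described augmentation (hypothetical names);
    the rounding gap stands in for the randomness mentioned in the excerpt.
    """
    sampled = random.sample(unrated_items, min(sample_size, len(unrated_items)))
    pseudo_labeled = []
    for item in sampled:
        predicted = model.predict(item)   # real-valued rating, e.g. 3.7
        pseudo_label = round(predicted)   # rounded pseudo rating, e.g. 4
        noise = pseudo_label - predicted  # rounding gap acts as randomness
        pseudo_labeled.append((item, pseudo_label, noise))
    return pseudo_labeled
```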
“…KG-BART [51] first constructs a knowledge subgraph from a commonsense KG and uses GloVe-embedding-based [60] word similarity to prune irrelevant concepts. Graph attention is then employed to learn concept embeddings, which are integrated with token embeddings using concept-to-subword and subword-to-concept fusion.…”
Section: Commonsense KG Knowledge Subgraphs Are Also Very Useful For ...
confidence: 99%
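The pruning step this excerpt mentions lends itself to a small sketch. The following is a minimal, hypothetical illustration (names such as `glove` and `prune_candidates` are assumptions, not KG-BART's actual code): candidate concepts from the subgraph are kept only if their GloVe cosine similarity to at least one input concept exceeds a threshold.

```python
import numpy as np

def cosine(u, v):
    """Cosine similarity between two dense word vectors."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

def prune_candidates(input_concepts, candidates, glove, threshold=0.4):
    """Drop candidate concepts dissimilar to every input concept.

    glove: dict mapping word -> 1-D numpy array (pretrained GloVe vectors).
    """
    kept = []
    for cand in candidates:
        if cand not in glove:
            continue  # no embedding available, so the concept cannot be scored
        best = max(
            (cosine(glove[cand], glove[c]) for c in input_concepts if c in glove),
            default=0.0,
        )
        if best >= threshold:
            kept.append(cand)
    return kept
```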
“…By incorporating commonsense knowledge, KG-BART [51] and [44] show a stronger capability of generating natural and sensible text for a given concept set.…”
Section: Text Generation
confidence: 99%