2019
DOI: 10.48550/arxiv.1909.03716
Preprint

Improving Neural Question Generation using World Knowledge

Abstract: In this paper, we propose a method for incorporating world knowledge (linked entities and fine-grained entity types) into a neural question generation model. This world knowledge helps to encode additional information related to the entities present in the passage required to generate human-like questions. We evaluate our models on both SQuAD and MS MARCO to demonstrate the usefulness of the world knowledge features. The proposed world knowledge enriched question generation model is able to outperform the vani…


Cited by 2 publications (3 citation statements). References 18 publications (36 reference statements).
“…In recent years, because graph neural networks (Kipf et al. 2016; Gilmer et al. 2017; Hamilton et al. 2017) have made great progress in representation learning, many researchers use them to capture the relationships between words in context (Fan et al. 2019; Huang et al. 2020). Some researchers have also begun to explore the role of GNNs in the QG task.…”
Section: Related Work
confidence: 99%
“…• SeqCopyNet [54], NQG++ [53], AFPA [42], seq2seq+z+c+GAN [50], and s2sa-at-mp-gsa [52]: answer-aware neural question generation models based on the Seq2Seq framework.
• NQG-Knowledge [16], DLPH [12]: auxiliary-information-enhanced question generation models with extra inputs such as knowledge or difficulty.
• Self-training-EE [38], BERT-QG-QAP [51], NQG-LM [55],…”
Section: Evaluating ACS-aware Question Generation
confidence: 99%
“…[22] identifies the content shared by a given question and answer pair as an aspect, and learns an aspect-based question generation model. [16] incorporates knowledge base information to ask questions. Compared with these works, our work doesn't require extra labeling or training overhead to get the training dataset.…”
Section: Related Work
confidence: 99%