Proceedings of the 2nd Workshop on Representation Learning for NLP 2017
DOI: 10.18653/v1/w17-2603

Machine Comprehension by Text-to-Text Neural Question Generation

Abstract: We propose a recurrent neural model that generates natural-language questions from documents, conditioned on answers. We show how to train the model using a combination of supervised and reinforcement learning. After teacher forcing for standard maximum likelihood training, we fine-tune the model using policy gradient techniques to maximize several rewards that measure question quality. Most notably, one of these rewards is the performance of a question-answering system. We motivate question generation as a mea…
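The two-stage recipe in the abstract (maximum-likelihood teacher forcing, then policy-gradient fine-tuning against a reward) can be sketched with a toy REINFORCE loop. Everything below is illustrative, not the paper's implementation: the "model" is a single logit vector over a six-word vocabulary rather than an encoder-decoder over documents and answers, and the reward is a hand-written stand-in for the question-answering reward.

```python
import math
import random

random.seed(0)

VOCAB = ["what", "who", "is", "the", "capital", "?"]
V = len(VOCAB)

# Toy "generator": one logit vector reused at every decoding step.
logits = [0.0] * V

def softmax(z):
    m = max(z)
    e = [math.exp(x - m) for x in z]
    s = sum(e)
    return [x / s for x in e]

def sample(probs):
    return random.choices(range(V), weights=probs)[0]

def reward(ids):
    # Hypothetical stand-in for the paper's QA-system reward:
    # prefer questions that start with a wh-word and end with "?".
    r = 0.0
    if VOCAB[ids[0]] in ("what", "who"):
        r += 0.5
    if VOCAB[ids[-1]] == "?":
        r += 0.5
    return r

# Stage 1: maximum likelihood (teacher forcing) on one reference question.
reference = [VOCAB.index(w) for w in ("what", "is", "?")]
for _ in range(200):
    probs = softmax(logits)
    for t in reference:
        for i in range(V):
            # gradient of log p(t) w.r.t. logits: one-hot(t) - probs
            logits[i] += 0.1 * ((1.0 if i == t else 0.0) - probs[i])

# Stage 2: policy-gradient (REINFORCE) fine-tuning toward the reward.
baseline = 0.0
for _ in range(300):
    probs = softmax(logits)
    ids = [sample(probs) for _ in range(3)]
    r = reward(ids)
    baseline = 0.9 * baseline + 0.1 * r      # moving-average baseline
    adv = r - baseline                       # variance-reduced signal
    for t in ids:
        for i in range(V):
            logits[i] += 0.05 * adv * ((1.0 if i == t else 0.0) - probs[i])

probs = softmax(logits)
```

The second stage shifts probability mass toward token sequences the reward prefers, while the baseline reduces gradient variance; this mirrors, in miniature, the fine-tuning step the abstract describes.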

Cited by 143 publications (107 citation statements)
References 39 publications (61 reference statements)
“…Relying purely on a Seq2Seq model may not suffice to learn such a one-to-many mapping [9]. To resolve this issue, recent works assume the aspect is known when generating a question [16,31,32,38,42] or can be detected by a third-party pipeline [7]. [42] enriches the sequence-to-sequence model with an answer-position indicator that marks whether the current word is an answer word, and further incorporates a copy mechanism to copy words from the context when generating a question.…”
Section: Related Work
confidence: 99%
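The answer-position indicator attributed to [42] above marks, for each context token, whether it falls inside the answer span; one common encoding for such span features is BIO tagging. A minimal sketch, with function name and example data that are illustrative rather than taken from [42]:

```python
def answer_position_tags(context_tokens, answer_start, answer_len):
    """Tag each context token: B = span start, I = inside span, O = outside."""
    tags = []
    for i, _ in enumerate(context_tokens):
        if i == answer_start:
            tags.append("B")
        elif answer_start < i < answer_start + answer_len:
            tags.append("I")
        else:
            tags.append("O")
    return tags

context = "the capital of france is paris".split()
tags = answer_position_tags(context, answer_start=5, answer_len=1)
# "paris" (index 5) is the answer span, so it is tagged "B"; all other
# tokens are tagged "O". In the model, each tag would typically be
# embedded and concatenated to the corresponding word embedding.
```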
“…As a dual task of question answering, question generation based on a text passage and a given answer has attracted much attention in recent years. One of the key applications of question generation is to automatically produce question-answer pairs to enhance machine reading comprehension systems [8,34,35,38]. Another application is generating practice exercises and assessments for educational purposes [4,12,13].…”
Section: Introduction
confidence: 99%
“…This need to generate multiple questions for a sentence motivates our use of an answer signal. The model described by Yuan et al. (2017) also uses an answer-signal feature. However, by combining it with additional features and the question-specific sentence encoder, our model achieves better results, as we show in Section 3.…”
Section: Feature Supervision
confidence: 99%
“…Questions are usually asked to access the knowledge of others or to direct one's own information-seeking behavior. According to the authors of [27], the incentives to teach machines to ask questions are:…”
Section: Joint Model Based Training To Improve Performance Of Neu
confidence: 99%