Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP), 2019
DOI: 10.18653/v1/d19-1614
Generating Highly Relevant Questions

Abstract: Neural seq2seq-based question generation (QG) is prone to generating generic, undiversified questions that are poorly relevant to the given passage and target answer. In this paper, we propose two methods to address the issue. (1) With a partial copy mechanism, we prioritize words that are morphologically close to words in the input passage when generating questions. (2) With a QA-based reranker, we select, from the n-best list of question candidates, questions that are preferred by both the QA and the QG model. E…

Cited by 7 publications (2 citation statements)
References 21 publications
“…Question generation (QG) has been found to be an effective DA method in open-domain MRC (Chan and Fan, 2019; Lopez et al., 2020). The main reported benefit is that it increases the diversity of questions (Qiu and Xiong, 2019; Sultan et al., 2020). Typically, QG models are fed a snippet s, select an answer span a of s, and generate a question q answered by a.…”
Section: Question Generation
Confidence: 99%
“…After Du et al. (2017) proposed a neural sequence-to-sequence model for QG, neural models that take context and answer as inputs began to be used to improve question quality with attention (Bahdanau et al., 2014) and copying (Gulcehre et al., 2016; Gu et al., 2016) mechanisms. Most works have focused on generating relevant questions from context-answer pairs (Zhou et al., 2018; Kim et al., 2019; Qiu and Xiong, 2019). These works showed the importance of answers as input features for QG.…”
Section: Question Generation
Confidence: 99%