Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP), 2019
DOI: 10.18653/v1/d19-3030

ParaQG: A System for Generating Questions and Answers from Paragraphs

Abstract: Generating syntactically and semantically valid and relevant questions from paragraphs is useful in many applications. Manual generation is a labour-intensive task, as it requires the reading, parsing and understanding of long passages of text. A number of question generation models based on sequence-to-sequence techniques have recently been proposed. Most of them generate questions from sentences only, and none of them is publicly available as an easy-to-use service. In this paper, we demonstrate ParaQG, a …

Cited by 10 publications (10 citation statements)
References 19 publications
“…For QG specifically, building on early research with attention and the basic encoder-decoder setup (Zhou et al 2018), recent works have started exploring transformers (Chan and Fan 2019), variational encoders (Lee et al 2020), reinforcement learning (Wang et al 2020b), semantic information (Pan et al 2020) and future n-gram prediction (Qi et al 2020). However, research addressing content selection and answer-unaware QG is still preliminary, with some previous works employing supervision for training an answer span selection module alongside QG (Du and Cardie; Subramanian et al 2018) or simply treating noun phrases and named entities as potential answer cues for QG (Lewis, Denoyer, and Riedel 2019; Kumar et al 2019).…”
Section: Related Work
confidence: 99%
“…Also, systems like [46] introduce a pair-to-sequence model that captures the interaction between the question asked and the given paragraph. Specific systems like ParaQG [20] try to generate questions from the paragraph. Systems like [35] pick up keywords from the question and paragraph and match them using an RNN.…”
Section: Related Work
confidence: 99%
“…Most of these state-of-the-art models for question generation use a single sentence as input to generate questions, which is not useful for generating questions from educational content available as pre-recorded videos. ParaQG [25], an advanced NLP model, is introduced to tackle such problems and is capable of generating questions from paragraphs. ParaQG is based on a combination of NLP techniques: a Seq2Seq model with dynamic dictionaries, the copy mechanism and global sparse-max attention.…”
Section: AI-based Automatic Question Generation (AQG)
confidence: 99%
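The statement above describes ParaQG's decoder as a Seq2Seq model with a copy mechanism and sparse-max attention. As an illustration of the attention normalizer named there, a minimal NumPy sketch of the sparsemax transform (which, unlike softmax, can assign exactly zero weight to irrelevant source tokens) might look like the following; this is an illustrative sketch, not ParaQG's actual implementation:

```python
import numpy as np

def sparsemax(z):
    """Sparsemax: Euclidean projection of a score vector z onto the
    probability simplex. Returns a distribution that can be exactly
    sparse, i.e. some entries are precisely zero."""
    z = np.asarray(z, dtype=float)
    z_sorted = np.sort(z)[::-1]          # scores in descending order
    k = np.arange(1, len(z) + 1)         # candidate support sizes 1..n
    cumsum = np.cumsum(z_sorted)
    # support size: largest k with 1 + k * z_sorted[k-1] > cumsum[:k]
    k_max = k[1 + k * z_sorted > cumsum][-1]
    tau = (cumsum[k_max - 1] - 1) / k_max  # threshold
    return np.maximum(z - tau, 0.0)

# One dominant score gets all the mass; softmax would spread it out.
print(sparsemax([3.0, 0.0, 0.0]))  # → [1. 0. 0.]
```

In an attention layer, `z` would be the alignment scores over source tokens; the zeros produced by sparsemax give the selective, interpretable attention pattern that the citing papers attribute to ParaQG.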
“…VidVersityQG generates short-answer questions from two kinds of input: (1) the target segment chosen by a teacher in the second phase, and (2) the keywords chosen in the third phase. We employ an advanced NLP model based on a deep neural network, named ParaQG [25], to automatically generate short-answer questions. ParaQG can generate fluent, meaningful and relevant questions from a paragraph with given target answers.…”
Section: Short-answer Question Generation
confidence: 99%