Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing
DOI: 10.18653/v1/2021.emnlp-main.340

Improving Unsupervised Question Answering via Summarization-Informed Question Generation

Abstract: Question Generation (QG) is the task of generating a plausible question for a given ⟨passage, answer⟩ pair. Template-based QG uses linguistically-informed heuristics to transform declarative sentences into interrogatives, whereas supervised QG uses existing Question Answering (QA) datasets to train a system to generate a question given a passage and an answer. A disadvantage of the heuristic approach is that the generated questions are heavily tied to their declarative counterparts. A disadvantage of the super…
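As a toy illustration of the template-based approach the abstract describes (not the paper's actual system), a cloze-style heuristic can turn a declarative sentence plus an answer span into a question by substituting a wh-word; `template_qg` and its crude wh-word rule are hypothetical:

```python
import re

def template_qg(sentence: str, answer: str) -> str:
    """Toy template-based QG: replace the answer span with a wh-word
    to turn a declarative sentence into a cloze-style question."""
    # Crude heuristic: a capitalized answer is assumed to be a person/entity.
    wh = "Who" if answer and answer[0].isupper() else "What"
    pattern = re.compile(re.escape(answer), flags=re.IGNORECASE)
    question = pattern.sub(wh.lower(), sentence, count=1).rstrip(". ")
    return question[0].upper() + question[1:] + "?"

print(template_qg("Marie Curie discovered radium.", "Marie Curie"))
# → Who discovered radium?
```

The output stays lexically tied to the source sentence, which is exactly the disadvantage of heuristic QG the abstract points out.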

Cited by 26 publications (25 citation statements)
References 28 publications (42 reference statements)
“…Future work should seek to further explore the relationship between summarization and QG. Work done concurrently to ours by Lyu et al. (2021) already has promising results in this direction, showing that training a QG model on synthetic data from summarized text improves performance on downstream QA.…”
Section: Discussion
confidence: 90%
“…It is explored as a standalone task (Heilman and Smith, 2009; Nema et al., 2019), as a pre-training task for language models (Narayan et al., 2020) and as a component in solutions for other textual tasks, such as question answering (Puri et al., 2020), information retrieval (Mass et al., 2020; Gaur et al., 2021) and generation evaluation (Durmus et al., 2020; Honovich et al., 2021). There are two main directions to QG: template-based (Heilman and Smith, 2009; Lyu et al., 2021; Dhole and Manning, 2020) and neural-based, with the latter achieving state-of-the-art results (Narayan et al., 2020).…”
Section: VQ²A
confidence: 99%
“…Similarly, paraphrasing also demonstrates students' content comprehension skills (Haynes and Fillmer, 1984) since it requires the ability to convey the same semantic meaning in different language. However, both tasks are rarely studied for question generation (Lyu et al., 2021). Simplification, on the other hand, has demonstrated its value in language learning and reading comprehension (Tweissi, 1998; Inui et al., 2003; Petersen and Ostendorf, 2007; Rets and Rogaten, 2021), and is often treated as a preprocessing step to convert complex sentences into simpler versions before creating questions (Majumder and Saha, 2015; Patra and Saha, 2019).…”
Section: NLP Tasks for Education
confidence: 99%