Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP), 2019
DOI: 10.18653/v1/d19-1337

Multi-Task Learning with Language Modeling for Question Generation

Abstract: This paper explores the task of answer-aware question generation. Based on the attention-based pointer-generator model, we propose to incorporate an auxiliary task of language modeling to help question generation in a hierarchical multi-task learning structure. Our joint-learning model enables the encoder to learn a better representation of the input sequence, which guides the decoder to generate more coherent and fluent questions. On both the SQuAD and MARCO datasets, our multi-task learning model boosts the p…
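For intuition, here is a minimal sketch of the joint objective the abstract describes: a shared encoder feeding both a question-generation decoder and an auxiliary language-modeling head, with a weighted sum of the two losses. The class and parameter names (`JointQGLM`, `lm_weight`) are illustrative assumptions, not the authors' code, and the copy/pointer component is omitted for brevity.

```python
import torch
import torch.nn as nn

class JointQGLM(nn.Module):
    """Sketch: shared encoder with a question-generation decoder and an
    auxiliary language-modeling head (illustrative, not the authors'
    implementation)."""
    def __init__(self, vocab_size, emb_dim=300, hid_dim=512):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.encoder = nn.LSTM(emb_dim, hid_dim, batch_first=True)
        self.lm_head = nn.Linear(hid_dim, vocab_size)  # predicts the next source token
        self.decoder = nn.LSTM(emb_dim, hid_dim, batch_first=True)
        self.qg_head = nn.Linear(hid_dim, vocab_size)  # predicts the next question token

    def forward(self, src, tgt_in):
        enc_out, state = self.encoder(self.embed(src))
        lm_logits = self.lm_head(enc_out)              # auxiliary LM over the source
        dec_out, _ = self.decoder(self.embed(tgt_in), state)
        qg_logits = self.qg_head(dec_out)              # main QG objective
        return qg_logits, lm_logits

def joint_loss(qg_logits, lm_logits, tgt_out, src_next, lm_weight=0.5):
    """Weighted sum of the QG loss and the auxiliary LM loss."""
    ce = nn.CrossEntropyLoss(ignore_index=0)
    loss_qg = ce(qg_logits.transpose(1, 2), tgt_out)   # (B, V, L) vs (B, L)
    loss_lm = ce(lm_logits.transpose(1, 2), src_next)
    return loss_qg + lm_weight * loss_lm
```

The weighting scalar lets the auxiliary task regularize the encoder without dominating the generation objective.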


Cited by 28 publications (26 citation statements) · References 22 publications
“…• NQG-Knowledge [16], DLPH [12]: auxiliary-information-enhanced question generation models with extra inputs such as knowledge or difficulty. • Self-training-EE [38], BERT-QG-QAP [51], NQG-LM [55], CGC-QG [27] and QType-Predict [56]: multi-task question generation models with auxiliary tasks such as question answering, language modeling, question type prediction and so on.…”
Section: Evaluating ACS-aware Question Generation
confidence: 99%
“…[56] predicts the question type based on the input answer and context. [55] incorporates a language modeling task to help question generation. [51] utilizes question paraphrasing and question answering tasks to regularize the QG model to generate semantically valid questions.…”
Section: Related Work
confidence: 99%
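The question-type prediction mentioned in the statement above is typically a small classification head trained jointly with generation. A minimal sketch, assuming a pooled encoder state and an 8-way type set (the class name `QTypeHead` and the type inventory are illustrative, not from the cited papers):

```python
import torch
import torch.nn as nn

class QTypeHead(nn.Module):
    """Sketch: auxiliary question-type classifier over the pooled encoder
    state (the 8 types stand in for what/who/when/where/why/how/which/other)."""
    def __init__(self, hid_dim=512, n_types=8):
        super().__init__()
        self.proj = nn.Linear(hid_dim, n_types)

    def forward(self, enc_summary):      # enc_summary: (B, hid_dim)
        return self.proj(enc_summary)    # (B, n_types) type logits

def total_loss(loss_qg, type_logits, type_labels, weight=0.2):
    """Joint objective: generation loss plus weighted type-prediction loss."""
    return loss_qg + weight * nn.functional.cross_entropy(type_logits, type_labels)
```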
“…Recent neural network-based methods have achieved promising results on QG, most of which are based on the seq2seq attention framework (Du et al., 2017; Gao et al., 2018; Kim et al., 2018; Zhou et al., 2019b), enriched with lexical features (Sun et al., 2018; Song et al., 2018) or enhanced by copy mechanism (Du and Cardie, 2018; Sun et al., 2018; Zhou et al., 2019a).…”
Section: Sentence
confidence: 99%
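As a quick illustration of the copy mechanism this statement refers to, here is a minimal sketch of how a pointer-generator mixes the decoder's vocabulary distribution with attention-based copying of source tokens (in the style of See et al., 2017; tensor shapes and names are assumptions, not code from any cited paper):

```python
import torch

def pointer_generator_dist(p_gen, vocab_dist, attn, src_ids):
    """Sketch of the copy mechanism in pointer-generator QG models.
    p_gen:      (B, 1)  probability of generating from the vocabulary
    vocab_dist: (B, V)  softmax over the vocabulary
    attn:       (B, S)  attention weights over source positions
    src_ids:    (B, S)  source token ids
    Returns the final (B, V) distribution over output tokens."""
    generate = p_gen * vocab_dist
    copy = (1.0 - p_gen) * attn
    # scatter copy probabilities onto the vocab entries of the source tokens
    return generate.scatter_add(1, src_ids, copy)
```

Tokens that appear in the source thus receive probability mass from both generation and copying, which is what lets such models reproduce rare words from the passage.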
“…For current mainstream neural network-based methods on QG, most approaches utilize the Seq2Seq model with attention mechanism (Du et al., 2017; Zhao et al., 2018b; Zhou et al., 2019a). To obtain better representations of the input sequence and answer, the answer position and token lexical features are treated as supplements for the neural encoder (Song et al., 2018; Kim et al., 2018).…”
Section: Related Work
confidence: 99%
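A minimal sketch of how such answer-position and lexical features are commonly appended to the encoder input, assuming BIO answer-span tags and a POS feature; the class name, dimensions, and feature choices are illustrative assumptions:

```python
import torch
import torch.nn as nn

class FeatureRichEmbedding(nn.Module):
    """Sketch: concatenate word embeddings with answer-position (BIO) and
    lexical-feature embeddings before the encoder (illustrative dimensions)."""
    def __init__(self, vocab_size, n_pos_tags, emb_dim=300, feat_dim=16):
        super().__init__()
        self.word = nn.Embedding(vocab_size, emb_dim)
        self.answer_bio = nn.Embedding(3, feat_dim)    # B / I / O answer-span tags
        self.pos = nn.Embedding(n_pos_tags, feat_dim)  # e.g. POS lexical feature

    def forward(self, tokens, bio_tags, pos_tags):
        # (B, L, emb_dim + 2 * feat_dim) fed to the encoder
        return torch.cat(
            [self.word(tokens), self.answer_bio(bio_tags), self.pos(pos_tags)],
            dim=-1,
        )
```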