Proceedings of the 17th Conference of the European Chapter of the Association for Computational Linguistics 2023
DOI: 10.18653/v1/2023.eacl-main.230

Closed-book Question Generation via Contrastive Learning

Xiangjue Dong, Jiaying Lu, Jianling Wang, et al.

Abstract: Question Generation (QG) is a fundamental NLP task for many downstream applications. Recent studies on open-book QG, where supportive answer-context pairs are provided to models, have achieved promising progress. However, generating natural questions under a more practical closed-book setting that lacks these supporting documents remains a challenge. In this work, we propose a new QG model for this closed-book setting that is designed to better understand the semantics of long-form abstractive answers and…

Cited by 6 publications (3 citation statements)
References 45 publications
“…As language models have advanced (Devlin et al., 2019; Raffel et al., 2020; Zhao et al., 2023; Dong et al., 2023; Zhao et al., 2021), numerous efforts have emerged to enhance the performance of MRC-based NER. incorporated different domain knowledge into the MRC-based NER task to improve model generalization ability.…”
Section: Related Work
confidence: 99%
“…To make the model generate predictions independent of biased attributes, it is important for sentences with similar semantics but along different bias directions to be closer (Cheng et al., 2021; He et al., 2022). We apply contrastive learning, of which the objective is to obtain meaningful representations by bringing semantically similar neighbors closer and pushing apart the dissimilar neighbors (Gao et al., 2021; Dong et al., 2023b; Li et al., 2023). In this work, input sentence s_i and its counterpart s′_i are semantically related but in opposite bias directions.…”
Section: Co2PT: Debiasing Via Counterfactual Contrastive Prompt Tuning
confidence: 99%
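The contrastive objective described in that statement — pulling semantically similar pairs together and pushing apart other in-batch examples — is commonly instantiated as an InfoNCE-style loss over cosine similarities (as in SimCSE, Gao et al., 2021). The sketch below is a minimal NumPy illustration of that general recipe, not the specific training code of any of the cited papers; the function name, batch layout, and temperature value are assumptions for illustration.

```python
import numpy as np

def info_nce_loss(anchors, positives, temperature=0.05):
    """Illustrative InfoNCE contrastive loss (not the cited papers' exact code).

    Each row of `anchors` is paired with the same-index row of `positives`;
    all other rows in the batch serve as in-batch negatives.
    """
    # L2-normalize embeddings so dot products become cosine similarities.
    a = anchors / np.linalg.norm(anchors, axis=1, keepdims=True)
    p = positives / np.linalg.norm(positives, axis=1, keepdims=True)
    sim = (a @ p.T) / temperature          # (batch, batch) similarity logits

    # Cross-entropy where the "correct class" for anchor i is positive i:
    # maximizing the diagonal pulls similar pairs together, the softmax
    # denominator pushes all other (dissimilar) pairs apart.
    logits = sim - sim.max(axis=1, keepdims=True)   # numerical stability
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_probs))
```

With aligned pairs (each positive close to its anchor) the loss approaches zero; with mismatched pairs it stays near log(batch_size), which is what drives representations of semantically related sentences closer during training.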
“…The objective of Question Generation is to produce well-structured, coherent, and valuable questions that correspond to a specific context passage and the intended answer. QG systems play a vital upstream role in enhancing the robustness and generalizability of Question Answering (QA) and Machine Reading Comprehension (MRC) models (Dong et al., 2023), empowering chatbots and virtual assistants to address more user needs (Gottardi et al., 2022), and powering AI-driven tutoring systems for educational purposes (Kurdi et al., 2019). For most existing QG systems, extracting qualified candidate answers from the context passage is an indispensable prerequisite to ensure that the generated questions are of high quality and relevant to the salient information in the context passage that interests the user.…”
Section: Introduction
confidence: 99%