Proceedings of the 2018 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, 2018
DOI: 10.18653/v1/n18-2090
Leveraging Context Information for Natural Question Generation

Abstract: The task of natural question generation is to generate a corresponding question given an input passage (fact) and an answer. It is useful for enlarging the training sets of QA systems. Previous work adopted sequence-to-sequence models that take as input a passage with an additional bit indicating the answer position. However, such models do not explicitly relate the answer to the other context within the passage. We propose a model that matches the answer with the passage before generating the question.…
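The "additional bit to indicate answer position" mentioned in the abstract can be made concrete with a minimal sketch: each passage token receives a 0/1 feature marking whether it falls inside the answer span. This is an illustrative reconstruction, not code from the paper; the function name and whitespace tokenization are assumptions.

```python
# Illustrative sketch of the answer-position "bit" feature used by prior
# Seq2Seq question-generation work (not code from the paper itself).

def tag_answer_position(passage_tokens, answer_tokens):
    """Return one 0/1 flag per passage token: 1 if the token lies inside
    the first occurrence of the answer span, 0 otherwise."""
    n, m = len(passage_tokens), len(answer_tokens)
    flags = [0] * n
    for i in range(n - m + 1):
        if passage_tokens[i:i + m] == answer_tokens:
            for j in range(i, i + m):
                flags[j] = 1
            break  # mark only the first occurrence
    return flags

passage = "the eiffel tower is located in paris".split()
answer = "paris".split()
print(tag_answer_position(passage, answer))  # [0, 0, 0, 0, 0, 0, 1]
```

In a real model, this flag vector would be embedded and concatenated to the word embeddings before encoding, which is precisely the limitation the paper targets: the bit marks where the answer is but does not model how it interacts with the surrounding context.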

Cited by 134 publications (130 citation statements) · References 13 publications
“…• PCFG-Trans [12] is a rule-based system that generates a question based on a given answer word span. • MPQG [31] proposed a Seq2Seq model that matches the answer with the passage before generating the question. • SeqCopyNet [43] proposed a method to improve the copying mechanism in Seq2Seq models by copying not only a single word but a sequence of words from the input sentence.…”
Section: Datasets, Metrics and Baselines
confidence: 99%
“…They adopt a BIO tagging scheme to incorporate the answer position information as learned embedding features in Seq2Seq learning. Song et al (2018) explicitly model the information between the answer and the sentence with a multi-perspective matching model. Kim et al (2019) also focus on the answer information and propose an answer-separated Seq2Seq model that masks the answer with special tokens.…”
Section: Related Work
confidence: 99%
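The BIO tagging scheme mentioned in the citation above can be sketched in a few lines: the first answer token is labeled B (begin), subsequent answer tokens I (inside), and everything else O (outside). This is a generic illustration of BIO answer tagging, not code from any of the cited papers.

```python
# Illustrative BIO tagging of an answer span inside a passage
# (a generic sketch, not taken from the cited work).

def bio_tags(passage_tokens, answer_tokens):
    """Label each passage token B (answer start), I (inside answer),
    or O (outside the answer), for the first span match."""
    n, m = len(passage_tokens), len(answer_tokens)
    tags = ["O"] * n
    for i in range(n - m + 1):
        if passage_tokens[i:i + m] == answer_tokens:
            tags[i] = "B"
            for j in range(i + 1, i + m):
                tags[j] = "I"
            break
    return tags

passage = "barack obama was born in hawaii".split()
answer = "barack obama".split()
print(bio_tags(passage, answer))  # ['B', 'I', 'O', 'O', 'O', 'O']
```

Compared with the single answer-position bit, BIO labels additionally distinguish the start of the answer span from its interior, which the tagger's embedding layer can exploit.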
“…We retained the same values for most hyperparameters in our experiments as the baseline system (Song et al., 2018).…”
Section: Baseline and Settings
confidence: 99%