2019
DOI: 10.1609/aaai.v33i01.33016423

Generating Distractors for Reading Comprehension Questions from Real Examinations

Abstract: We investigate the task of distractor generation for multiple-choice reading comprehension questions from examinations. In contrast to all previous work, we do not aim to produce single-word or short-phrase distractors; instead, we endeavor to generate longer, semantically rich distractors that are closer to the distractors found in real examination reading comprehension. Taking a reading comprehension article together with a question and its correct option as input, our goal is to generate several distractors which are …

Cited by 48 publications (65 citation statements). References 5 publications.
“…They employed the hierarchical encoder-decoder network as the base model and used the dynamic attention mechanism to generate the long distractors. Zhou et al. (2019) further strengthened the interaction between the question and the passage based on the model of Gao et al. (2018).…”
Section: Related Work
Confidence: 93%
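As a rough illustration of the hierarchical encoder these citing papers describe, the sketch below encodes a passage with a word-level BiLSTM over the tokens of each sentence, then a sentence-level BiLSTM over the resulting sentence vectors. This is a minimal sketch assuming PyTorch; the module names, mean-pooling choice, and dimensions are illustrative assumptions, not the published architecture.

```python
# Hedged sketch of a hierarchical (word-level, then sentence-level) encoder.
# All names and sizes are illustrative assumptions, not the authors' code.
import torch
import torch.nn as nn

class HierarchicalEncoder(nn.Module):
    def __init__(self, vocab_size, emb_dim=128, hid_dim=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        # Word-level encoder: runs over the tokens of each sentence.
        self.word_rnn = nn.LSTM(emb_dim, hid_dim, batch_first=True,
                                bidirectional=True)
        # Sentence-level encoder: runs over per-sentence summary vectors.
        self.sent_rnn = nn.LSTM(2 * hid_dim, hid_dim, batch_first=True,
                                bidirectional=True)

    def forward(self, passage):
        # passage: (batch, n_sents, n_words) token ids
        b, s, w = passage.shape
        tokens = self.embed(passage.view(b * s, w))         # (b*s, w, emb)
        word_states, _ = self.word_rnn(tokens)              # (b*s, w, 2h)
        sent_vecs = word_states.mean(dim=1).view(b, s, -1)  # mean-pool words
        sent_states, _ = self.sent_rnn(sent_vecs)           # (b, s, 2h)
        return word_states.view(b, s, w, -1), sent_states

encoder = HierarchicalEncoder(vocab_size=10000)
word_h, sent_h = encoder(torch.randint(0, 10000, (2, 6, 20)))
print(word_h.shape, sent_h.shape)  # (2, 6, 20, 256) and (2, 6, 256)
```

The two-level structure gives the decoder both fine-grained word states and coarser sentence states to attend over, which is what makes sentence-level dynamic attention possible in the first place.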
“…These word-level or entity-level methods can only generate short distractors and do not apply to the semantically rich, longer distractors of RACE-like MCQs. Recently, generating longer distractors has been explored in a few studies (Zhou et al., 2019; Gao et al., 2018). For example, Gao et al. (2018) propose a sequence-to-sequence based model, which leverages the attention mechanism to automatically generate distractors from the reading passages.…”
Section: Question
Confidence: 99%
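To make the attention mechanism mentioned in that statement concrete, here is a minimal sketch of one decoding step: the decoder state attends over the encoded passage and the resulting context vector would feed the next-token prediction. Dot-product (Luong-style) scoring, the name attention_step, and all shapes are assumptions for illustration; the cited models' exact scoring functions differ.

```python
# Hedged sketch of one attention step over encoded passage states.
import torch
import torch.nn.functional as F

def attention_step(dec_state, enc_states):
    # dec_state: (batch, hid); enc_states: (batch, src_len, hid)
    # Dot-product scores between the decoder state and each source state.
    scores = torch.bmm(enc_states, dec_state.unsqueeze(2)).squeeze(2)  # (batch, src_len)
    weights = F.softmax(scores, dim=1)       # attention distribution over the passage
    # Weighted sum of encoder states = context vector for this step.
    context = torch.bmm(weights.unsqueeze(1), enc_states).squeeze(1)   # (batch, hid)
    return context, weights

dec = torch.randn(2, 256)
enc = torch.randn(2, 30, 256)
ctx, w = attention_step(dec, enc)
print(ctx.shape, w.shape)  # (2, 256) and (2, 30)
```

In a full distractor generator, the context vector is typically concatenated with the decoder state before the output projection at every step, so each generated token is conditioned on the passage regions currently attended to.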