2017 | DOI: 10.5206/cjsotl-rcacea.2017.1.11

Does Correct Answer Distribution Influence Student Choices When Writing Multiple Choice Examinations?

Abstract: Summative evaluation for large classes of first- and second-year undergraduate courses often involves the use of multiple choice question (MCQ) exams in order to provide timely feedback. Several versions of those exams are often prepared via computer-based question scrambling in an effort to deter cheating. An important parameter to consider when preparing multiple exam versions is that they must be equivalent in their assessment of student knowledge. This project investigated a possible influence of correct a… [abstract truncated]

Cited by 6 publications (7 citation statements)
References 23 publications
“…However, the rapid transition of these courses in mid-March 2020 to a 100% virtual learning environment, coupled with an FAS announcement that all further course assessments had to be conducted online, meant some different techniques were required. Necessary adjustments were also recently reported in administration of the 2020 Advanced Placement (AP) chemistry examination, and in assessments set by the American Chemical Society Division of Chemical Education Examinations Institute. This communication extends previous work published in areas of evaluation in education (particularly multiple-choice testing), and associated academic integrity concerns in the context of short order assessment redesign. The University of Toronto’s Office of Research Ethics has approved the reporting of anonymous student grade data in terms of secondary use.…”
Section: Introduction (supporting)
confidence: 52%
“…One important feature of the CHM 136H and CHM 247H multiple-choice final assignments was the “scrambling” of questions, which was straightforward to arrange through Canvas. This strategy has been reported previously, along with others such as multiple-choice examination “personalization”, and was managed differently in each course. For CHM 136H, the final assignment was created with 30 question groups and four “versions” of each question as a method of minimizing potential academic integrity issues. One question was randomly selected from each group, so the 917 students writing the assignment would largely have unique versions of it.…”
Section: Online Final Assignment Delivery (mentioning)
confidence: 99%
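The version-assembly scheme quoted above (one question drawn at random from each of 30 groups, four versions per group) is straightforward to sketch. The snippet below is a minimal illustration of that idea, not the authors' actual Canvas configuration; the group names, version labels, and printed output are placeholders.

```python
import random

# Hypothetical question bank: 30 "question groups", each holding four
# interchangeable versions of the same question (labels are illustrative).
question_groups = {
    f"Q{i}": [f"Q{i}-v{v}" for v in "ABCD"] for i in range(1, 31)
}

def build_exam(groups, rng=None):
    """Assemble one exam by drawing one random version from every group."""
    rng = rng or random.Random()
    return [rng.choice(versions) for versions in groups.values()]

# With 4**30 possible combinations, the 917 students mentioned above
# would almost all receive distinct exams.
exam = build_exam(question_groups)
print(exam[:3])
```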
“…This keying bias reveals that test developers do not balance (or randomize) the position of answer keys in tests, ignoring guidelines provided by item-writing guides for decades now (Trump and Haggerty, 1952; Haladyna and Downing, 1989a; Haladyna et al., 2002; Haladyna and Rodriguez, 2013). Implications for the validity of test scores may be critical: if test takers become aware that answer keys are more frequent among middle options, they can develop position-based strategies to make more accurate guesses and provide correct responses by selecting more central positions (Bar-Hillel and Attali, 2002; Bar-Hillel et al., 2005).…”
Section: Discussion (mentioning)
confidence: 99%
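One remedy implied by the passage above is to randomize the position of the answer key each time an item is rendered. The sketch below is a hedged illustration of that technique only; the helper name, sample item, and options are invented for the example.

```python
import random

def shuffle_options(options, correct_index, rng=None):
    """Shuffle answer options; return (shuffled options, new key position).

    Randomizing key position item-by-item avoids the middle-option
    keying bias discussed above.
    """
    rng = rng or random.Random()
    order = list(range(len(options)))
    rng.shuffle(order)
    shuffled = [options[i] for i in order]
    return shuffled, order.index(correct_index)

opts, key = shuffle_options(["3", "4", "5", "22"], correct_index=1)
print(opts, "-> key now at position", key)
```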
“…This was done by conducting Pearson Chi-Square Tests of Independence, one for each of the five types of options, namely, CR, MAD, Distractor 2, Distractor 3, and Distractor 4. The five resulting chi-square tests were all based on 2×2×2 contingency tables, with two levels for CR position (independent variable), two levels for MAD distance (independent variable), and two levels for option selection (dependent variable, either 1 or 0 depending on whether the option had been selected or not, see Carnegie, 2017 for an example of frequency analysis). Target Items 2 to 5 were entered as different layers in these analyses because we wanted to analyze option position effects for each item individually to evaluate findings’ replicability.…”
Section: Methods (mentioning)
confidence: 99%
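For readers unfamiliar with the analysis quoted above, a chi-square test of independence on a single 2×2 slice (correct-response position × option selection) can be run as below. The counts are invented for illustration; the full design described above is 2×2×2 with target items entered as layers.

```python
from scipy.stats import chi2_contingency

# Illustrative 2x2 slice (counts are invented for the example):
# rows = correct response in an early vs. late option position,
# columns = option selected vs. not selected.
table = [
    [120, 80],
    [95, 105],
]

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, p = {p:.4f}, dof = {dof}")
```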
“…One of the items’ features that can bias responses to multiple-choice items is the position of response options. Students do pay attention to options position (Carnegie, 2017) and avoid producing atypical answer sequences when choosing responses to consecutive items (Lee, 2019). Also, they have been shown to be unconsciously influenced by option position (Attali & Bar-Hillel, 2003).…”
Section: Introduction (mentioning)
confidence: 99%