Rationale and Objectives: Physicians receive little training in proper multiple-choice question (MCQ) writing. Well-constructed MCQs follow rules that ensure a question tests what it is intended to test; questions that break these rules are described as "flawed." We examined whether the prevalence of flawed questions differed significantly between writers with and without prior training in question writing, and between writers with different levels of educator experience.

Materials and Methods: We assessed 200 unedited MCQs from a question bank for our senior medical student radiology elective: an equal number of questions (50) were written by faculty with previous training in MCQ writing, other faculty, residents, and medical students. Two readers independently scored each question for the presence of 11 distinct flaws described in the literature.

Results: Questions written by faculty with MCQ-writing training had significantly fewer errors: a mean of 0.4 errors per question, compared to 1.5-1.7 errors per question for the other groups (P < .001). There were no significant differences in the total number of errors among the untrained faculty, residents, and students (P values .35-.91). Trained faculty wrote 17/50 (34%) flawed questions, whereas other faculty wrote 38/50 (76%), residents 37/50 (74%), and students 44/50 (88%). The trained writers' better performance was manifest mainly in a reduced frequency of five specific errors.

Conclusions: Faculty with training in effective MCQ writing made fewer errors in MCQ construction. Educator experience alone had no effect on the frequency of flaws; faculty without dedicated training, residents, and students performed similarly.

Key Words: Multiple-choice questions; educator experience; question flaws; education.

© AUR, 2015

Physicians are rarely trained to write multiple-choice examinations properly, including those working in academic settings.
However, this skill set has become much more relevant in recent years. With the transition to the new written format of radiology board certification examinations (1), the development of more rigorous self-assessment requirements for maintenance-of-certification examinations (2-4), and the greater inclusion of radiology in integrated medical student curricula (5), multiple-choice radiology questions are in great demand.

Well-constructed multiple-choice questions (MCQs) follow a set of parameters that ensure the question tests what it is intended to test (6-8). Questions that violate widely agreed-upon rules are described in the education literature as flawed (9-13). In simple terms, a flawed question tends to test how good a test taker someone is, rather than the relevant knowledge intended, which can disadvantage some students (10). Previous literature examining MCQs has revealed that such mistakes are common within continuing medical education (CME) materials (14,15) and on health care sciences examinations (10,16).

Previous authors have found that MCQ writing is improved after de...
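As an illustrative check only, and not the authors' actual analysis (the abstract reports its significance test on mean errors per question), the reported flawed-question counts can be compared between the trained-faculty group (17/50 flawed) and the untrained-faculty group (38/50 flawed) with a Pearson chi-square test on a 2x2 table. The helper function below is a hypothetical sketch written for this purpose:

```python
def chi_square_2x2(a, b, c, d):
    """Pearson chi-square statistic for the 2x2 table [[a, b], [c, d]]."""
    n = a + b + c + d
    row_totals = [a + b, c + d]
    col_totals = [a + c, b + d]
    stat = 0.0
    for i, observed in enumerate([a, b, c, d]):
        # Expected count under independence: row total * column total / n.
        expected = row_totals[i // 2] * col_totals[i % 2] / n
        stat += (observed - expected) ** 2 / expected
    return stat

# Counts from the abstract: trained faculty 17 flawed / 33 unflawed,
# untrained faculty 38 flawed / 12 unflawed.
chi2 = chi_square_2x2(17, 33, 38, 12)
print(round(chi2, 2))  # 17.82
```

With 1 degree of freedom, a statistic of about 17.8 corresponds to P < .001, consistent with the strong group difference the abstract reports.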