2021
DOI: 10.1007/s40670-021-01305-y

Examining Bloom’s Taxonomy in Multiple Choice Questions: Students’ Approach to Questions

Abstract: Background: Analytic thinking skills are important to the development of physicians. Therefore, educators and licensing boards utilize multiple-choice questions (MCQs) to assess this knowledge and these skills. MCQs are written under two assumptions: that they can be written as higher or lower order according to Bloom’s taxonomy, and that students will perceive questions to be at the same taxonomical level as intended. This study seeks to understand students’ approach to questions by analyzing differences …


Cited by 30 publications (21 citation statements)
References 17 publications
“…Combining VR and AR for teaching the same structure(s) was found to help students develop a deeper understanding of anatomy (Deng et al., 2018), indicating a unique role of multimodal digital resources in anatomy. Therefore, to gauge the impact of a flipped anatomy classroom integrating multimodal digital resources on learning performance such as learning progress, in the present study a Bloom's taxonomy‐based assessment strategy that stratifies assessment activities into different cognitive levels (Thompson & O'Loughlin, 2015; Zaidi et al., 2018; Stringer et al., 2021) was adopted to design MCQs in three categories (Barkley, 2015): “remembering”, focused on knowing anatomy facts; “understanding”, focused on contrasting and comparing anatomy knowledge; and “applying”, focused on pushing clinical reasoning and critical thinking skills. Analysis of the mid‐ and end‐of‐semester quiz outcomes showed no significant differences in students' performance on the “remembering” and “understanding” questions for both units.…”
Section: Discussion (mentioning)
confidence: 99%
“…The mid‐semester test focused on assessing content delivered during the first half of the semester, while the end‐of‐semester test focused on content taught during the rest of the semester. Adopting the taxonomy‐based assessment strategy (Thompson & O'Loughlin, 2015; Zaidi et al., 2017; Zaidi et al., 2018; Stringer et al., 2021), MCQs were designed to target knowledge acquisition at three levels (Barkley, 2015): “remembering”, focused on memorizing anatomy facts and landmarks; “understanding”, focused on understanding surface and internal structures relative to their surroundings and associated functions; and “applying”, focused on the structural and functional application of the assessed structure(s) in a clinical scenario. The first‐year neuroanatomy unit tests comprised 40%, 40%, and 20% MCQs targeting knowledge acquisition at “remembering”, “understanding”, and “applying”, respectively, while the third‐year regional anatomy unit tests comprised 20%, 40%, and 40% MCQs targeting those same levels, respectively.…”
Section: Methods (mentioning)
confidence: 99%
“…On the other hand, students will have less well‐developed illness scripts, have a more rudimentary organisation of events, and may rely more on prototypical descriptions of individual signs and symptoms when reaching an answer (Schmidt and Rikers, 2007; Custers, 2015). Therefore, when a student's knowledge is based mainly on learning prototypical descriptions (using keywords or buzzwords) rather than on clinical experience, their pattern recognition of a given condition is likely to be less developed, so they lack awareness of the variability in disease presentation seen in the real world (Stringer et al., 2021). This is exemplified by Item 12 (Fig.…
Section: Discussion (mentioning)
confidence: 99%
“…The question types commonly used by educators to measure the cognitive domain are limited-response types in the form of multiple-choice questions and essay questions. Multiple-choice test questions can be used to measure learning outcomes that are more complex and related to the aspects of memory, understanding, application, analysis, and evaluation (Stringer et al., 2021). Multiple-choice test questions consist of a stem carrying the subject matter and the answer choices.…”
Section: Discussion (mentioning)
confidence: 99%