Introduction Medical imaging is the primary approach to visualize normal and pathological anatomy in clinical practice and is used in almost every specialty. As such, it is imperative that medical students learn how to view and interpret radiological images early in training. At the Medical College of Georgia, medical imaging is integrated into first-year anatomy lectures and labs and tested on written exams and lab practicals. Aim The aim of this study was to determine student study strategies for radiology and evaluate their effectiveness. We hypothesized that students utilizing active learning strategies would be more effective in learning and retaining radiology content. Methods Following completion of the first-year curriculum, medical students (n=140 out of 191) completed a survey to ascertain radiology study strategies. Students also completed a radiology formative assessment that included 10 questions from previous lab practicals. Data from anatomy practicals throughout the year were analyzed. Students were split into quartiles (n=35 each) based on overall Anatomy grade (1st quartile: 88.6 ± 2.5; 2nd quartile: 82.7 ± 1.2; 3rd quartile: 78.7 ± 1.2; 4th quartile: 72.7 ± 3.7). One-way ANOVA with Tukey's post-hoc analysis and unequal-variance, two-tailed t-tests were used to compare study strategies and grades among quartiles. IRB approval was obtained from Augusta University. Results The most commonly cited study strategies included attending faculty reviews (70%), individual studying (64%), using pre-labeled 2D images instead of 3D image stacks (55%), and using radiology websites (51%). Students scored significantly lower on the end-of-year questions (38.1 ± 2.1) compared to their performance on lab practicals during the year (86.2 ± 1.2) (p<0.05). Students in the 1st and 2nd quartiles scored significantly higher on select radiology items from lab practicals throughout the year compared to the 4th quartile (1st: 92.6 ± 9.4, 2nd: 91.1 ± 6.7, 4th: 75.7 ± 14.0; p<0.05). 
End of year radiology assessment scores were also significantly higher in the 1st and 2nd quartiles compared to the 4th quartile (1st: 50.0 ± 19.6, 2nd: 42.6 ± 17.3, 4th: 28.0 ± 16.2; p<0.05). Students in the 1st quartile used active learning study strategies such as 3D image stacks (40% ± 4.9%) and practice questions (51.4% ± 5.0%) significantly more often than students in the 4th quartile (11.4% ± 3.2% and 20.0% ± 4.0%, respectively) (p<0.05). Conclusions These data show that current study strategies used by students do not promote long-term retention of radiology content; however, students using active learning study strategies retain more. Although top quartile students scored significantly better on the end-of-year radiology assessment, they only retained ~50% of radiology material. Based on the results of this study, the radiology curriculum has been revised to address concerns about long-term retention. These changes include more frequent exposure to radiological images and more formative assessments that require students to scroll through 3D ima...
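For readers unfamiliar with the analysis, the quartile comparison above rests on a one-way ANOVA. Below is a minimal pure-Python sketch of the F statistic such a test computes; the score groups are made up for illustration and are not the study's data, and Tukey's post-hoc pairwise comparisons (as used in the study) would follow, e.g. via statsmodels' pairwise_tukeyhsd.

```python
from statistics import mean

def one_way_anova_f(groups):
    """One-way ANOVA F statistic for a list of score groups."""
    k = len(groups)                      # number of groups (e.g. quartiles)
    n = sum(len(g) for g in groups)      # total number of students
    means = [mean(g) for g in groups]
    grand = sum(sum(g) for g in groups) / n
    # Between-group sum of squares, df = k - 1
    ssb = sum(len(g) * (m - grand) ** 2 for g, m in zip(groups, means))
    # Within-group sum of squares, df = n - k
    ssw = sum((x - m) ** 2 for g, m in zip(groups, means) for x in g)
    return (ssb / (k - 1)) / (ssw / (n - k))

# Hypothetical scores for three groups (illustration only)
f = one_way_anova_f([[1, 2, 3], [2, 3, 4], [5, 6, 7]])
print(round(f, 2))  # 13.0
```

The resulting F would be compared against the F distribution with (k−1, n−k) degrees of freedom to obtain the p-value.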
Introduction Ensuring that medical students are challenged with critical thinking problems and can solve them successfully is important to their development as future physicians. Multiple-choice questions (MCQs), if written correctly, can promote the development of these critical thinking skills. Bloom's taxonomy is a framework used to classify cognitive skills into increasingly complex levels of learning (Remember, Understand, Apply, Analyze, Evaluate, and Create). Adapted versions of Bloom's taxonomy have been used to categorize MCQs into lower-order, non-critical thinking questions and higher-order, critical thinking questions. Purpose and Hypothesis The purpose of this study was to evaluate the distribution of Gross Anatomy and Development MCQs categorized as either non-critical thinking questions (Remember or Understand) or critical thinking questions (Apply/Analyze) and analyze student performance on those questions. We hypothesized that students would perform better on non-critical thinking MCQs compared to critical thinking MCQs. Methods At the Medical College of Georgia, the first-year medical curriculum consists of systems-based modular blocks composed of basic science components including Gross Anatomy and Development. Academic performance is measured primarily from MCQ exams for each module. Gross Anatomy and Development MCQs (n=260) from the 2016–2017 academic year were evaluated and sorted into Remember, Understand, or Apply/Analyze categories by four evaluators. Interrater reliability was calculated using Krippendorff's α (α=0.54). Student academic performance (n=192 students) on categorized MCQs was analyzed and compared using a one-way ANOVA and Tukey's post-hoc analysis. Results Of the 151 Anatomy questions, 20.5% were categorized as Remember, 51% as Understand, and 28.5% as Apply/Analyze. Students scored similarly on all three categories of Anatomy MCQs (Remember: 77.3% ± 18%; Understand: 82.3% ± 13%; Apply/Analyze: 78.7% ± 16%; p=0.202). 
Of the 109 Development questions, 33% were categorized as Remember, 39.4% as Understand, and 27.5% as Apply/Analyze. Students performed significantly better on Development MCQs categorized as Understand compared to Remember (81.1% ± 11.5% vs. 72.4% ± 15.5%; p=0.015). There was no difference between Understand and Apply/Analyze MCQs (74.6% ± 13.2%; p=0.109) or Remember and Apply/Analyze MCQs (p=0.799). Conclusion Students performed similarly across all categories of Anatomy MCQs. However, for Development, students performed significantly better on Understand MCQs compared to Remember MCQs. Though Remember questions are one-step and straightforward, they may be difficult if they involve specific isolated facts, unlike the broader concepts tested by Understand questions. Apply/Analyze questions involve multiple steps or require students to connect several pieces of information to arrive at the correct answer, which may make them more difficult for students. Overall, fewer MCQs were categorized as critical thinking than expected, as it was initially thought that the majority of MCQs in the curriculum would involve higher-order reasoning skills. However, classification of MCQs by evaluators was challenging because students may approach questions differently based on prior knowledge and study resources used. This study identified opportunities to improve assessments by incorporating more MCQs that test higher-order critical thinking skills. This abstract is from the Experimental Biology 2018 Meeting. There is no full text article associated with this abstract published in The FASEB Journal.
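The interrater reliability reported above (α=0.54) is Krippendorff's α. As a minimal illustration of how nominal α is computed, here is a pure-Python sketch for the simple case where every item is rated by all raters; the toy ratings are hypothetical, not the study's categorizations:

```python
from collections import Counter
from itertools import permutations

def krippendorff_alpha_nominal(units):
    """Krippendorff's alpha for nominal data; `units` is a list of
    rating lists, one per item, each with >= 2 ratings and no gaps."""
    o = Counter()  # coincidence matrix over ordered (value, value) pairs
    for ratings in units:
        m = len(ratings)
        # each ordered within-unit pair contributes 1/(m - 1)
        for i, j in permutations(range(m), 2):
            o[(ratings[i], ratings[j])] += 1 / (m - 1)
    n_c = Counter()
    for (c, _), v in o.items():
        n_c[c] += v
    n = sum(n_c.values())
    observed = sum(v for (c, k), v in o.items() if c != k)
    expected = sum(n_c[c] * n_c[k] for c in n_c for k in n_c if c != k) / (n - 1)
    return 1.0 - observed / expected  # assumes more than one category occurs

# Four hypothetical questions, each categorized by two raters
alpha = krippendorff_alpha_nominal([["A", "A"], ["B", "B"], ["A", "B"], ["A", "A"]])
print(round(alpha, 3))  # 0.533
```

Perfect agreement yields α=1; values well below 1, as in the study, reflect substantial rater disagreement.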
Introduction Poor retention of medical knowledge is a concern within medical education. While studies show that student retention from basic science disciplines often follows the “forgetting curve”, histology retention has not been examined independently of other anatomical sciences. Investigation of histology retention is of increasing importance as medical education moves towards integrated curricula with the use of technological advances in the classroom, such as virtual microscopy. The purpose of this study was to evaluate histology retention of first-year students at the Medical College of Georgia (MCG). Aims The specific aims of this study were to evaluate the relationships between 1) histology retention and academic performance, 2) retention interval (RI) length and histology retention, and 3) histology retention and students’ previous exposure to histology as well as their study modality. Methods Academic performance data from histology quizzes and exams were collected from first-year medical students at MCG from the Class of 2022 (n=171). A histology comprehensive assessment was administered at the end of the academic year to assess histology knowledge retained throughout the first-year histology curriculum. Students were also surveyed on their prior histology experience and study method modality. A linear regression analysis was performed to determine if there was a correlation between academic performance and retention. A comparison of means was used to assess the relationship between histology retention scores and academic performance in terms of RI, histology exposure, and modality of study. Paired sample t-tests were used for analyses. IRB approval (exempt) was obtained from Augusta University. Results First-year medical students at MCG retained only 52.4% ± 17.0% of histology content on the end-of-year comprehensive assessment. Academic performance in histology did not predict retention at the end of the academic year (R=0.27). 
Student retention dropped on average from 84% to 52% regardless of RI length (2, 3, 5, or 6 months). No significant difference was found between students with prior histology exposure (85.9% ± 4.9%) and those without (84.2% ± 5.3%) on overall histology grade averages. However, those with prior histology experience did score significantly better on the comprehensive assessment (58.5% ± 15.3% vs. 51.2% ± 17.2%; p=0.04). No significant difference was seen on average histology grades (84.5% ± 4.2%, 84.2% ± 16.6%) or comprehensive assessment performance (52.9% ± 6.9%, 51.4% ± 17.8%) when study modalities (virtual microscopy vs. physical slides) were compared. Discussion and Conclusions These data support previously reported findings that medical students retain on average ~50% of their basic science knowledge. These findings also demonstrate that academic performance is not a predictor of retention. Furthermore, RI and method of study appear to have no significant impact on histology retention. However, prior histology experience appears to aid retention and suggests that...
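The R=0.27 reported above is the correlation coefficient from the regression of retention on academic performance. A minimal sketch of the Pearson coefficient behind such an R value, using hypothetical grade/retention pairs rather than the study's data:

```python
from math import sqrt

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical histology course grades vs. end-of-year retention scores
r = pearson_r([80, 85, 90, 95], [50, 55, 48, 60])
```

An R of 0.27 corresponds to R² of roughly 0.07, i.e. course grades explain only about 7% of the variance in retention, consistent with the study's conclusion.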
Introduction Metacognition refers to awareness of one's own knowledge and is a fundamental skill for physicians. It encompasses how one plans when approaching a learning task, monitors comprehension, and reflects on understanding and performance. As medicine evolves, it is critical that medical students and physicians have insight into their limitations and knowledge deficiencies. Diagnostic and treatment errors are among the leading causes of death, and overconfidence, or a lack of metacognition, is often to blame. Metacognitive skills must be developed not only to achieve higher academic performance, but also to link medical knowledge to clinical thinking, emphasize critical thinking, and enhance reflection. Aims The purpose of this study was to examine the metacognitive skills of first-year medical students and evaluate the influence these skills have on academic performance. We hypothesized that there would be a significant relationship between higher metacognition and academic performance. Methods At the end of the first year, Medical College of Georgia students (n=119 out of 191) completed a survey consisting of the Metacognitive Awareness Inventory (MAI), which assesses knowledge and regulation of cognition (possible score: 61–305), and questions about study strategies. Students were divided into tertiles based on metacognition results (low: 160.8 ± 13.1, n=41; middle: 185.5 ± 5.6, n=39; high: 214.7 ± 15.7, n=39). One-way ANOVA with Tukey’s post-hoc analysis was used to compare metacognition scores with academic performance in basic science components (Anatomy, Biochemistry, Development, Histology, Physiology, Neuroscience) and systems-based modules (Cell & Molecular Basis of Medicine, Tissue/Musculoskeletal, Cardiopulmonary, Gastrointestinal-Nutrition, Genitourinary, Head/Neck & Special Senses, Medical Neuroscience & Behavioral Health), as well as the frequency of study strategies used. IRB approval was obtained from Augusta University. 
Results Students with high metacognition scores performed significantly better in the curriculum (overall grade: 86.5 ± 5.1) than the middle (overall grade: 83.6 ± 4.8) and low tertiles (overall grade: 81.2 ± 5.0) (p<0.05). The highest metacognitive tertile scored significantly higher in every basic science component compared to the lowest tertile (p<0.05) and significantly better than the middle tertile in all of the anatomical science components (p<0.05). Medical students in the highest metacognitive tertile scored significantly higher than the lowest tertile in every systems-based module except the first, Cell & Molecular Basis of Medicine (p<0.05). The highest metacognitive tertile was significantly more likely to peer teach and take notes from memory than the lowest tertile (p<0.05). Additionally, a significant difference in the frequency of reviewing at spaced intervals existed between all tertiles (p<0.05). Conclusions This preliminary study found that higher metacognition positively impacts academic performance and that metacognitive skills should be actively fostered. Medical students woul...
Introduction Medical students often experience some level of anxiety during their training, particularly concerning exams. In addition to tests within the medical curriculum, students and residents must pass the high-stakes United States Medical Licensing Examination (USMLE) standardized exams to earn their degree and for licensure. Test anxiety can severely impact a person’s ability to perform. Identification of those who struggle with test anxiety early in their training may be useful in implementing targeted interventions that will be beneficial beyond graduation from medical school. The Westside Test Anxiety Scale (WTAS) is an assessment that measures performance impairment and worry as these factors relate to exams. Aim The aim of this study was to examine how test anxiety, as determined by the WTAS assessment, contributes to academic performance of first-year medical students. Based on previous studies, we hypothesized that overall test anxiety would negatively correlate with academic performance. Methods At the Medical College of Georgia, medical students (n=131 out of 191) completed a preclinical anxiety questionnaire at the end of the first-year curriculum. The survey consisted of the 10-question WTAS along with Likert scale and narrative response questions assessing students’ experiences with anxiety in the curriculum. Students were subdivided into tiers of low anxiety (WTAS range: 1.0–1.9; n=33), normal anxiety (WTAS range: 2.0–2.9; n=75), and high anxiety (WTAS range: 3.0–5.0; n=23). Academic performance from multiple-choice quizzes and exams, fill-in-the-blank lab practicals, and overall weighted first-year average were compared between low, normal, and high anxiety cohorts using one-way ANOVA and Tukey’s post-hoc analysis. IRB approval was obtained from Augusta University. Results When subdivided into tiers of low anxiety (avg. WTAS score 1.6 ± 0.3), normal anxiety (avg. WTAS score 2.5 ± 0.3), and high anxiety (avg. 
WTAS score 3.4 ± 0.4), students with high anxiety had a significantly lower overall grade (80.5 ± 5.5) than students with normal (84.1 ± 4.9) or low anxiety (85.5 ± 4.2) (p<0.05). Anatomy, histology, and neuroanatomy lab practical averages were not significantly different among students in the three WTAS tiers. However, performance on gross anatomy and histology multiple-choice questions on quizzes (Q) and module exams (ME) was significantly lower for students with high anxiety (Q: 77.5 ± 7.5; ME: 78.1 ± 8.2) than for students with low (Q: 83.4 ± 6.4; ME: 85.9 ± 5.2) or normal (Q: 82.4 ± 8.0; ME: 84.7 ± 6.9) anxiety (p<0.05). Conclusions This preliminary study found that increased test anxiety has a negative association with academic performance in the first-year curriculum. Test anxiety negatively impacted performance on gross anatomy and histology written multiple-choice exam questions but not fill-in-the-blank timed station lab practicals, suggesting that question modality and/or exam type may moderate the effect of anxiety on exam performance. Future studies will examine whether other fact...
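The tier cut-offs stated in the Methods above (1.0–1.9 low, 2.0–2.9 normal, 3.0–5.0 high) amount to a simple binning rule. A sketch of that rule, with the boundary handling for fractional averages being an assumption on my part and the example scores hypothetical:

```python
def wtas_tier(score):
    """Assign a Westside Test Anxiety Scale score (1.0-5.0) to an
    anxiety tier using the cut-offs reported in the abstract."""
    if not 1.0 <= score <= 5.0:
        raise ValueError("WTAS scores range from 1.0 to 5.0")
    if score < 2.0:
        return "low"
    if score < 3.0:
        return "normal"
    return "high"

# Hypothetical average WTAS scores for three students
tiers = [wtas_tier(s) for s in (1.6, 2.5, 3.4)]
print(tiers)  # ['low', 'normal', 'high']
```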
Introduction Gross anatomy courses include large volumes of information that necessitate unique learning and study strategies for students to become proficient with the material. However, students often enter health professional schools with inadequate and inefficient study strategies, which place them at risk for academic difficulty in these courses. Early identification of poor learning and study strategies may allow for instructor interventions that significantly improve a student's future performance. The Learning and Study Strategies Inventory (LASSI) is an assessment that measures students' learning and study practices as well as attitudes via subscales related to skill (information processing, selecting main ideas, and test strategies), will (attitude, motivation, and anxiety), and self-regulation (concentration, time management, self-testing, and using academic resources) components of strategic learning. LASSI subscales have been correlated with academic performance of chiropractic and medical students, but little is known about how they correlate with academic performance of students in other health professions programs. Purpose & Hypothesis The purpose of this study was to examine whether learning and study strategies, as determined by the LASSI assessment, contribute to and/or predict academic performance of allied health students taking gross anatomy. Based on previous studies, we hypothesized that the LASSI subscales of test strategies, motivation, anxiety, concentration, and time management would be positive predictors of allied health student performance in gross anatomy. Methods At the beginning of the 9-week gross anatomy course, allied health (Physical Therapy, Occupational Therapy, and Physician Assistant) students (n=117 out of 130) completed the LASSI survey, a 10-scale, 60-item online assessment of students' self-awareness about and use of learning & study strategies. 
Allied health students were divided into tertiles (n=39 students each) based on final grades in the course (Low: grades 65.8–81.8; Middle: grades 82.5–90.1; High: grades 90.3–98.1). LASSI subscales between low, middle, and high cohorts were compared using a one-way ANOVA and Tukey's post-hoc analysis. Linear regression analyses were performed on each LASSI subscale to determine if there was a correlation with final grade. Results LASSI subscales for anxiety (β=0.453), attitude (β=0.484), concentration (β=0.388), motivation (β=0.488), test strategies (β=0.633), and time management (β=0.363) were significant predictors of overall performance in the class (p<0.05). LASSI subscales for anxiety (High: 21.05 vs. Low: 18.0), test strategies (High: 23.38 vs. Low: 24.41), and time management (High: 21.97 vs. Low: 19.56) were significantly different between low and high performing cohorts of students (p<0.05). Conclusions This preliminary study found that LASSI subscales for anxiety, test strategies, and time management were predictive of the final gross anatomy course performance of the allied health students. The fast-paced nature of the course may lend itself to increased anxiety and difficulty with time management for students. Students who already have difficulties in these areas and with test-taking strategies are at risk for poor performance. These results may be useful in creating early targeted interventions, such as counseling or anxiety and time management seminars, to help educators provide assistance for struggling students. This abstract is from the Experimental Biology 2018 Meeting. There is no full text article associated with this abstract published in The FASEB Journal.
Introduction Early identification of students struggling with gross anatomy due to inadequate study strategies is critical to prevent poor course outcomes and repetition of the course. Students often enter health professional schools with inadequate and inefficient study strategies. Faculty and teaching assistants give students advice at the beginning of the course about effective study strategies, but it is unclear if and when students implement those strategies. Few studies have addressed how to quickly identify at-risk students and how to remediate their difficulties prior to the end of a course. Purpose and Hypothesis The purpose of this project was to assess the effectiveness of formative practice written and lab exams to identify students at academic risk early in the learning process and prompt students to examine their study strategies. We hypothesized that performance on these formative assessments would predict future academic performance on subsequent exams. We also hypothesized that students would recognize early difficulties and adjust study habits and strategies accordingly. Methods Allied Health (Occupational Therapy, Physical Therapy, and Physician Assistant) students enrolled in a 9-week Gross Anatomy course during the summer of 2017 were included in the study (n=129). Following the first week of class, practice written and lab practical exams were administered to students for formative feedback. Students were prompted to complete a short study habits questionnaire prior to the start of class, following each exam, and at the end of the course. Academic performance was calculated from scores on three written and lab exams. Linear regression analyses were used to compare formative practice exam performance with future academic performance. Paired t-tests and multiple regression analysis were used to compare survey responses throughout the course. 
Survey responses were themed and quantified for analysis. Results The presence of formative practice exams did not improve overall class performance on the first set of written and lab practical exams when compared to exam performance in years where formative practice exams were not given (p=0.84). However, significant correlations were found between practice lab exam scores and first lab exam grades (R2=0.342, p<0.01) as well as final grades (R2=0.372, p<0.01). Significant correlations were also found between practice written exam scores and first written exam grades (R2=0.308, p<0.01) as well as final grades (R2=0.341, p<0.01). Survey results show that more students felt they studied adequately in preparation for the first set of exams than for the practice exams (Lab Practical: 4.36 vs. 3.53; Written Exam: 4.32 vs. 3.21; p<0.01). Conclusion Comparing class grades with and without a set of practice exams showed no improvement in academic performance from taking a practice exam before a graded exam. However, early formative practice assessments can be used to predict future academic performance within a class. These practice assessments could be used to identify students likely to struggle throughout the gross anatomy course as early as one week into the learning process. Targeted interventions to help students adjust their study strategies could then be applied early in the course to improve academic performance. This abstract is from the Experimental Biology 2018 Meeting. There is no full text article associated with this abstract published in The FASEB Journal.
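The R² values above come from simple linear regressions of exam grades on practice-exam scores. A minimal least-squares sketch of how such an R² is obtained, using hypothetical scores rather than the study's data:

```python
def r_squared(x, y):
    """Coefficient of determination for a simple least-squares fit of y on x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((a - mx) ** 2 for a in x)
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    slope = sxy / sxx
    intercept = my - slope * mx
    # R^2 = 1 - residual sum of squares / total sum of squares
    ss_res = sum((b - (slope * a + intercept)) ** 2 for a, b in zip(x, y))
    ss_tot = sum((b - my) ** 2 for b in y)
    return 1 - ss_res / ss_tot

# Hypothetical practice lab exam scores vs. first lab exam grades
r2 = r_squared([60, 70, 80, 90], [65, 72, 85, 88])
```

An R² near 0.34, as reported, means practice-exam scores explain roughly a third of the variance in later grades, enough to flag at-risk students, though far from deterministic.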
Introduction First-year medical students at the Medical College of Georgia learn to conduct the physical exam during Physical Diagnosis (PD). The physical exam is taught head to toe, similar to a clinician's technique for evaluating patients with no specific complaints. Students review online handouts/videos in preparation for PD workshops. However, students struggled with content because the workshops were completed before relevant anatomy was taught in basic science modules. To address this concern, Surface Anatomy sessions were introduced prior to the PD workshops. Purpose and Hypothesis The purpose of this study was to assess the effectiveness of Surface Anatomy sessions in teaching students the relevance of surface anatomy in performing the physical exam. We hypothesized that these sessions would improve students' knowledge of surface anatomy relevant to the physical exam. Methods Surface Anatomy sessions were integrated into PD addressing Vital Signs, the HEENOT (Head, Ears, Eyes, Nose, Oral Cavity, Throat) exam, the Cardiopulmonary exam, the Abdominal and Genitourinary/Gynecological (GU/GYN) exams, and the Neurologic/Musculoskeletal exams. At the beginning of the year, students (n=193) completed a pre-test with 16 Hotspot questions, in which students click an area on an image, to determine prior surface anatomy knowledge. During each session, students answered 3 Audience Response System (ARS) questions assessing knowledge acquisition. Students completed a post-test with the same Hotspot and ARS questions to assess retention at the conclusion of the curriculum. Students' performance on pre- and post-tests was analyzed using two-tailed, paired t-tests to determine the effectiveness of the Surface Anatomy curriculum in understanding and performing the physical exam. Results Students performed significantly better on post-test Hotspot questions after the Surface Anatomy curriculum compared to pre-test questions (42% ± 16.8% vs. 21.4% ± 11.4%; p<0.05). 
Despite learning relevant anatomy in basic science modules, students were still unable to correctly answer most questions at the end of the Surface Anatomy curriculum. Scores on post-test questions related to the Cardiopulmonary exam were surprisingly low (average 19% ± 11.8%), especially since that content was previously taught in a basic science module. Students performed better on post-test questions addressing the Abdomen and GU/GYN exam (average 70% ± 29.6%), which had just been covered in the basic science modules. Students performed significantly worse on ARS questions at the end of the Surface Anatomy curriculum compared to their performance on the same questions during the sessions (53.8% ± 21.7% vs. 88.5% ± 11.8%, p<0.05). Conclusion These results suggest a significant lapse in knowledge retention over the course of the curriculum. Overall, students appear to have difficulty recognizing, applying, and retaining their knowledge of surface anatomy relevant to the physical exam despite these sessions. In the future, more opportunities are needed to help students integrate surface anatomy knowledge with physical diagnosis skills. This abstract is from the Experimental Biology 2019 Meeting. There is no full text article associated with this abstract published in The FASEB Journal.
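The pre-/post-test comparisons above rely on two-tailed paired t-tests. A minimal sketch of the paired t statistic such a test computes, using hypothetical pre-/post-test percentages rather than the study's data:

```python
from math import sqrt

def paired_t(pre, post):
    """Paired-sample t statistic for matched pre/post scores; the
    two-tailed p-value would come from the t distribution with
    n - 1 degrees of freedom."""
    diffs = [b - a for a, b in zip(pre, post)]
    n = len(diffs)
    md = sum(diffs) / n
    # sample standard deviation of the paired differences
    sd = sqrt(sum((d - md) ** 2 for d in diffs) / (n - 1))
    return md / (sd / sqrt(n))

# Hypothetical Hotspot percentages for five students, pre vs. post
t = paired_t([20, 25, 18, 30, 22], [40, 45, 35, 50, 44])
```

In practice this would typically be delegated to scipy.stats.ttest_rel, which also returns the p-value.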