Introduction
Poor retention of medical knowledge is a concern within medical education. While studies show that student retention of basic science material often follows the "forgetting curve," histology retention has not been examined independently of the other anatomical sciences. Investigating histology retention is increasingly important as medical education moves toward integrated curricula that use technological advances in the classroom, such as virtual microscopy. The purpose of this study was to evaluate histology retention in first-year students at the Medical College of Georgia (MCG).

Aims
The specific aims of this study were to evaluate the relationships between 1) histology retention and academic performance, 2) retention interval (RI) length and histology retention, and 3) histology retention and students' previous exposure to histology, as well as their modality of study.

Methods
Academic performance data from histology quizzes and exams were collected from first-year medical students at MCG from the Class of 2022 (n=171). A comprehensive histology assessment was administered at the end of the academic year to assess histology knowledge retained from the first-year histology curriculum. Students were also surveyed on their prior histology experience and study method modality. A linear regression analysis was performed to determine whether academic performance correlated with retention. A comparison of means was used to assess histology retention scores relative to academic performance in terms of RI, histology exposure, and modality of study. Paired-sample t-tests were used for these analyses. IRB approval (exempt) was obtained from Augusta University.

Results
First-year medical students at MCG retained only 52.4% ± 17.0% of histology content on the end-of-year comprehensive assessment. Academic performance in histology did not predict retention at the end of the academic year (R=0.27). Student scores dropped on average from 84% to 52% regardless of RI length (2, 3, 5, or 6 months). No significant difference was found between students with prior histology exposure (85.9% ± 4.9%) and those without (84.2% ± 5.3%) in overall histology grade averages. However, students with prior histology experience did score significantly better on the comprehensive assessment (58.5% ± 15.3% vs. 51.2% ± 17.2%; p=0.04). No significant difference was seen in average histology grades (84.5% ± 4.2% vs. 84.2% ± 16.6%) or comprehensive assessment performance (52.9% ± 6.9% vs. 51.4% ± 17.8%) when study modalities (virtual microscopy vs. physical slides) were compared.

Discussion and Conclusions
These data support previously reported findings that medical students retain on average ~50% of their basic science knowledge. The findings also demonstrate that academic performance is not a predictor of retention. Furthermore, RI and method of study appear to have no significant impact on histology retention. However, prior histology experience appears to aid retention and suggests that...
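The analyses named above (linear regression of retention on in-course performance, and a paired-sample t-test on the same students' two scores) can be sketched in Python. SciPy is an assumption here, not a tool the abstract names, and the scores below are randomly generated to mimic the reported means, not the study's data:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical per-student scores (percent) -- NOT the study's data.
course_avg = rng.normal(84, 5, size=171)      # in-course histology average
comprehensive = rng.normal(52, 17, size=171)  # end-of-year comprehensive score

# Linear regression: does in-course performance predict end-of-year retention?
res = stats.linregress(course_avg, comprehensive)
print(f"R = {res.rvalue:.2f}")

# Paired-sample t-test on the same students' two scores
# (the within-student drop from ~84% to ~52%).
t_stat, p_val = stats.ttest_rel(course_avg, comprehensive)
print(f"t = {t_stat:.1f}, p = {p_val:.2e}")
```

A low |R| with a large paired t-statistic reproduces the abstract's pattern: scores fall sharply for nearly everyone, so in-course grades carry little information about who retains more.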
Introduction
Medical students often experience some level of anxiety during their training, particularly concerning exams. In addition to tests within the medical curriculum, students and residents must pass the high-stakes United States Medical Licensing Examination (USMLE) standardized exams to earn their degree and obtain licensure. Test anxiety can severely impair a person's ability to perform. Identifying those who struggle with test anxiety early in their training may be useful for implementing targeted interventions that remain beneficial beyond graduation from medical school. The Westside Test Anxiety Scale (WTAS) is an assessment that measures performance impairment and worry as these factors relate to exams.

Aim
The aim of this study was to examine how test anxiety, as determined by the WTAS, contributes to the academic performance of first-year medical students. Based on previous studies, we hypothesized that overall test anxiety would negatively correlate with academic performance.

Methods
At the Medical College of Georgia, medical students (n=131 out of 191) completed a preclinical anxiety questionnaire at the end of the first-year curriculum. The survey consisted of the 10-question WTAS along with Likert-scale and narrative-response questions assessing students' experiences with anxiety in the curriculum. Students were subdivided into tiers of low anxiety (WTAS range: 1.0-1.9; n=33), normal anxiety (WTAS range: 2.0-2.9; n=75), and high anxiety (WTAS range: 3.0-5.0; n=23). Academic performance on multiple-choice quizzes and exams, fill-in-the-blank lab practicals, and the overall weighted first-year average was compared between the low, normal, and high anxiety cohorts using one-way ANOVA and Tukey's post-hoc analysis. IRB approval was obtained from Augusta University.

Results
When subdivided into tiers of low anxiety (avg. WTAS score 1.6 ± 0.3), normal anxiety (avg. WTAS score 2.5 ± 0.3), and high anxiety (avg. WTAS score 3.4 ± 0.4), students with high anxiety had a significantly lower overall grade (80.5 ± 5.5) than students with normal (84.1 ± 4.9) or low anxiety (85.5 ± 4.2) (p<0.05). Anatomy, histology, and neuroanatomy lab practical averages were not significantly different among students in the three WTAS tiers. However, performance on gross anatomy and histology multiple-choice questions on quizzes (Q) and module exams (ME) was significantly lower for students with high anxiety (Q: 77.5 ± 7.5; ME: 78.1 ± 8.2) than for students with low (Q: 83.4 ± 6.4; ME: 85.9 ± 5.2) or normal (Q: 82.4 ± 8.0; ME: 84.7 ± 6.9) anxiety (p<0.05).

Conclusions
This preliminary study found that increased test anxiety is negatively associated with academic performance in the first-year curriculum. Test anxiety negatively impacted performance on gross anatomy and histology written multiple-choice exam questions but not on fill-in-the-blank timed-station lab practicals, suggesting that question modality and/or exam type may negate the effect of anxiety on exam performance. Future studies will examine whether other fact...
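The tier comparison described above is a one-way ANOVA followed by a Tukey post-hoc test. A minimal sketch with SciPy (an assumed tool; the grades are simulated around the reported tier means and are not the study's data):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Hypothetical overall grades per WTAS anxiety tier -- NOT the study's data.
low_tier = rng.normal(85.5, 4.2, size=33)
normal_tier = rng.normal(84.1, 4.9, size=75)
high_tier = rng.normal(80.5, 5.5, size=23)

# One-way ANOVA across the three tiers. If the omnibus test is significant,
# a Tukey HSD post-hoc comparison (e.g. scipy.stats.tukey_hsd in SciPy >= 1.11)
# would identify which specific pairs of tiers differ.
f_stat, p_value = stats.f_oneway(low_tier, normal_tier, high_tier)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")
```

The ANOVA alone only says "at least one group differs"; the post-hoc step is what licenses pairwise claims like "high-anxiety students scored lower than low-anxiety students."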
Introduction
Medical imaging is the primary means of visualizing normal and pathological anatomy in clinical practice and is used in almost every specialty. As such, it is imperative that medical students learn to view and interpret radiological images early in training. At the Medical College of Georgia, medical imaging is integrated into first-year anatomy lectures and labs and is tested on written exams and lab practicals.

Aim
The aim of this study was to determine the study strategies students use for radiology and to evaluate their effectiveness. We hypothesized that students using active learning strategies would learn and retain radiology content more effectively.

Methods
Following completion of the first-year curriculum, medical students (n=140 out of 191) completed a survey to ascertain their radiology study strategies. Students also completed a radiology formative assessment that included 10 questions from previous lab practicals. Data from anatomy practicals throughout the year were analyzed. Students were split into quartiles (n=35 each) based on overall Anatomy grade (1st quartile: 88.6 ± 2.5; 2nd quartile: 82.7 ± 1.2; 3rd quartile: 78.7 ± 1.2; 4th quartile: 72.7 ± 3.7). One-way ANOVA with Tukey's post-hoc analysis and unequal-variance two-tailed t-tests were used to compare study strategies and grades among quartiles. IRB approval was obtained from Augusta University.

Results
The most commonly cited study strategies included attending faculty reviews (70%), studying individually (64%), using pre-labeled 2D images instead of 3D image stacks (55%), and using radiology websites (51%). Students scored significantly lower on the end-of-year questions (38.1 ± 2.1) than on lab practicals during the year (86.2 ± 1.2) (p<0.05). Students in the 1st and 2nd quartiles scored significantly higher on select radiology items from lab practicals throughout the year compared to the 4th quartile (1st: 92.6 ± 9.4, 2nd: 91.1 ± 6.7, 4th: 75.7 ± 14.0; p<0.05). End-of-year radiology assessment scores were also significantly higher in the 1st and 2nd quartiles compared to the 4th quartile (1st: 50.0 ± 19.6, 2nd: 42.6 ± 17.3, 4th: 28.0 ± 16.2; p<0.05). Students in the 1st quartile used active learning study strategies such as 3D image stacks (40% ± 4.9%) and practice questions (51.4% ± 5.0%) significantly more often than students in the 4th quartile (11.4% ± 3.2% and 20.0% ± 4.0%, respectively) (p<0.05).

Conclusions
These data show that the study strategies currently used by students do not promote long-term retention of radiology content; however, students using active learning strategies retain more. Although top-quartile students scored significantly better on the end-of-year radiology assessment, they retained only ~50% of the radiology material. Based on the results of this study, the radiology curriculum has been revised to address concerns about long-term retention. These changes include more frequent exposure to radiological images and more formative assessments that require students to scroll through 3D ima...
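The quartile grouping used above (140 students ranked by overall grade, split into four groups of 35) can be sketched with NumPy. The grades below are simulated placeholders, not the study's data, and the exact split procedure is an assumption:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical overall anatomy grades for 140 students -- NOT the study's data.
grades = rng.normal(80, 6, size=140)

# Rank students from highest to lowest grade, then split the ranking into
# four equal quartiles of 35 (1st quartile = highest grades).
order = np.argsort(grades)[::-1]
quartiles = np.array_split(order, 4)

for i, idx in enumerate(quartiles, start=1):
    print(f"quartile {i}: n={len(idx)}, "
          f"mean={grades[idx].mean():.1f} ± {grades[idx].std(ddof=1):.1f}")
```

Grouping by rank rather than by fixed grade cutoffs guarantees equal group sizes, which is what makes the between-quartile ANOVA comparisons in the abstract balanced.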
Introduction
Ensuring that medical students are challenged with critical thinking problems and can solve them successfully is important to their development as future physicians. Multiple-choice questions (MCQs), if written correctly, can promote the development of these critical thinking skills. Bloom's taxonomy is a framework used to classify cognitive skills into increasingly complex levels of learning (Remember, Understand, Apply, Analyze, Evaluate, and Create). Adapted versions of Bloom's taxonomy have been used to categorize MCQs into lower-order, non-critical thinking questions and higher-order, critical thinking questions.

Purpose and Hypothesis
The purpose of this study was to evaluate the distribution of Gross Anatomy and Development MCQs categorized as either non-critical thinking questions (Remember or Understand) or critical thinking questions (Apply/Analyze) and to analyze student performance on those questions. We hypothesized that students would perform better on non-critical thinking MCQs than on critical thinking MCQs.

Methods
At the Medical College of Georgia, the first-year medical curriculum consists of systems-based modular blocks composed of basic science components, including Gross Anatomy and Development. Academic performance is measured primarily by MCQ exams for each module. Gross Anatomy and Development MCQs (n=260) from the 2016-2017 academic year were evaluated and sorted into Remember, Understand, or Apply/Analyze categories by four evaluators. Interrater reliability was calculated using Krippendorff's α (α=0.54). Student academic performance (n=192 students) on the categorized MCQs was analyzed and compared using one-way ANOVA and Tukey's post-hoc analysis.

Results
Of the 151 Anatomy questions, 20.5% were categorized as Remember, 51% as Understand, and 28.5% as Apply/Analyze. Students scored similarly on all three categories of Anatomy MCQs (Remember: 77.3% ± 18%; Understand: 82.3% ± 13%; Apply/Analyze: 78.7% ± 16%; p=0.202). Of the 109 Development questions, 33% were categorized as Remember, 39.4% as Understand, and 27.5% as Apply/Analyze. Students performed significantly better on Development MCQs categorized as Understand compared to Remember (81.1% ± 11.5% vs. 72.4% ± 15.5%; p=0.015). There was no difference between Understand and Apply/Analyze MCQs (74.6% ± 13.2%; p=0.109) or between Remember and Apply/Analyze MCQs (p=0.799).

Conclusion
Students performed similarly across all categories of Anatomy MCQs. However, for Development, students performed significantly better on Understand MCQs than on Remember MCQs. Though Remember questions are one-step and straightforward, they may be difficult if they involve specific isolated facts, unlike the broader concepts tested by Understand questions. Apply/Analyze questions involve multiple steps or require students to connect several pieces of information to reach the correct answer, which may make them more difficult for students. Overall, fewer MCQs were categorized as critical thinking than expected, as it was initially thought that the majority of MCQs in the curriculum would involve higher-order reasoning skills. However, classification of MCQs by the evaluators was challenging because students may approach questions differently depending on prior knowledge and the study resources they use. This study identified opportunities to improve assessments by incorporating more MCQs that test higher-order critical thinking skills.

This abstract is from the Experimental Biology 2018 Meeting. There is no full text article associated with this abstract published in The FASEB Journal.
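The interrater-reliability statistic used above, Krippendorff's α for nominal categories, is defined as α = 1 − D_o/D_e, where D_o is the observed disagreement between raters and D_e the disagreement expected by chance. A minimal self-contained implementation is sketched below; the function and the example ratings are illustrative (the study's actual ratings are not published in the abstract):

```python
from collections import Counter
from itertools import permutations

def krippendorff_alpha_nominal(units):
    """Krippendorff's alpha for nominal data.

    `units` is a list of per-item rating lists, e.g. the categories several
    evaluators assigned to one MCQ. Items with fewer than two ratings are
    skipped, since they contribute no rater-pair information.
    """
    coincidence = Counter()
    for ratings in units:
        m = len(ratings)
        if m < 2:
            continue
        # Each ordered pair of ratings within a unit adds 1/(m-1) to the
        # coincidence matrix, so every unit contributes m total weight.
        for a, b in permutations(ratings, 2):
            coincidence[(a, b)] += 1.0 / (m - 1)

    n_c = Counter()  # marginal weight per category (row sums)
    for (a, _), w in coincidence.items():
        n_c[a] += w
    n = sum(n_c.values())

    observed = sum(w for (a, b), w in coincidence.items() if a != b)
    expected = sum(n_c[a] * n_c[b] for a in n_c for b in n_c if a != b) / (n - 1)
    return 1.0 - observed / expected

# Four hypothetical evaluators categorizing three MCQs (illustrative only).
items = [
    ["Remember", "Remember", "Remember", "Understand"],
    ["Understand", "Understand", "Understand", "Understand"],
    ["Apply/Analyze", "Apply/Analyze", "Understand", "Apply/Analyze"],
]
print(f"alpha = {krippendorff_alpha_nominal(items):.3f}")
```

An α of 0.54, as reported, indicates only moderate agreement among the four evaluators, which is consistent with the abstract's point that Bloom-level classification of MCQs is inherently difficult.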
Introduction
Metacognition refers to awareness of one's own knowledge and is fundamental for physicians. It encompasses how one plans when approaching a learning task, monitors comprehension, and reflects on understanding and performance. As medicine evolves, it is critical that medical students and physicians have insight into their limitations and knowledge deficiencies. Diagnostic and treatment errors are among the leading causes of death, and overconfidence, or a lack of metacognition, is often to blame. Metacognitive skills must be developed not only to achieve higher academic performance, but also to link medical knowledge to clinical thinking, emphasize critical thinking, and enhance reflection.

Aims
The purpose of this study was to examine the metacognitive skills of first-year medical students and evaluate the influence these skills have on academic performance. We hypothesized that there would be a significant relationship between higher metacognition and academic performance.

Methods
At the end of the first year, Medical College of Georgia students (n=119 out of 191) completed a survey consisting of the Metacognitive Awareness Inventory (MAI), which assesses knowledge and regulation of cognition (possible score: 61-305), and questions about study strategies. Students were divided into tertiles based on metacognition scores (low: 160.8 ± 13.1, n=41; middle: 185.5 ± 5.6, n=39; high: 214.7 ± 15.7, n=39). One-way ANOVA with Tukey's post-hoc analysis was used to compare metacognition scores with academic performance in the basic science components (Anatomy, Biochemistry, Development, Histology, Physiology, Neuroscience) and the systems-based modules (Cell & Molecular Basis of Medicine, Tissue/Musculoskeletal, Cardiopulmonary, Gastrointestinal-Nutrition, Genitourinary, Head/Neck & Special Senses, Medical Neuroscience & Behavioral Health), as well as with the frequency of study strategies used. IRB approval was obtained from Augusta University.

Results
Students with high metacognition scores performed significantly better in the curriculum (overall grade: 86.5 ± 5.1) than the middle (83.6 ± 4.8) and low tertiles (81.2 ± 5.0) (p<0.05). The highest metacognitive tertile scored significantly higher in every basic science component than the lowest tertile (p<0.05) and significantly better than the middle tertile in all of the anatomical science components (p<0.05). Medical students in the highest metacognitive tertile performed significantly better than the lowest tertile in every systems-based module except the first, Cell & Molecular Basis of Medicine (p<0.05). The highest metacognitive tertile was significantly more likely to peer teach and take notes from memory than the lowest tertile (p<0.05). Additionally, a significant difference in the frequency of reviewing at spaced intervals existed between all tertiles (p<0.05).

Conclusions
This preliminary study found that higher metacognition positively impacts academic performance and should be actively fostered. Medical students woul...