Interest in spatial ability has grown over the past few decades following the emergence of correlational evidence associating spatial aptitude with educational performance in science, technology, engineering, and mathematics. Findings from both the research field at large and the anatomy education literature on this topic are mixed. In an attempt to generate consensus, a meta-analysis was performed to objectively summarize the effects of spatial ability on anatomy assessment performance across multiple studies and populations. Relevant studies published within the past 50 years (1969–2019) were retrieved from eight databases. Study eligibility screening was followed by full-text review and data extraction. Use of the Mental Rotations Test (MRT) was required for study inclusion. Of 2,450 screened records, 15 studies were meta-analyzed. Seventy-three percent of studies (11 of 15) were from the United States and Canada, and the majority (9 of 15) studied professional students. Across 15 studies and 1,245 participants, spatial ability was weakly associated with anatomy performance (rpooled = 0.240; 95% CI = 0.09–0.38; P = 0.002). Performance on spatial and relationship-based assessments (i.e., practical assessments and drawing tasks) was correlated with spatial ability, while performance on assessments using non-spatial multiple-choice items was not. A significant sex difference was also observed, wherein males outperformed females on spatial ability tasks. Given the role of spatial reasoning in learning anatomy, educators are encouraged to consider curriculum delivery modifications and a comprehensive assessment strategy so as not to disadvantage individuals with low spatial ability.
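A pooled correlation with a confidence interval, as reported above, is typically computed by Fisher z-transforming each study's correlation, weighting by sample size, and back-transforming. The following is a minimal fixed-effect sketch in Python; the study data are hypothetical placeholders, since the individual r and n values of the 15 meta-analyzed studies are not given here:

```python
import math

def pool_correlations(studies):
    """Fixed-effect pooling of Pearson correlations via Fisher's z transform.

    studies: list of (r, n) tuples, one per study.
    Returns (pooled_r, ci_low, ci_high) at the 95% level.
    """
    weights = [n - 3 for _, n in studies]          # inverse variance of z is n - 3
    zs = [math.atanh(r) for r, _ in studies]       # Fisher z transform
    w_sum = sum(weights)
    z_pooled = sum(w * z for w, z in zip(weights, zs)) / w_sum
    se = 1 / math.sqrt(w_sum)                      # standard error of pooled z
    z_low, z_high = z_pooled - 1.96 * se, z_pooled + 1.96 * se
    # Back-transform z values to the correlation scale
    return math.tanh(z_pooled), math.tanh(z_low), math.tanh(z_high)

# Hypothetical (r, n) pairs for illustration only
studies = [(0.30, 120), (0.15, 80), (0.25, 200)]
r, lo, hi = pool_correlations(studies)
```

A random-effects model, which is more common in published meta-analyses, would additionally estimate between-study variance (e.g., via DerSimonian–Laird) and add it to each study's weight denominator.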
Competition is a key element in many educational games and is often adopted by educators in an effort to motivate and excite their students. Yet the use of academic competition in educational institutions remains the subject of much debate. Opponents argue that academic competition increases student anxiety and divides students' attention. However, if the contexts of academic competition are clearly defined, could the inclusion of a game-like competition in a university course be a viable and beneficial method of engaging students? Students (n = 67) were recruited from an undergraduate human anatomy course at Western University. Using a crossover design, students were exposed to a competitive tournament at the time of either their first or second term test. The anatomical knowledge of participating students was assessed prior to the start of the study using a baseline anatomy test. Following treatment with an online competitive anatomy tournament, students' term test grades and final course grades were analyzed. Both second term test scores (F(2,64) = 3.743, P = 0.029) and overall course grades (F(2,64) = 3.356, P = 0.041) differed significantly (P < 0.05) between individuals in the competitive group and their non-competing peers. Consistent with literature correlating organized classroom competition with improved academic performance, this study found that participation in tournament-based competition was associated with significantly higher academic performance. In light of these positive results, further exploration of the effects of academic competition on student performance across age brackets and disciplines is warranted.
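The group comparisons above rest on one-way ANOVA F statistics (e.g., F(2,64) = 3.743). As an illustration of how such a statistic is computed, here is a minimal pure-Python sketch; the grade samples are invented for demonstration and do not reproduce the study's data:

```python
def one_way_anova_f(groups):
    """One-way ANOVA F statistic for k independent groups (lists of scores)."""
    all_vals = [x for g in groups for x in g]
    grand_mean = sum(all_vals) / len(all_vals)
    group_means = [sum(g) / len(g) for g in groups]
    # Between-group sum of squares: spread of group means around the grand mean
    ss_between = sum(len(g) * (m - grand_mean) ** 2
                     for g, m in zip(groups, group_means))
    # Within-group sum of squares: spread of scores around their own group mean
    ss_within = sum((x - m) ** 2
                    for g, m in zip(groups, group_means) for x in g)
    df_between = len(groups) - 1
    df_within = len(all_vals) - len(groups)
    return (ss_between / df_between) / (ss_within / df_within)

# Hypothetical grade samples for three groups (illustrative only)
groups = [[72, 75, 78], [70, 74, 76], [80, 83, 85]]
f_stat = one_way_anova_f(groups)
```

The resulting F value is compared against the F distribution with (df_between, df_within) degrees of freedom to obtain the P value; in practice this step is handled by a statistics package such as scipy.stats.f_oneway.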
Reductions in laboratory hours have increased the popularity of commercial anatomy e-learning tools. It is critical to understand how the functionality of such tools can influence the mental effort required during the learning process, also known as cognitive load. Using dual-task methodology, two anatomical e-learning tools were examined to determine the effect of their design on cognitive load during two joint learning exercises. A.D.A.M. Interactive Anatomy is a simplistic, two-dimensional tool that presents like a textbook, whereas Netter's 3D Interactive Anatomy offers a more complex, three-dimensional interface that allows structures to be rotated. It was hypothesized that longer reaction times on an observation task would be associated with the more complex anatomical software (Netter's 3D Interactive Anatomy), indicating a higher cognitive load imposed by the anatomy software, which would result in lower post-test scores. Undergraduate anatomy students from Western University, Canada (n = 70) were assessed using a baseline knowledge test, Stroop observation task response times (a measure of cognitive load), mental rotation test scores, and an anatomy post-test. Results showed that reaction times and post-test outcomes were similar for both tools, whereas mental rotation test scores were positively correlated with post-test scores when students used Netter's 3D Interactive Anatomy (P = 0.007), but not when they used A.D.A.M. Interactive Anatomy. This suggests that a simple e-learning tool, such as A.D.A.M. Interactive Anatomy, is as effective as more complicated tools, such as Netter's 3D Interactive Anatomy, and does not academically disadvantage those with poor spatial ability. Anat Sci Educ 9: 378-390. © 2015 American Association of Anatomists.
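The reported relationship between mental rotation scores and post-test performance is a Pearson correlation. A minimal pure-Python sketch of that computation follows; the score pairs are hypothetical stand-ins, as the study's raw data are not shown here:

```python
import math

def pearson_r(xs, ys):
    """Pearson product-moment correlation between two equal-length samples."""
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    sd_x = math.sqrt(sum((x - mean_x) ** 2 for x in xs))
    sd_y = math.sqrt(sum((y - mean_y) ** 2 for y in ys))
    return cov / (sd_x * sd_y)

# Hypothetical (mental rotation score, anatomy post-test score) pairs
mrt = [10, 14, 18, 22, 26]
post = [55, 62, 60, 70, 74]
r = pearson_r(mrt, post)
```

The P value for the correlation (e.g., the P = 0.007 reported above) is then obtained from a t test on r with n - 2 degrees of freedom, a step typically delegated to a statistics library such as scipy.stats.pearsonr.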
The COVID‐19 pandemic and subsequent social distancing protocols have accelerated the shift to online teaching across the globe. In Science, Technology, Engineering, and Mathematics (STEM) programs this means a shift from face‐to‐face laboratory instruction to self‐directed learning with e‐learning tools. Unfortunately, selecting and integrating an e‐learning tool into a curriculum can be daunting. This article highlights key questions and practical suggestions instructors should consider in choosing the most effective option for their course and learners.
Online educational technologies and e-learning tools are providing new opportunities for students to learn worldwide, and they continue to play an important role in anatomical sciences education. Yet, as we shift to teaching online, particularly within the anatomical sciences, it has become apparent that e-learning tool success is based on more than just user satisfaction and preliminary learning outcomes; rather, it is a multidimensional construct that should be addressed from an integrated perspective. The efficiency, effectiveness, and satisfaction with which a user can navigate an e-learning tool is known as usability, a construct that we propose can be used to quantitatively evaluate e-learning tool success. To assess the usability of an e-learning tool, usability testing should be employed during the design and development phases (i.e., prior to its release to users) as well as during its delivery (i.e., following its release to users). However, both the commercial educational software industry and individual academic developers in the anatomical sciences have overlooked the added value of such additional usability testing. Reducing learner frustration and anxiety during e-learning tool use is essential to ensuring e-learning tool success, and will require a commitment on the part of developers to engage in usability testing during all stages of an e-learning tool's life cycle. Anat Sci Educ 10: 190-199. © 2016 American Association of Anatomists.