Instructor Talk—noncontent language used by instructors in classrooms—is a recently defined and promising variable for better understanding classroom dynamics. Having previously characterized the Instructor Talk framework within the context of a single course, we present here our results on the applicability of the Instructor Talk framework to noncontent language used by instructors in novel course contexts. We analyzed Instructor Talk in eight additional biology courses in their entirety and in 61 biology courses using an emergent sampling strategy. We observed widespread use of Instructor Talk, with variation in the amount and category type used. The vast majority of Instructor Talk could be characterized using the originally published Instructor Talk framework, suggesting the robustness of this framework. Additionally, a new form of Instructor Talk—Negatively Phrased Instructor Talk, language that may discourage students or distract from the learning process—was detected in these novel course contexts. Finally, the emergent sampling strategy described here may allow investigation of Instructor Talk in even larger numbers of courses across institutions and disciplines. Given its widespread use, potential influence on students in learning environments, and ability to be sampled, Instructor Talk may be a key variable to consider in future research on teaching and learning in higher education.
Active-learning pedagogies have been repeatedly demonstrated to produce superior learning gains with large effect sizes compared with lecture-based pedagogies. Shifting large numbers of college science, technology, engineering, and mathematics (STEM) faculty to include any active learning in their teaching may retain and more effectively educate far more students than having a few faculty completely transform their teaching, but the extent to which STEM faculty are changing their teaching methods is unclear. Here, we describe the development and application of the machine-learning-derived algorithm Decibel Analysis for Research in Teaching (DART), which can analyze thousands of hours of STEM course audio recordings quickly, with minimal costs, and without need for human observers. DART analyzes the volume and variance of classroom recordings to predict the quantity of time spent on single voice (e.g., lecture), multiple voice (e.g., pair discussion), and no voice (e.g., clicker question thinking) activities. Applying DART to 1,486 recordings of class sessions from 67 courses, a total of 1,720 h of audio, revealed varied patterns of lecture (single voice) and nonlecture activity (multiple and no voice) use. We also found that there was significantly more use of multiple and no voice strategies in courses for STEM majors compared with courses for non-STEM majors, indicating that DART can be used to compare teaching strategies in different types of courses. Therefore, DART has the potential to systematically inventory the presence of active learning with ∼90% accuracy across thousands of courses in diverse settings with minimal effort.

Keywords: active learning | evidence-based teaching | science education | lecture | assessment

Current college STEM (science, technology, engineering, and mathematics) teaching in the United States continues to be lecture-based and is relatively ineffective in promoting learning (1, 2).
Undergraduate instructors continue to struggle to engage, effectively teach, and retain postsecondary students, both generally and particularly among women and students of color (3, 4). Federal analyses suggest that a 10% increase in retention of undergraduate STEM students could address anticipated STEM workforce shortfalls (5). Replacing the standard lecture format with more active teaching strategies has been shown to increase
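The published DART classifier is machine-learning derived, but its core idea, that the volume and volume variance of classroom audio can distinguish single voice, multiple voice, and no voice activity, can be illustrated with a simple threshold-based sketch. Everything below (function name, window size, thresholds) is hypothetical for illustration only, not DART's actual algorithm:

```python
import numpy as np

def classify_windows(samples, rate, window_s=0.5,
                     silence_thresh=0.01, var_thresh=0.0005):
    """Label each audio window as 'no voice', 'single voice', or
    'multiple voice' from its mean volume and volume variance.
    Thresholds here are illustrative, not calibrated values."""
    win = int(rate * window_s)
    labels = []
    for i in range(len(samples) // win):
        chunk = np.abs(samples[i * win:(i + 1) * win])
        if chunk.mean() < silence_thresh:
            labels.append("no voice")        # quiet, e.g., thinking time
        elif chunk.var() > var_thresh:
            labels.append("multiple voice")  # overlapping talk fluctuates more
        else:
            labels.append("single voice")    # steadier level, e.g., lecture
    return labels
```

A real system would operate on decibel levels from long recordings and learn its decision boundaries from labeled class sessions rather than using fixed thresholds.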
The traditional format for neuroanatomy lab practical exams involves a series of stations, each with a time limit, that students cannot revisit. Timed exams have been associated with anxiety, which can lead to poor performance. In alignment with universal design for learning (UDL), Timed Image Question and Untimed Image Question exam formats were designed to determine which format supports student success, especially for students who performed poorly in the traditional format. Only the Untimed Image Question format allowed students to revisit questions. All three formats were administered in a randomized order within a course for three cohorts of medical students. When all students' scores were analyzed together, format type had no effect. However, when analyses were restricted to students who performed poorly in the traditional format, format type did have an effect: these students increased their scores, on average, by at least one grade level in the Untimed Image Question format compared with the traditional format. Students who performed well in the traditional format maintained their A, on average, in the two new formats. After experiencing all three formats, more students indicated the Untimed Image Question format as their most preferred. Most students associated the inability to revisit questions with high levels of anxiety. A neuroanatomy lab exam format was therefore identified that is consistent with the UDL framework, such that all students, regardless of test anxiety levels, can equally demonstrate what they learned: it allowed unlimited time per question and the ability to revisit questions.
Multimodal approaches to teaching anatomy have been shown to improve student performance (Johnson, 2012). Most studies of best practices for using technology to supplement teaching and learning in anatomy have been conducted in medical and undergraduate settings (Chakraborty, 2018; Sugand, 2010; Tam, 2020). Few published data outline best practices for implementing iPads in occupational therapy (OT) anatomy courses (Meyer, 2016). Here, we aim to determine best practices for implementing iPads, and to assess student preferences regarding them, for first-year OT students in an anatomy laboratory at Samuel Merritt University (SMU). Our objective is to formulate a curriculum template for implementing iPads in gross anatomy laboratory experiences. This study is significant because it adds to a body of literature that currently lacks best practices for implementing iPads in an OT anatomy course, and because it establishes best practices for enhancing student engagement in the anatomy laboratory. Best practices were documented for the implementation of thirteen iPads and eBooks in an OT anatomy lab course that met twice a week. OT students completed pre- and post-surveys that included five-point Likert-scale and free-text questions. The surveys asked students to rate their comfort level and preferences for learning with iPads. Survey results showed that 75% of students were comfortable using iPads, 37% strongly agreed that iPad applications improved their grade in the lab, and 43.75% believed that iPad applications helped their understanding of cadavers. A working document on best practices for implementing iPads in an anatomy laboratory space was also created.
Best practices for implementing iPads and eBooks in an anatomy laboratory include holding an orientation to the iPads and eBooks, assigning page numbers for eBook readings, instructing students on how to use the iPads beyond the eBooks, and using Poll Everywhere quizzes during laboratory sessions. The survey results show that OT students agree that iPad applications benefit their understanding of cadaveric anatomy, and initial implementation of the iPads in the anatomy laboratory confirmed the need for a best-practices document, which is absent from the literature. Our results indicate that iPad technology aids students in learning cadaveric anatomy and can be incorporated as a supplemental resource for teaching cadaver-based anatomy to OT students. The resulting working document on best practices can be disseminated to other anatomy faculty for use in their own anatomy laboratory courses. Future work includes determining how the eBooks affect student performance by tracking Poll Everywhere quiz scores, and how students actively use the eBooks in the anatomy laboratory and off campus by tracking eBook usage.
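The survey percentages reported above come from tabulating five-point Likert responses. As a minimal sketch of that tabulation, the snippet below computes the share of "agree" and "strongly agree" answers for one item; the response values are hypothetical, not the study's data:

```python
from collections import Counter

# Hypothetical five-point Likert responses for one survey item
# (1 = strongly disagree ... 5 = strongly agree), e.g.,
# "iPad applications helped my understanding of cadavers."
responses = [5, 4, 4, 2, 5, 3, 4, 1, 5, 4, 3, 4, 5, 2, 4, 5]

counts = Counter(responses)
n = len(responses)

# Percent who agreed (4) or strongly agreed (5), and strongly agreed alone.
pct_agree = 100 * (counts[4] + counts[5]) / n
pct_strongly_agree = 100 * counts[5] / n
```

Reporting "agree or strongly agree" together (a top-two-box summary) and "strongly agree" alone, as the abstract does, are both standard ways to summarize Likert items.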