Practice exams are a form of deliberate practice that has been shown to improve student course performance. Deliberate practice differs from other types of practice because it is targeted, mentally challenging, can be repeated, and requires feedback. Providing frequent instructor feedback to students, particularly in large classes, can be prohibitively time-consuming. One possible solution is to have students grade practice exams using an instructor-generated rubric, receiving points only for completion. Students can grade either their own work or a peer's. We investigated whether peer or self-grading had a differential impact on completion of practice exam assignments, performance on practice exams or course exams, or student grading accuracy. We also investigated whether these outcomes varied with student characteristics. We found that 90% of students took all of the practice exams or missed only one, and that there was no difference in practice or course exam performance between the peer and self-graders. However, in the peer-grading treatment, students with lower incoming grade point averages and students identified as economically or educationally disadvantaged were less accurate and more lenient graders than other students. As there is no clear benefit of peer grading over self-grading, we suggest that either format can address the challenge instructors face in giving frequent, personalized feedback to many students.
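The abstract does not specify how grading accuracy and leniency were quantified; the sketch below shows one plausible way to compute them, assuming hypothetical column names (student_score, instructor_score, treatment) and defining leniency as the signed difference from the instructor's rubric score and accuracy as its absolute deviation. It is an illustration, not the authors' analysis.

```python
# Illustrative sketch only: the abstract does not state how grading accuracy or
# leniency were computed. Column names (student_score, instructor_score, treatment)
# and the input file are hypothetical placeholders.
import pandas as pd

grades = pd.read_csv("practice_exam_grades.csv")

# Leniency: signed difference from the instructor's rubric score
# (positive = student awarded more points than the rubric allows).
grades["leniency"] = grades["student_score"] - grades["instructor_score"]
# Accuracy proxy: absolute deviation from the instructor's rubric score.
grades["abs_error"] = grades["leniency"].abs()

# Compare the peer-grading and self-grading treatments.
print(grades.groupby("treatment")[["leniency", "abs_error"]].mean())
```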
Evidence-based teaching practices are associated with improved student academic performance. However, these practices encompass a wide range of activities, and determining which type, intensity, or duration of activity is effective at improving student exam performance has been elusive. To address this shortcoming, we used a previously validated classroom observation tool, the Practical Observation Rubric to Assess Active Learning (PORTAAL), to measure the presence, intensity, and duration of evidence-based teaching practices in a retrospective study of upper- and lower-division biology courses. We determined the cognitive challenge of exams by categorizing all exam questions obtained from the courses using Bloom's Taxonomy of Cognitive Domains. We used structural equation modeling to correlate the PORTAAL practices with exam performance while controlling for the cognitive challenge of exams, students' GPA at the start of the term, and students' demographic factors. Small-group activities, randomly calling on students or groups to answer questions, explaining alternative answers, and total time students spent thinking, working with others, or answering questions had positive correlations with exam performance. On exams at higher Bloom's levels, students explaining the reasoning underlying their answers, students working alone, and receiving positive feedback from the instructor also correlated with increased exam performance. Our study is the first to demonstrate a correlation between the intensity or duration of evidence-based PORTAAL practices and student exam performance while controlling for the Bloom's level of exams, and to examine more specifically which practices correlate with performance on exams at low and high Bloom's levels. This level of detail can provide valuable insights for faculty as they prioritize changes to their teaching. As we found that multiple PORTAAL practices had a positive association with exam performance, it may be encouraging for instructors to realize that there are many ways to benefit students' learning by incorporating these evidence-based teaching practices.
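To make the modeling step concrete, the sketch below fits a simplified structural equation model with the semopy library. The variable names (exam_score, small_group_time, random_call, explain_alternatives, incoming_gpa, bloom_level), the input file, and the single-equation specification are assumptions made for illustration; the study's actual model is not given in the abstract.

```python
# Illustrative sketch only: not the study's actual structural equation model.
# All variable names and the input file are hypothetical.
import pandas as pd
import semopy

data = pd.read_csv("portaal_course_data.csv")  # hypothetical input file

# Lavaan-style description: exam performance regressed on PORTAAL practice measures
# while controlling for incoming GPA and the Bloom's level of the exam.
desc = """
exam_score ~ small_group_time + random_call + explain_alternatives + incoming_gpa + bloom_level
"""

model = semopy.Model(desc)
model.fit(data)
print(model.inspect())  # path estimates, standard errors, and p-values
```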
Background: There is overwhelming evidence that evidence-based teaching improves student performance; however, traditional lecture still predominates in STEM courses. To support faculty as they transform their lecture-based classrooms with evidence-based teaching practices, we created a faculty development program based on best practices, the Consortium for the Advancement of Undergraduate STEM Education (CAUSE). CAUSE paired exploration of evidence-based teaching with support for classroom implementation over two years. Each year for three years, CAUSE recruited cohorts of faculty from seven STEM departments. Faculty met biweekly to discuss evidence-based teaching and receive feedback on their implementation. We used the PORTAAL observation tool to document evidence-based teaching practices (PORTAAL practices) across four randomly chosen class sessions each term. We investigated whether the number of PORTAAL practices used, or the extent of their use, increased during the program.
Results: We identified identical or equivalent course offerings taught at least twice by the same faculty member while in CAUSE (n = 42 course pairs). We used a one-way repeated measures within-subjects multivariate analysis to examine changes in average use of 14 PORTAAL practices between the first and second timepoints. We created heat maps to visualize differences in the number of practices used and changes in the level of implementation of each PORTAAL practice. Post-hoc within-subjects effects indicated that three PORTAAL practices were significantly higher and two were significantly lower at the second timepoint. Use of prompting prior knowledge and of calling on volunteers to give answers decreased, while instructors doubled their use of prompting students to explain their logic and increased their use of random call by almost 40% when seeking answers from students. Heat maps indicated that increases came both from faculty adopting these practices and from faculty using them more often, depending on the practice. Overall, faculty used more practices more frequently, which contributed to a 17% increase in the time that students were actively engaged in class.
Conclusions: Results suggest that participation in a long-term faculty development program can support increased use of evidence-based teaching practices, which have been shown to improve student exam performance. Our findings can help prioritize the efforts of future faculty development programs.
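The abstract describes a repeated measures within-subjects multivariate analysis plus heat maps of change between timepoints. The sketch below approximates both steps: a univariate repeated measures ANOVA per practice (a simplification of the multivariate analysis reported) and an instructor-by-practice heat map of change. Column names (instructor, practice, timepoint, level), the timepoint coding (1 and 2), and the input file are hypothetical assumptions.

```python
# Simplified sketch, not the study's analysis. The paper reports a one-way repeated
# measures within-subjects multivariate analysis across 14 practices; here a univariate
# repeated measures ANOVA is run per practice as an approximation. All column names
# and the input file are hypothetical; timepoint is assumed to be coded 1 and 2.
import pandas as pd
import matplotlib.pyplot as plt
from statsmodels.stats.anova import AnovaRM

df = pd.read_csv("cause_portaal_observations.csv")  # long format: one row per observation

# Per-practice test of change between the two timepoints (observations averaged
# within instructor and timepoint via aggregate_func).
for practice, sub in df.groupby("practice"):
    res = AnovaRM(sub, depvar="level", subject="instructor",
                  within=["timepoint"], aggregate_func="mean").fit()
    print(practice)
    print(res.anova_table)

# Heat map of per-instructor change (timepoint 2 minus timepoint 1) for each practice.
wide = df.pivot_table(index=["instructor", "practice"], columns="timepoint", values="level")
change = (wide[2] - wide[1]).unstack("practice")
plt.imshow(change, cmap="RdBu_r", aspect="auto")
plt.colorbar(label="Change in practice level (T2 - T1)")
plt.xticks(range(change.shape[1]), change.columns, rotation=90)
plt.yticks(range(change.shape[0]), change.index)
plt.tight_layout()
plt.show()
```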