Research-based assessments represent a valuable tool for both instructors and researchers interested in improving undergraduate physics education. However, the historical model for disseminating and propagating conceptual and attitudinal assessments developed by the physics education research (PER) community has not resulted in widespread adoption of these assessments within the broader community of physics instructors. Within this historical model, assessment developers create high-quality, validated assessments, make them available for a wide range of instructors to use, and provide minimal (if any) support for administering the assessments or analyzing the results. Here, we present and discuss an alternative model for assessment dissemination, characterized by centralized data collection and analysis. This model provides a greater degree of support for both researchers and instructors in order to more explicitly encourage adoption of research-based assessments. Specifically, we describe our experiences developing a centralized, automated system for an attitudinal assessment we previously created to examine students' epistemologies and expectations about experimental physics. This system provides a proof of concept that we use to discuss the advantages of centralized administration and data collection for research-based assessments in PER. We also discuss the challenges that we encountered while developing, maintaining, and automating this system. Ultimately, we argue that centralized administration and data collection for standardized assessments is a viable and potentially advantageous alternative to the default model of decentralized administration and analysis. Moreover, with the help of online administration and automation, this model can support the long-term sustainability of centralized assessment systems.
The advent of new educational technologies has stimulated interest in using online videos to deliver content in university courses. We examined student engagement with 78 online videos that we created and incorporated into a one-semester flipped introductory mechanics course at the Georgia Institute of Technology. We found that students were more engaged with videos that supported laboratory activities than with videos that presented lecture content. In particular, the percentage of students accessing laboratory videos remained consistently above 80% throughout the semester, whereas the percentage of students accessing lecture videos dropped below 40% by the end of the term. Moreover, the fraction of students accessing the entirety of a video decreased as videos became longer, and this trend was more prominent for the lecture videos than for the laboratory videos. The results suggest that students may access videos based on perceived value: students appear to consider the laboratory videos essential for successfully completing the laboratories, while they appear to treat the lecture videos as something more akin to supplemental material. We also found little correlation between student engagement with the videos and either students' incoming background or their performance in the course. An examination of the in-video content suggests that students engaged more with concrete information explicitly required for assignment completion (e.g., actions required to complete laboratory work, or formulas and mathematical expressions needed to solve particular problems) and less with content that is more conceptual in nature. We also found that students' in-video accesses usually increased toward the embedded interaction points.
However, students did not necessarily access the follow-up discussion of these interaction points. The results of the study suggest ways in which instructors may revise courses to better support student learning. For example, external intervention that helps students see the value of accessing videos may be needed before this resource is put to more effective use. In addition, students may benefit more from a clicker question that reiterates important concepts within the question itself than from one that leaves important concepts to be addressed only in the discussion afterwards.