Students' scores on assessments play a vital role in course modifications, though their effectiveness relies on how well those scores are interpreted. We adapt the notion of assessment as a change agent: a well-developed rubric, accompanied by intentionally designed instructor feedback, can serve as a tool to inform course improvement. In conjunction with ongoing work developing a standardized upper-division thermal physics assessment, this pilot work articulates a methodology for determining feedback that tells instructors how well their courses support students in meeting learning goals. In this paper, we illustrate this methodology with an example task targeting the scientific practice of "using mathematics." This work highlights the importance of assessment feedback for informing explicit course modifications in physics.
Science education literature has called for blending scientific practices with conceptual knowledge in higher education. With this move must also come a shift in the ways we assess students. To date, most research-based assessments focus primarily on conceptual knowledge rather than scientific practices, which may in part be due to the difficulty of assessing scientific practices. Additionally, most assessment items addressing scientific practices use free-response formats, which require in-class administration and onerous hand scoring. Coupled multiple-response (CMR) items offer a unique opportunity for assessing scientific practices because they elicit student reasoning while also allowing streamlined, automated scoring. Grounded in Evidence-Centered Design, this paper presents the first stages in developing a generalizable process for creating CMR items that address scientific practices. We illustrate this process through an example from upper-division thermal physics.
Research-based assessments have a productive and storied history in PER. While useful for conducting research on student learning, their utility is limited for instructors interested in improving their own courses. We have developed a new assessment design process that leverages three-dimensional learning, evidence-centered design, and self-regulated learning to deliver actionable feedback to instructors about supporting their students' learning. We are using this approach to design the Thermal and Statistical Physics Assessment (TaSPA), which also allows instructors to choose learning goals that align with their teaching. Perhaps more importantly, once completed, the system will be fully automated, making the assessment scalable with minimal burden on instructors and researchers. This work represents an advance in how we assess physics learning at large scale and how the PER community can better support physics instructors and students.