The forced move to online learning prompted by the arrival and persistence of the COVID-19 pandemic underscored the present lack of models for fully online general chemistry laboratory courses. Although a fair number of simulation platforms, video libraries, and one-off virtual experiments exist, no complete online general chemistry laboratory course was available, necessitating the development of such an experience. By leveraging freely available simulations and videos, we designed two synchronous online laboratory courses in the summer of 2020. Herein, the courses are described, with accompanying analysis probing student perceptions of their experiences.
Five years of longitudinal data from general chemistry student assessments at the University of Georgia have been analyzed using item response theory (IRT). Our analysis indicates that minor changes in question wording on exams can make significant differences in student performance on assessment questions. The analysis encompasses data from over 6100 students, yielding very small statistical uncertainty. IRT provided new insight into student performance on our assessments that is also important to the chemical education community. In this paper, IRT, in conjunction with computerized testing, shows how nuances in question wording affect student performance on assessments.
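The abstract does not specify which IRT model was fit. As a minimal illustrative sketch only, the two-parameter logistic (2PL) model, one common choice in IRT, expresses the probability of a correct response as a function of student ability and item difficulty; the effect of reworded questions can be pictured as a shift in an item's difficulty parameter. All parameter values below are hypothetical and are not taken from the study.

```python
import math

def irt_2pl(theta, a, b):
    """Two-parameter logistic (2PL) IRT model: probability that a
    student of ability theta answers correctly an item with
    discrimination a and difficulty b."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

# Hypothetical example: the same item under two wordings, modeled as
# equal discrimination but shifted difficulty (values are illustrative).
original = irt_2pl(theta=0.0, a=1.2, b=0.0)   # difficulty at the mean ability
reworded = irt_2pl(theta=0.0, a=1.2, b=0.4)   # slightly harder wording

print(round(original, 3))
print(round(reworded, 3))
```

In this framing, a wording change that raises the difficulty parameter lowers the predicted probability of a correct response for students of the same ability, which is the kind of effect an IRT analysis of reworded exam items can quantify.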