Assessments that aim to evaluate student understanding of chemical reactions and reaction mechanisms should ask students to construct written or oral explanations of mechanistic representations; students can reproduce pictorial mechanism representations with minimal understanding of the meaning of the representations. Grading such assessments is time-consuming, which is a limitation for use in large-enrollment courses and for timely feedback for students. Lexical analysis and logistic regression techniques can be used to evaluate student written responses in STEM courses. In this study, we use lexical analysis and logistic regression techniques to score a constructed-response item which aims to evaluate student explanations about what is happening in a unimolecular nucleophilic substitution (i.e., SN1) reaction and why. We identify three levels of student explanation sophistication (i.e., descriptive only, surface-level why, and deeper why), and qualitatively describe student reasoning about four main aspects of the reaction: leaving group, carbocation, nucleophile and electrophile, and acid–base proton transfer. Responses scored as Level 1 (N = 113, 11%) include only a description of what is happening in the reaction and do not address the why for any of the four aspects. Level 2 responses (N = 549, 53%) describe why the reaction is occurring at a surface level (i.e., using solely explicit features or mentioning implicit features without deeper explanation) for at least one aspect of the reaction. Level 3 responses (N = 379, 36%) explain the why at a deeper level by inferring implicit features from explicit features, explained using electronic effects, for at least one reaction aspect. We evaluate the predictive accuracy of two binomial logistic regression models for scoring the responses with these levels, achieving 86.9% accuracy (with the testing data set) when compared to human coding.
The lexical analysis methodology and emergent scoring framework could be used as a foundation from which to develop scoring models for a broader array of reaction mechanisms.
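The scoring approach the abstract describes can be illustrated with a minimal sketch: bag-of-words lexical features feed a binomial logistic regression that separates "descriptive only" responses from those that explain the why. This is not the authors' actual pipeline or data; the responses, labels, and hyperparameters below are invented for illustration, and the model is trained from scratch with plain stochastic gradient descent.

```python
# Hedged sketch only: toy lexical analysis + binomial logistic regression
# for scoring written explanations. All responses and labels are invented.
import math
import re
from collections import Counter

def tokenize(text):
    """Lowercase and split a response into word tokens."""
    return re.findall(r"[a-z]+", text.lower())

def featurize(texts):
    """Build bag-of-words count vectors over a shared vocabulary."""
    vocab = sorted({tok for txt in texts for tok in tokenize(txt)})
    index = {word: i for i, word in enumerate(vocab)}
    rows = []
    for txt in texts:
        counts = Counter(tokenize(txt))
        rows.append([counts.get(word, 0) for word in vocab])
    return rows, index

def train_logreg(X, y, lr=0.5, epochs=200):
    """Fit binomial logistic regression with per-sample gradient steps."""
    w = [0.0] * len(X[0])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            z = sum(wj * xj for wj, xj in zip(w, xi)) + b
            p = 1.0 / (1.0 + math.exp(-z))   # sigmoid
            g = p - yi                        # gradient of log-loss wrt z
            w = [wj - lr * g * xj for wj, xj in zip(w, xi)]
            b -= lr * g
    return w, b

def predict(w, b, x):
    z = sum(wj * xj for wj, xj in zip(w, x)) + b
    return 1 if 1.0 / (1.0 + math.exp(-z)) >= 0.5 else 0

# Toy data: 1 = explains the why (electronic effects), 0 = descriptive only.
responses = [
    "the leaving group leaves and the nucleophile attacks the carbocation",
    "the bond breaks and then a new bond forms",
    "the tertiary carbocation is stabilized by electron donating groups so it forms readily",
    "hyperconjugation stabilizes the carbocation making the ionization favorable",
]
labels = [0, 0, 1, 1]

X, index = featurize(responses)
w, b = train_logreg(X, labels)
preds = [predict(w, b, x) for x in X]
```

In practice a trained model like this would be evaluated on a held-out testing set against human coding, as the study does; a three-level rubric would use two such binomial models or a multinomial one.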
The Academic Motivation Scale-Chemistry (AMS-Chemistry), an instrument based on self-determination theory, was used to evaluate students' motivation in two organic chemistry courses, where one course was primarily lecture-based and the other implemented flipped classroom and peer-led team learning (Flip–PLTL) pedagogies. Descriptive statistics showed that students in both courses were more extrinsically motivated and their motivation moved in negative directions across the semester. Factorial multivariate analysis of covariance revealed a main effect of pedagogical approach. Students in the Flip–PLTL environment were significantly more motivated toward chemistry at the end of the semester while controlling for the motivation pre-test scores; however, there was no evidence for a sex main effect or an interaction effect between sex and pedagogical approach. Correlation results revealed variable relationships between motivation subscales and academic achievement at different time points. In general, intrinsic motivation subscales were significantly and positively correlated with student academic achievement; amotivation was negatively correlated with academic achievement. The findings in this study showed the importance of Flip–PLTL pedagogies in improving student motivation toward chemistry.
As a way to assist chemistry departments with programmatic assessment of undergraduate chemistry curricula, the ACS Examinations Institute is devising a map of the content taught throughout the undergraduate curriculum. The structure of the map is hierarchical, with large grain size at the top and more content detail as one moves "down" through its four levels. This paper presents these four levels of the map with reference to second-year organic chemistry.
While research on and development of evidence‐based instructional practices (EBIPs) in STEM education has flourished, implementation of these practices in classrooms has not been as prolific. Using the teacher‐centered systemic reform model as a framework, we explore the connection between chemistry instructors' beliefs about teaching and learning and self‐efficacy beliefs, and their enacted classroom practices. Postsecondary chemistry faculty present a unique population for the study because of their role in teaching prerequisite courses, such as general and organic chemistry, which are key to many science major fields. A measure of teacher beliefs and self‐efficacy was administered via a national survey of postsecondary chemistry faculty members. Instructional practices used in a chemistry course were also collected via self‐report. While instructional practices were not directly observed, a cluster analysis of our data mirrors patterns of instructional practices found in observation‐based studies of chemistry faculty. Significant differences are found on teacher thinking and self‐efficacy measures based on enacted instructional practices. Results support the hypothesized connection between beliefs and instructional practice on a larger scale than in previous studies of this relationship, bolstering the evidence for the importance of this relationship over previously criticized results. These results present a call for reform efforts to foster change from its core, that is, the beliefs of those who ultimately adopt EBIPs. Dissemination and design should incorporate training and materials that highlight the process by which faculty members interpret reformed practices within their belief system, and explore belief change in the complex context of education reform.
A national survey of inorganic chemists explored the self-reported topics covered in foundation-level courses in inorganic chemistry at the postsecondary level; the American Chemical Society's Committee on Professional Training defines a foundation course as one at the conclusion of which, "a student should have mastered the vocabulary, concepts, and skills required to pursue in-depth study in that area." Anecdotal evidence suggested that more than one type of Inorganic Chemistry Foundation course was offered in the undergraduate chemistry curriculum. Cluster analysis confirmed this evidence, revealing four distinct foundation courses, each with unique profiles of topics covered. Faculty reported changes in content coverage over the past five years that mirror the evolving foci of inorganic chemistry research. These results potentially complicate how graduate programs evaluate incoming students' understanding of inorganic chemistry and the design of national assessments of undergraduate inorganic chemistry courses.
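The cluster analysis described above can be sketched in miniature: each course becomes a vector over a topic checklist, and a clustering algorithm groups courses with similar coverage profiles. The study's actual topics, data, and clustering method are not reproduced here; the course labels, topic list, choice of k-means, and k = 2 below are all illustrative assumptions.

```python
# Hedged sketch only: k-means clustering of hypothetical topic-coverage
# profiles, loosely analogous to clustering survey responses about course
# content. Courses, topics, and k are invented for the example.
import random

TOPICS = ["symmetry", "bonding theory", "coordination chemistry",
          "organometallics", "bioinorganic", "solid state",
          "descriptive chemistry", "electrochemistry"]

# 1 = topic covered in the course, 0 = not covered (fabricated data).
courses = {
    "Course A": [1, 1, 1, 0, 0, 0, 1, 1],
    "Course B": [1, 1, 1, 0, 0, 0, 1, 0],
    "Course C": [0, 0, 1, 1, 1, 0, 0, 0],
    "Course D": [0, 0, 1, 1, 1, 1, 0, 0],
}

def kmeans(points, k, iters=50, seed=0):
    """Plain k-means: assign each point to the nearest center, then
    recompute each center as the mean of its members."""
    rng = random.Random(seed)
    centers = [list(p) for p in rng.sample(points, k)]
    assign = [0] * len(points)
    for _ in range(iters):
        for i, p in enumerate(points):
            assign[i] = min(
                range(k),
                key=lambda c: sum((a - b) ** 2 for a, b in zip(p, centers[c])),
            )
        for c in range(k):
            members = [p for i, p in enumerate(points) if assign[i] == c]
            if members:
                centers[c] = [sum(col) / len(members) for col in zip(*members)]
    return assign

names = list(courses)
assign = kmeans([courses[n] for n in names], k=2)
```

With coverage profiles this cleanly separated, the two "lecture-track" courses land in one cluster and the two "research-focus" courses in the other; real survey data would be noisier and would motivate the larger k the study reports.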
The Lewis acid−base model is key to identifying and explaining the formation and breaking of bonds in a large number of reaction mechanisms taught in the sophomore-level year-long organic chemistry course. Understanding the model is, thus, essential to success in organic chemistry coursework. Concept inventories exist to identify misunderstandings and misconceptions of acid−base theories; open-ended problems, though, have been shown to provide a more nuanced and holistic understanding of how students use acid−base models to explain reactions. The time necessary to score such problems, however, limits their use, especially in large-enrollment courses. Given the efficacy of open-ended problems, there is occasion for the development of methods to efficiently and effectively analyze open-ended assessment responses. In this study, we establish the importance of assessing "use of the Lewis acid−base model to explain a chemical reaction" by determining the association of model use with summative examination performance. In addition, we generate and evaluate a binomial logistic regression model based on lexical analysis techniques for predicting Lewis acid−base model use in explanations of an acid−base proton-transfer reaction. Our work results in a predictive model that can be used to score the open-ended problem used in our study.
Standardized examinations, such as those developed and disseminated by the ACS Examinations Institute, are artifacts of the teaching of a course and over time may provide a historical perspective on how curricula have changed and evolved. This study investigated changes in organic chemistry curricula across a 60-year period by evaluating 18 ACS Organic Chemistry Exams through the lenses of problem-type, visualization use, content covered, and percentile rankings. For all lenses, the early 1970s emerged as a focal point for change and stabilization of the organic chemistry curricula.
A national survey of inorganic chemists explored the self-reported topics covered in in-depth inorganic chemistry courses at the postsecondary level; an in-depth course is defined by the American Chemical Society's Committee on Professional Training as a course that integrates and covers topics that were introduced in introductory and foundation courses in a more thorough manner. Anecdotal evidence suggested that more than one type of in-depth course was offered in the undergraduate chemistry curriculum. Cluster analysis confirmed this evidence and revealed three distinct types of in-depth inorganic chemistry courses with unique topical profiles. These results confirm diversity in the inorganic chemistry curriculum and the need for awareness that our students leave degree programs with varying understanding of inorganic chemistry based on the coursework offered at their respective institutions.