The performance of over 5000 students in introductory calculus-based mechanics courses at the Georgia Institute of Technology was assessed using the Force Concept Inventory (FCI). Results from two different curricula were compared: a traditional mechanics curriculum and the Matter & Interactions (M&I) curriculum. Post-instruction FCI averages were significantly higher for the traditional curriculum than for the M&I curriculum; the differences between curricula persist after accounting for factors such as pre-instruction FCI scores, grade point averages, and SAT scores. FCI performance on categories of items organized by concepts was also compared; traditional averages were significantly higher in each concept. We examined differences in student preparation between the curricula and found that the relative fraction of homework and lecture topics devoted to FCI force and motion concepts correlated with the observed performance differences. Limitations of concept inventories as instruments for evaluating curricular reforms are discussed.

Each year more than 35% of American college and university students enroll in a physics course.[1] Only a small fraction of these students ultimately complete a degree in physics; the vast majority pursue a degree in engineering or another science.[2] Many are students in an introductory physics course; approximately 175,000 students each year enroll in introductory calculus-based physics.[3] However, many of these students fail to acquire an effective understanding of concepts, principles, and methods from these introductory courses. Rates of failure and withdrawal from these courses are often high, and substantial research has shown that students' misconceptions in physics persist after instruction.[4,5] This paper describes an attempt to evaluate, using a multiple-choice concept inventory,[6] a reformed introductory mechanics curriculum[7] that aims to mitigate these issues by altering the goals and content (i.e., the curriculum) of the typical mechanics course.

To help improve student learning in physics, many new methods of content delivery (pedagogy) have been developed in recent years. Typically, these methods have been implemented with little change to course curricula. Well-established pedagogical modifications now in wide use include tutorials,[8] clicker questions,[9] peer instruction,[10] Socratic tutorial homework systems,[11] multiple representations of concepts and principles,[12] and reconfigurations of the instructional environment.[13] There is ample evidence that students who experience these pedagogical reforms perform better on end-of-course concept inventories than students in passive lecture courses. Concept inventories are useful tools for such comparisons when all courses (with and without pedagogical reform) share, for the most part, the same core content and goals.

By contrast, there is sparse research on how student learning is affected by substantial alterations to the goals and content (curriculum) of introductory physics cours...
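The abstract above describes comparing post-instruction FCI averages "after accounting for factors such as pre-instruction FCI scores." A common way to make such an adjusted comparison is ordinary least squares with a curriculum indicator and the covariates as regressors. The sketch below is illustrative only, not the study's actual analysis; all data are synthetic, and the variable names and effect sizes are invented for the example.

```python
# Hedged sketch (not the authors' code): estimating a curriculum effect on
# post-test scores while controlling for pre-test scores via OLS.
# All data below are synthetic.
import numpy as np

rng = np.random.default_rng(0)
n = 200
pre = rng.uniform(5, 25, n)            # hypothetical pre-instruction FCI scores
curriculum = rng.integers(0, 2, n)     # 0 = traditional, 1 = M&I (indicator)
# Simulated post-test: depends on pre-test plus a curriculum effect of -2.0.
post = 5 + 0.8 * pre - 2.0 * curriculum + rng.normal(0, 2, n)

# Design matrix: intercept, pre-test covariate, curriculum indicator.
X = np.column_stack([np.ones(n), pre, curriculum])
beta, *_ = np.linalg.lstsq(X, post, rcond=None)
print(f"adjusted curriculum effect: {beta[2]:.2f}")
```

The coefficient on the indicator (`beta[2]`) is the curriculum difference adjusted for the covariate, which is the kind of quantity the abstract refers to when it says the differences "persist after accounting for" pre-instruction scores.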
The advent of new educational technologies has stimulated interest in using online videos to deliver content in university courses. We examined student engagement with 78 online videos that we created and incorporated into a one-semester flipped introductory mechanics course at the Georgia Institute of Technology. We found that students were more engaged with videos that supported laboratory activities than with videos that presented lecture content. In particular, the percentage of students accessing laboratory videos remained above 80% throughout the semester, whereas the percentage accessing lecture videos dropped below 40% by the end of the term. Moreover, the fraction of students watching a video in its entirety decreased as videos became longer, and this trend was more pronounced for the lecture videos than for the laboratory videos. The results suggest that students may access videos based on perceived value: students appear to consider the laboratory videos essential for successfully completing the laboratories, while they appear to treat the lecture videos as something closer to supplemental material. We found little correlation between students' engagement with the videos and either their incoming background or their performance in the course. An examination of the in-video content suggests that students engaged more with concrete information explicitly required for assignment completion (e.g., actions needed to complete laboratory work, or formulas and mathematical expressions needed to solve particular problems) and less with content that is more conceptual in nature. Students' in-video accesses also usually increased toward the embedded interaction points.
However, students did not necessarily access the follow-up discussion of these interaction points. The results suggest ways in which instructors may revise courses to better support student learning. For example, an external intervention that helps students see the value of accessing the videos may be required for this resource to be used more effectively. In addition, students may benefit more from a clicker question that reiterates important concepts within the question itself than from one that leaves those concepts to be addressed only in the discussion afterward.
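The engagement measures described above (what fraction of a video's runtime students actually watch, compared across lecture and laboratory videos) can be computed from per-view access logs. The sketch below is a minimal illustration of that bookkeeping; the function names and all numbers are hypothetical, not data from the study.

```python
# Illustrative sketch (not the authors' analysis code): computing the mean
# fraction of a video's runtime that students watched, split by video type.
# All access-log numbers below are hypothetical.

def completion_fraction(watched_seconds, video_length):
    """Fraction of a video's runtime that a single view covered."""
    return min(watched_seconds, video_length) / video_length

# Hypothetical per-view logs: (video_length_seconds, seconds_watched).
lecture_views = [(600, 180), (300, 240), (900, 200), (450, 300)]
lab_views = [(600, 540), (300, 290), (900, 700), (450, 430)]

def mean_completion(views):
    fracs = [completion_fraction(watched, length) for length, watched in views]
    return sum(fracs) / len(fracs)

print(f"lecture mean completion: {mean_completion(lecture_views):.2f}")
print(f"lab mean completion:     {mean_completion(lab_views):.2f}")
```

With real logs, plotting `completion_fraction` against video length per video type would exhibit the length-dependent drop-off the abstract describes.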
Abstract. As part of a larger research project into massive open online courses (MOOCs), we have investigated student background, as well as student participation, in a physics MOOC with a laboratory component. Students completed a demographic survey and the Force and Motion Conceptual Evaluation at the beginning of the course. While the course was still actively running, we tracked student participation over the first five weeks of the eleven-week course.
Abstract. The Georgia Tech blended introductory calculus-based mechanics course emphasizes scientific communication as one of its learning goals, and to that end, we gave our students a series of four peer-evaluation assignments intended to develop their abilities to present and evaluate scientific arguments. Within these assignments, we also assessed students' evaluation abilities by comparing their evaluations to a set of expert evaluations. We summarize our development efforts and describe the changes we observed in student evaluation behavior.