Background: Providing formative feedback on student responses to complex, open-ended problems is challenging because solutions vary in content and quality. Instructors must interpret student work and give feedback with high potential to guide students in closing the gap between actual and reference-level performance. To understand instructor feedback and its impact, a framework for analyzing instructor feedback is necessary.

Purpose (Hypothesis): The purposes of this study were to develop a framework for analyzing instructor feedback on team responses to open-ended mathematical modeling problems and to demonstrate the use of the framework in investigating formative assessment systems.

Design/Method: To develop the framework, formative feedback from graduate teaching assistants on first-year engineering student team responses to the Just-In-Time Manufacturing Model-Eliciting Activity (MEA) was analyzed. The framework, in conjunction with a four-dimensional MEA Rubric, was then used to identify patterns in feedback.

Results: Feedback has both form and substance. Four forms of feedback were identified: comments, which were not intended to prompt change; and questions, open suggestions, and direct suggestions, which were intended to prompt change. The substance of the feedback was tied to the criteria used to evaluate MEA responses. Applying the framework yielded two claims: (1) the rubric criteria influence the form and intent of feedback, and (2) the perceived quality of student work also influences the form and intent of feedback.

Conclusion: The framework that emerged from this work is a useful tool for investigating instructor feedback in this context. Further, the framework is believed to be extendable to other learning contexts.
Background: When engineers and students engage in mathematical modeling, the models they create are representational systems of real-world problem situations. These representational systems reveal modelers' interpretations of the relative importance of various real-world aspects.

Purpose: This paper illustrates how interpretations of student teams' solution models (i.e., their representational systems) to a particular problem can be used to inform educational decisions.

Design/Method: First-year engineering teams' iterative solution models to the Travel Mode Choice Model-Eliciting Activity (MEA) were interpreted to identify patterns within and across teams' responses. This MEA requires the creation of a model for a client who wants to predict the mode of transportation an individual will likely take to a college campus. Teams were provided with the client's needs and sample data concerning individuals' actual travel mode and the travel time, cost, and convenience of various travel modes (walk, bus, or drive).

Results: Student teams were required to submit their intermediate and final model descriptions in narrative form. Systematic study of their models resulted in the identification of four different types of models. Patterns within and across model types were then used to suggest changes to the instructional system.

Conclusion: When instructors study student models as representations of their interpretation of a real-world problem situation, the insights gained can lead to practical proposals for improving instruction with authentic open-ended problems, including revisions to tasks, implementation strategies, formative assessment strategies, and course content.