In this paper we address the following question: How can instructors leverage assessment instruments in design, build, and test courses to simultaneously improve student outcomes and assess student learning well enough to improve courses for future students? A learning statement is a structured, text-based construct through which students record what they learned by reflecting on authentic, immersive experiences in a semester-long engineering design course. The immersive experiences include lectures, assignments, reviews, building, testing, and a post-analysis of an electro-mechanical device designed to address a given customer need. Over the past three years, in the School of Aerospace and Mechanical Engineering at the University of Oklahoma, Norman, we have collected almost 30,000 learning statements from almost 400 students. We have analyzed these data to improve our understanding of what students learned by reflecting on doing and, hence, how we might improve the delivery of the course. In an earlier paper, we described a text mining framework to facilitate the analysis of a vast number of learning statements. Our focus in that paper was on describing the functionalities of the framework (i.e., data cleaning, data management, text analysis, and visualization of results) and demonstrating one of its text quantification methods, term frequency, using the learning statements. In this paper, we focus on demonstrating another text quantification method, namely text similarity, to help instructors gain new insights from students' learning statements. In the text similarity method, we measure the cosine distance between two text vectors; this measure is typically used to compare the semantic similarity of documents.
In this paper, we compare the similarity between what students learned (embodied in learning statements) and what instructors expected the students to learn (embodied in the course booklet), thus providing evidence-based guidance to instructors on how to improve the delivery of AME4163: Principles of Engineering Design.
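The cosine-similarity measure described above can be sketched in a few lines. The term-frequency vectorization and the sample statements below are illustrative assumptions for this sketch, not the authors' actual pipeline.

```python
import math
from collections import Counter

def cosine_similarity(text_a: str, text_b: str) -> float:
    """Cosine similarity between two texts using simple term-frequency vectors."""
    vec_a = Counter(text_a.lower().split())
    vec_b = Counter(text_b.lower().split())
    # Dot product over the shared vocabulary
    dot = sum(vec_a[t] * vec_b[t] for t in vec_a.keys() & vec_b.keys())
    norm_a = math.sqrt(sum(c * c for c in vec_a.values()))
    norm_b = math.sqrt(sum(c * c for c in vec_b.values()))
    if norm_a == 0 or norm_b == 0:
        return 0.0
    return dot / (norm_a * norm_b)

# Hypothetical learning statement compared against a hypothetical expectation
student = "I learned that early prototyping reveals design flaws"
expected = "students learn the value of early prototyping in design"
print(round(cosine_similarity(student, expected), 3))  # shared terms raise the score
```

In practice, TF-IDF weighting or embedding-based vectors would replace the raw term counts, but the comparison step stays the same: a score near 1 indicates that a student's statement closely matches an instructor's expectation.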
Can we provide evidence-based guidance to instructors to improve the delivery of the course based on students' reflection on doing? Over three years at the University of Oklahoma, Norman, US, we have collected about 18,000 Take-aways from almost 400 students who participated in an undergraduate design, build, and test course. In this paper, we illustrate the efficacy of using the Latent Dirichlet Allocation (LDA) algorithm to respond to the question posed above. We describe a method to analyze the Take-aways using an LDA algorithm to extract topics from the Take-away data, and then relate the extracted topics to instructors' expectations using text similarity. By connecting and comparing what students learned (embodied in Take-aways) and what instructors expected the students to learn (embodied in stated Principles of Engineering Design), we provide evidence-based guidance to instructors on how to improve the delivery of AME4163: Principles of Engineering Design. Our objective in this paper is to introduce a method for quantifying text data that helps an instructor modify the content and delivery of the next version of the course. The proposed method can be extended to other courses patterned after AME4163 to generate similar datasets covering student learning and instructor expectations.
How can instructors leverage assessment instruments in design, build, and test courses to simultaneously improve student outcomes and assess student learning well enough to improve courses? A Take-away is one type of assessment instrument. It is unstructured text written by a student in AME4163: Principles of Engineering Design at the University of Oklahoma, Norman, US, to record what they understand by reflecting on authentic, immersive experiences throughout the semester. The immersive experiences include lectures, assignments, reviews, building, testing, and a post-analysis for the design of an electro-mechanical system to address a given customer need. In the context of a Take-away, a student then writes a Learning Statement. The Learning Statement is a single sentence written as a triple, i.e., Experience|Learning|Value. Over the past three years at the University of Oklahoma (OU), we collected about 18,000 Take-aways and 18,000 Learning Statements from almost 400 students. In our earlier papers, we primarily concentrated on analyzing students' Learning Statements using a text mining framework. In this paper, we focus on analyzing students' Take-away data using a Latent Dirichlet Allocation (LDA) algorithm, and then relate the Take-away data to the instructors' expectations using text similarity. By connecting and comparing what students learned (embodied in Take-aways) and what instructors expected the students to learn (embodied in the course booklet), we provide evidence-based guidance to instructors on improving the delivery of AME4163: Principles of Engineering Design. The proposed method can be generalized for the assessment of ABET Student Outcomes 2 and 7.
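The Experience|Learning|Value triple lends itself to simple structured parsing before any text mining is applied. In this sketch, the field names and the example sentence are hypothetical, not drawn from the actual dataset.

```python
from typing import NamedTuple

class LearningStatement(NamedTuple):
    experience: str
    learning: str
    value: str

def parse_learning_statement(raw: str) -> LearningStatement:
    """Split a single-sentence triple of the form Experience|Learning|Value."""
    parts = [p.strip() for p in raw.split("|")]
    if len(parts) != 3:
        raise ValueError(f"expected 3 '|'-separated fields, got {len(parts)}")
    return LearningStatement(*parts)

# Hypothetical Learning Statement, illustrating the triple structure
ls = parse_learning_statement(
    "While load-testing our hoist | I learned that joints fail before members | "
    "so I will reinforce connections in future designs"
)
print(ls.learning)
```

Separating the three fields this way lets each component (experience, learning, value) be vectorized and compared against instructor expectations independently.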
Farrokh co-directs the Systems Realization Laboratory @ OU with his wife, Professor Janet K. Allen, in Industrial and Systems Engineering. The Allen-Mistree research focus is on collaboratively defining the emerging frontier for the "intelligent" decision-based realization of complex (cyber-physical-social) systems when the computational models are incomplete and inaccurate. Their quest for answers to the key challenges is anchored in six research thrusts, namely:
- Contextual Assessment of Student Learning through Reflection on Doing
- Exploiting the Food-Energy-Water Nexus for Rural Development
- Integrated Realization of Engineered Materials, Products, and Associated Manufacturing Processes
- Knowledge-Based Dynamic Management of Multi-stage Complex Processes
- Knowledge-Based Management of Computational Complexity and Risk
- Knowledge-Based Platform for Decision Support in the Design of Engineered Systems
His current education focus is on creating and implementing, in partnership with industry, a curriculum for educating strategic engineers: those who have developed the competencies to create value through the realization of complex engineered systems.