Purpose: The events surrounding the COVID-19 crisis had a profound effect on higher education, forcing students and instructors to face a sudden transition to wholly online learning contexts. This paper examines how the design of a residential course was adapted to an online context and how this adaptation may benefit future iterations of the course.
Design/methodology/approach: The analysis centers on a master's-level course in which students design software to support learning. One of the major changes to the course was the transition from a traditional rubric-based grading scheme to a specifications grading system, in which students must meet a series of binary (pass/fail) requirements (specifications) in order to pass. Various forms of interaction were also altered during the transition; the authors investigate these in the paper.
Findings: The study found that the move to specifications grading helped students and the instructor focus on the important work of meeting course learning goals. The approach also aligned well with authentic scenarios in which software projects are tested against specifications. Finally, the study concludes that specifications grading can inform more resilient pedagogical design approaches that respond to various forms of disruption and change.
Originality/value: The course design insights described in this paper illustrate alternative modes of instruction that can be especially useful during times of emergency, but which may also provide an added level of authenticity and learner motivation during times of stability.
We compared discussion posts from a data science ethics MOOC that was hosted on two platforms. We characterized one platform as "open" because learners can respond to discussion prompts while viewing and responding to others. We characterized the other platform as "locked" because learners must respond to a discussion prompt before they can view and respond to others. Our objective was to determine whether these platform differences are consequential and have the potential to impact learning. We analyzed direct responses to two discussion prompts drawn from modules two and six of an eight-module course. We used conventional content analysis to derive codes directly from the data. Posts on the "open" platform were characterized by a failure to completely address the prompt and showed evidence of persuasion tactics and reflective activity. Posts on the "locked" platform were characterized by an apparent intent to complete the task and an assertive tone; they also showed a diversity of ideas across the corpus of responses. Our findings show that MOOC platform interfaces can lead to qualitative differences in discussion posts in ways that have the potential to impact learning. Our study provides insight into how "open" and "locked" platform designs can shape the ways learners respond to discussion prompts in MOOCs, and offers guidance for instructors making decisions about MOOC platform choice and the activities situated within a learning experience.
The rapid, global spread of COVID-19 has led to an unprecedented rise in enrollments in online learning experiences among learners of all ages. In this article, we explore the impact of the global pandemic on a massive open online course, Problem Solving Using Computational Thinking, with a particular focus on the topics learners chose for their final projects. The Computational Thinking MOOC was designed using a project-based learning approach and aims to provide learners with an introduction to the "big ideas" of computational thinking using a range of case studies that encompass topics such as airport surveillance, epidemiology, and human trafficking. Beyond observing a sharp increase in enrollment and engagement at the time the pandemic began, we discuss ways in which the course's project-based pedagogy allowed learners to bring their present