With thousands of learners watching the same online lecture videos, analyzing video-watching patterns provides a unique opportunity to understand how students learn from videos. This paper reports a large-scale analysis of in-video dropout and of peaks in viewership and student activity, using second-by-second interaction data from 862 videos in four Massive Open Online Courses (MOOCs) on edX. We find higher dropout rates in longer videos, in re-watching sessions (vs. first-time sessions), and in tutorials (vs. lectures). Peaks in re-watching sessions and play events indicate points of interest and confusion. Results show that tutorials (vs. lectures) and re-watching sessions (vs. first-time sessions) lead to more frequent and sharper peaks. To understand why peaks occur, we sampled 80 videos and observed that 61% of peaks accompany visual transitions in the video, e.g., from a slide view to a classroom view. Based on this observation, we identify five student activity patterns that can explain peaks: starting a new piece of material from the beginning, returning to missed content, following a tutorial step, replaying a brief segment, and repeating a non-visual explanation. Our analysis has design implications for video authoring, editing, and interface design, and provides a richer understanding of video learning on MOOCs.
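The abstract above describes finding "interaction peaks" in second-by-second event counts. The paper does not give its algorithm here, but the idea can be illustrated with a minimal sketch: smooth the per-second counts with a moving average, then report seconds that are local maxima well above the overall mean. The window size and threshold below are illustrative assumptions, not the authors' parameters.

```python
def find_peaks(counts, window=2, min_height=1.5):
    """Return indices (seconds) where the smoothed event count is a local
    maximum and exceeds min_height times the mean smoothed count."""
    n = len(counts)
    # simple moving-average smoothing over [i - window, i + window]
    smoothed = [
        sum(counts[max(0, i - window):min(n, i + window + 1)])
        / (min(n, i + window + 1) - max(0, i - window))
        for i in range(n)
    ]
    mean = sum(smoothed) / n
    return [
        i for i in range(1, n - 1)
        if smoothed[i] > smoothed[i - 1]      # rising into the point
        and smoothed[i] >= smoothed[i + 1]    # not rising past it
        and smoothed[i] > min_height * mean   # well above typical activity
    ]

# toy data: play events per second of video, with a burst around t = 10
events = [1, 1, 2, 1, 1, 1, 2, 3, 6, 9, 12, 9, 5, 2, 1, 1, 1, 1, 1, 1]
print(find_peaks(events))  # → [10]
```

A real pipeline would also need to merge nearby maxima and handle the elevated viewership at the start of every video, which is why peak "sharpness" matters in the analysis above.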
Who Does What in a Massive Open Online Course?

Massive open online courses (MOOCs) collect valuable data on student learning behavior: essentially complete records of all student interactions in a self-contained learning environment, with the benefit of large sample sizes. Here, we offer an overview of how the 108,000 participants behaved in 6.002x (Circuits and Electronics), the first course from MITx (now edX), in the Spring 2012 semester. We divided participants into tranches based on the extent of their assessment activities, ranging from browsers (constituting ~76% of the participants but only 8% of the total time spent in the course) to certificate earners (7% of participants, who accounted for 60% of total time). We examined how the certificate earners allocated their time among the various course components and what fraction of each they accessed. We also analyze transitions between course components, showing how student behavior differs when solving homework vs. exam problems. This work lays the foundation for future studies of how various course components, and transitions among them, influence learning in MOOCs.

Though free online courses are not new [8], they have reached an unprecedented scale since late 2011. Three organizations (Coursera, edX, and Udacity) have released MOOCs [13] drawing more than 100,000 registrants per course. Numbers from these three initiatives have since grown to more than 100 courses and three million total registrants, resulting in 2012 being dubbed "The Year of the MOOC" by the New York Times [16]. Though there has been much speculation regarding how these initiatives may reshape higher education [6,12,20], little analysis has been published to date describing student behavior or learning in them. Our main objective here is to show how the huge amount of data available in MOOCs offers a unique research opportunity: a means to study detailed student behavior in a self-contained learning environment throughout an entire course.

Data collected in MOOCs provides insight into student behavior, from weekly e-textbook reading habits to context-dependent use of learning resources when solving problems. In 6.002x, 76% of participants were browsers who collectively accounted for only 8% of time spent in the course, whereas the 7% of certificate-earning participants averaged 100 hours each and collectively accounted for 60% of total time. Students spent the most time per week interacting with lecture videos and homework, followed by discussion forums and online laboratories; however, interactions with the videos and lecture questions were distinctly bimodal, with half the certificate earners accessing less than half of these resources.
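The tranche analysis described above amounts to grouping participants by their assessment activity and computing each group's share of total course time. A minimal sketch of that aggregation follows; the tranche names, thresholds, and field names are illustrative assumptions, not the study's actual pipeline.

```python
from collections import defaultdict

def tranche(user):
    """Assign a participant to a tranche based on assessment activity
    (illustrative rule, not the study's exact definition)."""
    if user["certified"]:
        return "certificate earner"
    if user["problems_attempted"] == 0:
        return "browser"
    return "partial participant"

def time_share(users):
    """Fraction of total course time accounted for by each tranche."""
    totals = defaultdict(float)
    for u in users:
        totals[tranche(u)] += u["hours"]
    grand_total = sum(totals.values())
    return {t: hours / grand_total for t, hours in totals.items()}

# toy records: many low-engagement browsers, one heavily engaged earner
users = [
    {"certified": False, "problems_attempted": 0, "hours": 0.5},
    {"certified": False, "problems_attempted": 0, "hours": 0.3},
    {"certified": False, "problems_attempted": 12, "hours": 10.0},
    {"certified": True, "problems_attempted": 80, "hours": 100.0},
]
print(time_share(users))
```

Even this toy data reproduces the qualitative pattern reported above: the certificate earner is a small fraction of participants but dominates total time spent.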
We present results from a pilot study in which students successfully created complex assessments for a MOOC in introductory electronics, an area with a very large expert-novice gap. Previous work in learnersourcing found that learners can contribute productively through simple tasks. However, many course resources require a high level of expertise to create, and prior work fell short on tasks with a large expert-novice gap, such as textbook creation or concept tagging. Since these tasks constitute a substantial portion of course-creation costs, addressing this issue is a prerequisite to substantially shifting MOOC economics through learnersourcing. This work represents one of the first successes in learnersourcing across a large expert-novice gap. In the pilot, we reached out to 206 students (out of thousands who met the eligibility criteria), who contributed 14 complex, high-quality design problems. This result suggests that a full cohort could contribute hundreds of problems. We achieved this through a four-pronged approach: (1) pre-selecting top learners, (2) a community feedback process, (3) a student mini-course in pedagogy, and (4) instructor review and involvement.