Small automatically assessed programming assignments are an often-used resource for learning programming. Creating sufficiently large numbers of such assignments is, however, time-consuming. As a consequence, offering large quantities of practice assignments to students is not always possible. CrowdSorcerer is an embeddable open-source system that students and teachers alike can use for creating and evaluating small automatically assessed programming assignments. While creating programming assignments, the students also write simple input-output tests, and are gently introduced to the basics of testing. Students can also evaluate the assignments of others and provide feedback on them, which exposes them to code written by others early in their education. In this article we describe both the CrowdSorcerer system and our experiences in using it in a large undergraduate programming course. Moreover, we discuss the motivation for crowdsourcing course assignments and present some usage statistics. CCS CONCEPTS • Information systems → Crowdsourcing; • Human-centered computing → Collaborative content creation; • Social and professional topics → Computing education;
We have used a tool called CrowdSorcerer that allows students to create programming assignments. The students are given a topic by a teacher, after which the students design a programming assignment: the assignment description, the code template, a model solution, and a set of input-output tests. The created assignments are peer reviewed by other students on the course. We study students' peer reviews of these student-generated assignments, focusing on examining the differences between novice and experienced programmers. We then analyze whether the exercises created by experienced programmers are rated better quality-wise than those created by novices. Additionally, we investigate the differences between novices and experienced programmers as peer reviewers: can novices review assignments as well as experienced programmers? CCS CONCEPTS • Information systems → Crowdsourcing; • Social and professional topics → Computing education;
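The abstract above lists the four artefacts that make up one crowdsourced assignment: a description, a code template, a model solution, and input-output tests. The following is a minimal sketch of how such an assignment could be represented and auto-graded; the class and field names are illustrative assumptions, not CrowdSorcerer's actual schema.

```python
from dataclasses import dataclass, field


@dataclass
class IOTest:
    """A simple input-output test: the program, fed `stdin_text`,
    should print exactly `expected_output`."""
    stdin_text: str
    expected_output: str


@dataclass
class Assignment:
    """One crowdsourced assignment (field names are hypothetical)."""
    description: str
    code_template: str
    model_solution: str
    tests: list = field(default_factory=list)


def passes(program_output: str, test: IOTest) -> bool:
    # An automatic assessor can compare trimmed outputs; real systems
    # typically normalize whitespace more carefully than this.
    return program_output.strip() == test.expected_output.strip()


assignment = Assignment(
    description="Print the sum of two integers read from input.",
    code_template="a = int(input())\nb = int(input())\n# your code here",
    model_solution="a = int(input())\nb = int(input())\nprint(a + b)",
    tests=[IOTest(stdin_text="2\n3\n", expected_output="5")],
)
```

Peer reviewers would then see all four parts and rate, for example, the clarity of the description and the coverage of the tests.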
Crowdsourcing refers to the act of using the crowd to create content or to collect feedback on particular tasks or ideas. Within computer science education, crowdsourcing has been used, for example, to create rehearsal questions and programming assignments. As a part of their computer science education, students often learn about relational databases and how to work with them using SQL statements. In this article, we describe a system for practicing SQL statements. The system uses teacher-provided topics and assignments, augmented with crowdsourced assignments and reviews. We study how students use the system, what sort of feedback students provide on the teacher-generated and crowdsourced assignments, and how practice affects the feedback. Our results suggest that students rate assignments highly, and there are only minor differences between assignments generated by students and assignments generated by the instructor. CCS CONCEPTS • Information systems → Crowdsourcing; • Applied computing → Interactive learning environments; • Social and professional topics → Computing education.
Identifying people based on their typing has been studied successfully in multiple different contexts. Previous research has shown that identification is possible based on writing predetermined texts such as passwords, based on free text such as essays, as well as based on writing source code. In this work, we study identification based on typing patterns when the text format and writing environment change. We replicate two earlier studies which suggested that typing-profile identification works with programming data, and that it can be applied in programming exam circumstances with decent results. We then examine how the identification accuracy changes when the user profiles are built from programming data and the identification is conducted on data from writing free text. Our results show that the identification accuracy is indeed high within the context of programming data, but drops when identifying essay typists based on typing profiles built from their programming data.
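Typing-based identification of the kind described above commonly builds a profile from keystroke timing features. As a hedged illustration only (the studies' actual feature sets and classifiers are not specified here), the sketch below computes one classic feature, average digraph latencies, and a naive distance between two profiles; all function names are assumptions for this example.

```python
from collections import defaultdict


def digraph_latencies(keystrokes):
    """Average latency (ms) between consecutive key pairs.

    `keystrokes` is a list of (key, press_time_ms) tuples. The digraph
    latency profile is a common feature in typing-based identification.
    """
    sums, counts = defaultdict(float), defaultdict(int)
    for (k1, t1), (k2, t2) in zip(keystrokes, keystrokes[1:]):
        sums[(k1, k2)] += t2 - t1
        counts[(k1, k2)] += 1
    return {pair: sums[pair] / counts[pair] for pair in sums}


def profile_distance(p1, p2):
    """Mean absolute latency difference over digraphs present in both
    profiles; a smaller distance suggests the same typist."""
    shared = p1.keys() & p2.keys()
    if not shared:
        return float("inf")
    return sum(abs(p1[d] - p2[d]) for d in shared) / len(shared)
```

The abstract's finding then corresponds to profiles from programming data and free text sharing fewer, and differently timed, digraphs, which enlarges this kind of distance across domains.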
Crowdsourcing is a method of collecting services, ideas, materials, or other artefacts from a relatively large and open group of people. Crowdsourcing has been used in computer science education to alleviate teachers' workload in creating course content, and as a learning and revision method for students through its use in educational systems. Tools that utilize crowdsourcing can be a great way for students to further familiarize themselves with course concepts, all while creating new content for their peers and future course iterations. In my research, I focus on investigating the effects of computing education systems that use crowdsourcing on students' learning, and the types of quality assurance methods required to use the artefacts students produce with these tools.