Proceedings of the 17th ACM Conference on Computer Supported Cooperative Work & Social Computing 2014
DOI: 10.1145/2531602.2531718
Reviewing versus doing

Abstract: In modern crowdsourcing markets, requesters face the challenge of training and managing large transient workforces. Requesters can hire peer workers to review others' work, but the value may be marginal, especially if the reviewers lack requisite knowledge. Our research explores if and how workers learn and improve their performance in a task domain by serving as peer reviewers. Further, we investigate whether peer reviewing may be more effective in teams where the reviewers can reach consensus through discuss…

Cited by 46 publications (7 citation statements)
References 33 publications
“…While other research from non-historical domains shows that it is possible for the crowd to learn a few microtasks in a short amount of time (e.g., < 30 minutes in total) (Dow, Kulkarni, Klemmer, & Hartmann, 2012; Lee, Lo, Kim, & Paulos, 2016; Zhu, Dow, Kraut, & Kittur, 2014), we did not observe this in our study of reading comprehension techniques. The wide adoption of long-term apprenticeship in historical research may help explain why we have different results (Law et al., 2017).…”
Section: Opportunities For History Education (contrasting)
confidence: 78%
“…Some follow Schön's [63] reflection-in-action approach and argue that self-reflection during the writing alone can lead to better performance [41,81]. In contrast, others contend that aggregated crowd critique from peer workers also contributes to better performance [49,86]. The point here is not that we should adopt one or the other but to acknowledge that both could help provide better quality explanations, just as Dow et al [18] found that both self-assessment and external feedback led to better writing quality than when no feedback or assessment was enabled.…”
Section: 2.1 (mentioning)
confidence: 99%
“…Cho and Schunn [5] used their SWoRD system for providing feedback on writing assignments to show that students receiving feedback from a group of novices had greater improvement on their next draft than students receiving feedback from a single expert, perhaps because students consider peer feedback carefully rather than blindly adhering to expert suggestions. Yuan et al [25] found that when students used rubrics to give critiques, their feedback was perceived to be as useful as expert feedback.…”
Section: Student and Instructor Feedback (mentioning)
confidence: 99%
“…Future systems could explore other ways to model expert behavior and encourage relevant, critical, and actionable comments from students. For instance, many recent peer feedback systems explore the use of expert rubrics to scaffold the process [13,25]. Another idea would be to delegate to a student or a teaching assistant the job of transcribing the instructors' verbal comments, making them visible and available for further discussion in the digital system.…”
Section: Modeling Expert Behavior (mentioning)
confidence: 99%