Receiving feedback on preliminary work helps content creators gain insight and improve outcomes. However, a lack of commitment to gathering feedback and evaluation apprehension can delay feedback seeking. In this paper, we operationalize goal setting theory for planning feedback goals and test its effects on feedback seeking and revision. In an online experiment, participants (N=245) wrote an initial story after planning feedback goals (or not), submitted the story for feedback at a time of their choice, and revised the story based on the feedback received. Participants anticipated feedback from a supervisor or a peer to induce different levels of evaluation apprehension. We found that participants who planned proximal feedback goals sought feedback when their stories were less developed, and revised their stories more after receiving feedback, than participants who planned distal goals. Additionally, participants who anticipated feedback from a supervisor, regardless of goal planning, improved the quality of their stories the most. We did not find that goal setting or the feedback provider's role affected evaluation apprehension. Our findings indicate that content creators should be guided to plan proximal feedback goals, which encourage sharing early drafts of creative work, and to seek feedback from someone in a position of higher perceived power, which fosters the most revision and improvement on those drafts.
Design instructors are adopting online peer review platforms to keep pace with growing class sizes. However, these platforms typically rely on randomized peer assignment and show peers only the current solution when they write feedback, which can result in low-quality feedback in project-based design courses. We report on an experiment in which students (N=59) worked on twelve-week design projects and both wrote and received online feedback at four stages. The experiment tested a novel assignment strategy, peer mentorship, in which the same peers gave feedback on all stages of a project, and tested showing the context from the preceding design stage while composing feedback. The results showed that displaying the context from the preceding design stage led to feedback with higher perceived quality at late design stages (but not at earlier stages), and that feedback from mentors prompted longer responses from recipients. Our work contributes a deeper empirical understanding of how assignment strategies and additional context affect peer feedback, and provides practical guidelines for instructors implementing these methods in design courses.
Peer evaluations are a well-established tool for assessing individual and team performance in collaborative contexts, but they are susceptible to social and cognitive biases. Current peer evaluation tools have also yet to leverage the unique opportunities that online collaborative technologies provide for addressing these biases. In this work, we explore one such opportunity for peer evaluations: the data traces automatically generated by collaborative tools, which we refer to as "activity traces". We conducted a between-subjects experiment with 101 students and MTurk workers, investigating the effects of reviewing activity traces on peer evaluations of team members in an online collaborative task. Our findings show that reviewing activity traces led participants to make more, and larger, revisions to their evaluations than a control condition. These revisions also increased the consistency of the evaluations and the accuracy perceived by their recipients. Our findings demonstrate the value of activity traces as an approach for performing more reliable and objective peer evaluations of teamwork. Based on these findings and a qualitative analysis of free-form responses from our study, we identify and discuss key considerations and design recommendations for incorporating activity traces into real-world peer evaluation systems.
scite is a Brooklyn-based organization that helps researchers better discover and understand research articles through Smart Citations: citations that display the context in which an article is cited and indicate whether the citing article provides supporting or contrasting evidence. scite is used by students and researchers from around the world and is funded in part by the National Science Foundation and the National Institute on Drug Abuse of the National Institutes of Health.