2017
DOI: 10.1007/s10212-017-0345-x
Using feedback requests to actively involve assessees in peer assessment: effects on the assessor’s feedback content and assessee’s agreement with feedback

Abstract: Criticizing the common approach of supporting peer assessment through providing assessors with an explication of assessment criteria, recent insights on peer assessment call for support focussing on assessees, who often assume a passive role as receivers of feedback. Feedback requests, which require assessees to formulate their specific needs for feedback, have therefore been put forward as an alternative way of supporting peer assessment, even though little is known about their exact impact on feedback. Ope…

Cited by 10 publications (3 citation statements)
References 36 publications
“…By providing students with carefully constructed rubrics and aggregating the assessments of four peers, their reliability and validity are even similar to those of teacher assessments (Cho, Schunn, & Wilson, 2006). In a setting that was less unidirectional and allowed some communication between feedback provider and receiver, it was demonstrated that assessors included more informative elaborations in the peer assessment when they received specific feedback requests from their peers beforehand (Voet, Gielen, Boelens, & de Wever, 2017). What remains unclear, however, is whether the effect of the aforementioned initiatives on assessors' learning persists when the support is no longer offered.…”
Section: Introduction
confidence: 72%
“…This seems to be an unresearched area. To our knowledge, the criteria 'evaluation', 'explanation' and 'suggestion' (or similar criteria) have only been used to analyse written peer feedback on academic writing [54][55][56][57] or concept maps [58]. Therefore, we also applied the feedback training to a writing task.…”
Section: Discussion
confidence: 99%
“…Prior studies have provided support measures, such as evaluation rubrics, worked examples, and templates with assessment criteria (Alemdag & Yildirim, 2022; Alqassab et al, 2018b; Gielen & De Wever, 2015; Peters et al, 2018; Rotsaert et al, 2018; Voet et al, 2018). Adaptive support measures might highlight or annotate the structural or content aspects of initial solutions—essentially analogous to the support measures for task processing.…”
Section: NLP Adaptive Measures For Supporting the Peer‐feedback Process
confidence: 99%