In a relatively short period of time, massive open online courses (MOOCs) have become a considerable topic of research and debate, and the number of available MOOCs is growing rapidly. Along with issues of formal recognition and accreditation, this growth makes the quality of assessment increasingly relevant. Within the context of a typical xMOOC, the current study focuses on peer assessment of essay assignments. Two contradictory theoretical arguments can be found in the literature: that learners should be matched with same‐ability peers (homogeneously) versus that they should be matched with different‐ability peers (heterogeneously). Considering these arguments, the relationship between peer reviewers' ability and authors' essay performance is explored. Results indicate that peer reviewers' ability is positively related to authors' essay performance. Moreover, this relationship holds only for intermediate- and high-ability authors; the essay performance of lower-ability authors appeared unrelated to the ability of their reviewing peers. Results are discussed in relation to the matching of learners and the instructional design of peer assessment in MOOCs.