2021
DOI: 10.1109/tlt.2021.3058644
Evaluating the Quality of Learning Resources: A Learnersourcing Approach

Cited by 29 publications (16 citation statements)
References 50 publications
“…Prior studies assessing such qualitative measures, however, have involved significant manual effort by experts. In learnersourcing contexts, aggregating student ratings to assess the quality of questions is scalable and agrees well with expert ratings of quality (Abdi et al. 2021; Darvishi, Khosravi, and Sadiq 2021; McQueen et al. 2014). In our own work, we use averaged student ratings as ground-truth labels when training our AQQR models.…”
Section: Related Work
confidence: 74%
“…A more scalable solution is to have students review and evaluate the content themselves (Darvishi, Khosravi, and Sadiq 2021). Prior research has shown that students can make quality judgments similar to those of experts, especially when the assessments provided by multiple students are aggregated (Abdi et al. 2021). However, a sufficient number of students must view and evaluate each artefact before a valid assessment can be produced, which is inefficient.…”
Section: Introduction
confidence: 99%
“…As students are developing their expertise, it is likely that some of the learning resources created are ineffective, inappropriate or incorrect. Hence, RiPPLE utilises an evaluation process that again partners with students as moderators to judge the quality of their peers' work (Abdi et al., 2021). Figure 1 illustrates the evaluation interfaces used by the platform.…”
Section: Overview of RiPPLE Platform
confidence: 99%
“…Here we provide a brief overview of the study. A fuller account is available in Abdi, Khosravi, Sadiq, and Demartini (2021).…”
Section: Case Study 1: An Observational Study To Investigate Students' Evaluative Judgment
confidence: 99%