LAK21: 11th International Learning Analytics and Knowledge Conference 2021
DOI: 10.1145/3448139.3448146

Transforming Everyday Information into Practical Analytics with Crowdsourced Assessment Tasks

Abstract: Educators use a wide variety of data to inform their practices. Examples of these data include forms of information that are commonplace in schools, such as student work and paper-based artifacts. One limitation in these situations is that there are less efficient ways to process such everyday varieties of information into analytics that are more usable and practical for educators. To explore how to address this constraint, we describe two sets of design experiments that utilize crowdsourced tasks for scoring …

Cited by 5 publications (4 citation statements)
References 39 publications
“…These performance measures, and many more, had they been observed, would have been coded as measuring learning. A study that examined different methods for assessing middle school students' use of evidence and systems thinking was coded as measuring learning (Ahn et al., 2021). Two studies that examined children's dialogue during collaborative problem solving were both coded as measuring learning (Emara et al., 2021; Ma et al., 2022).…”
Section: Discussion (mentioning)
confidence: 99%
“…Crowdsourcing has proven useful to cheaply label large SA datasets used in applications outside of education (Heidari and Shamsinejad, 2020). In the educational context, crowdsourcing has been utilized for the design and use of crowdsourced learning analytics tasks (Ahn et al., 2021), as well as to interpret learners' reviews of MOOCs (Li et al., 2022), but neither to gather sentiment labels nor to serve as a hands-on learning activity for students themselves.…”
Section: Background and Related Work (mentioning)
confidence: 99%
“…• construct assessments collaboratively through e.g., crowdsourcing of assessment tasks (Ahn et al, 2021); and…”
Section: RQ2: What Are the Main AI Use Cases Relating to Assessments? (mentioning)
confidence: 99%
“…Research by Ahn et al. (2021) also showed that learners' work, which contains complex data, rich semantic meaning, and idiosyncratic and local nuances, may not be well graded by present computational approaches that utilize metrics such as counts of parts of speech and essay length as proxies for writing complexity and quality.…”
mentioning
confidence: 99%
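For context on the surface-level proxies the statement above criticizes, the following is a minimal Python sketch (an illustrative assumption, not code from Ahn et al. (2021) or the citing paper) of how part-of-speech counts and essay length are sometimes computed as crude stand-ins for writing complexity and quality, here using NLTK.

```python
# Illustrative sketch of surface-level writing proxies (POS counts, length).
# This is NOT the approach of Ahn et al. (2021) or the citing paper; it only
# shows the kind of shallow metrics the excerpt above argues are insufficient.
import nltk
from collections import Counter

# Resource names differ across NLTK versions; download both variants quietly.
for resource in ("punkt", "punkt_tab",
                 "averaged_perceptron_tagger", "averaged_perceptron_tagger_eng"):
    nltk.download(resource, quiet=True)


def surface_proxies(essay: str) -> dict:
    """Return crude proxy metrics: sentence count, token count, POS tag counts."""
    sentences = nltk.sent_tokenize(essay)
    tokens = nltk.word_tokenize(essay)
    pos_counts = Counter(tag for _, tag in nltk.pos_tag(tokens))
    return {
        "n_sentences": len(sentences),
        "n_tokens": len(tokens),
        "pos_counts": dict(pos_counts),
    }


if __name__ == "__main__":
    sample = "The water cycle moves water between the ocean, the air, and the land."
    print(surface_proxies(sample))
```

Such counts say nothing about evidence use, systems thinking, or local nuance in student work, which is the gap the crowdsourced assessment tasks in the cited paper aim to address.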