A Conceptualisation of Crowd Knowledge
2021 | DOI: 10.1007/s10796-021-10176-y

Abstract: Propelled by digitalisation, crowd knowledge (CK) has gained popularity alongside a plurality of related crowd-based concepts (crowdsourcing, wisdom of crowds and collective intelligence), resulting in an inconsistent understanding of the terms and their application. Based on a structured literature review, we conceptualise CK and develop a formal definition, which is then evaluated using knowledge artefacts on different crowd-related platforms and differentiation criteria in relation to participants, context,…

Cited by 3 publications (2 citation statements)
References 44 publications
“…Crowdsourcing in a school context has already been evaluated by Zualkernan [57,58] and Qutaifan et al [59], who investigated the ability of teachers in a developing country to create and contribute high-quality multiple-choice questions for primary school students, thereby increasing crowd knowledge, which according to Blesik et al [60] can be defined as "a collaborative aggregation of context-dependent information contributed and used by participants that is stored in an artefact and provided to fulfil a purpose". According to Ren [61] and Malhotra and Kubowicz [62], the number and quality of new ideas received from a crowd can be increased by providing a diverse set of examples of good and bad ideas, as well as a series of modifications of initial ideas.…”
Section: Discussion
confidence: 99%
“…Within a crowdsourcing platform, such as Amazon Mechanical Turk (MTurk), job requesters publish microtasks through an open call to an undefined number of workers (Blesik et al, 2021). These microtasks are called Human Intelligence Tasks (HITs), and are typically easier for humans to solve than for computers, but require the participation of the crowd due to their large volume.…”
Section: Crowdwork and Microtask Catching Scripts
confidence: 99%
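The open-call model described in that statement can be made concrete with a short sketch. The following uses the real boto3 MTurk client to publish a single HIT; the task text, reward, and assignment counts are illustrative assumptions, and the sandbox endpoint is used so no real payments occur.

```python
import boto3

# Connect to the MTurk requester sandbox (no real payments);
# assumes AWS credentials are configured in the environment.
mturk = boto3.client(
    "mturk",
    region_name="us-east-1",
    endpoint_url="https://mturk-requester-sandbox.us-east-1.amazonaws.com",
)

# A minimal QuestionForm with one free-text question (content is illustrative).
question_xml = """\
<QuestionForm xmlns="http://mechanicalturk.amazonaws.com/AWSMechanicalTurkDataSchemas/2005-10-01/QuestionForm.xsd">
  <Question>
    <QuestionIdentifier>sentiment</QuestionIdentifier>
    <QuestionContent><Text>Is the following review positive or negative?</Text></QuestionContent>
    <AnswerSpecification><FreeTextAnswer/></AnswerSpecification>
  </Question>
</QuestionForm>"""

# Publish the HIT as an open call: any eligible worker may accept it,
# and MaxAssignments controls how many workers answer the same task.
response = mturk.create_hit(
    Title="Classify the sentiment of a short review",
    Description="Read one review and state whether it is positive or negative.",
    Keywords="sentiment, classification, microtask",
    Reward="0.05",                    # in USD, passed as a string
    MaxAssignments=3,                 # three independent workers per task
    LifetimeInSeconds=24 * 60 * 60,   # HIT stays visible for one day
    AssignmentDurationInSeconds=600,  # 10 minutes per accepted assignment
    Question=question_xml,
)

print("Published HIT:", response["HIT"]["HITId"])
```

In practice the returned HITId would be stored so the requester can later retrieve and aggregate the workers' assignments, which is where the large-volume argument for crowd participation comes into play.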