2016
DOI: 10.1002/cjas.1395
Comparing crowdsourcing initiatives: Toward a typology development

Abstract: Although numerous studies have examined the crowdsourcing phenomenon, little consensus exists regarding the classification of distinct types of activities within crowdsourcing. In this study, we identify and classify 12 crowdsourcing initiatives that comprise the key categories of crowdsourcing: Crowdpedia, Fansourcing, Crowdnetworking, Crowdsharing, Crowdvoting, Crowdfunding, Ideation, Open Innovation, User Innovation, Scisourcing, Crowd‐Relief, and Open Source Software. Our main objective is to establish the…

Cited by 10 publications (4 citation statements)
References 45 publications (109 reference statements)
“…Ali-Hassan and Allam 60 Comparing crowdsourcing initiatives toward a typology development. The authors identify and classify 12 crowdsourcing initiatives, which comprise the main categories of crowdsourcing. Their main aim is to establish the similarities and differences between the basic crowdsourcing initiatives and then develop a typology based on various crowdsourcing dimensions, providing a roadmap on which researchers can anchor their research.…”
Section: Reference Methods Description
Mentioning; confidence: 99%
“…20 In crowdsourcing contests, one study also derives participation history factors that include participation recency and frequency, winning recency and frequency, and the competitors' tenure and last performance. 21 One author suggests a "personalized task recommendation" approach, which aims to match worker interests with the appropriate tasks and thus makes the

Study (ref) | Topic | Criterion scores | Total
— (10) | Task and participant matching | 0.5, 0.5, 0, 0.5 | 1.5
Nassar and Karray (3) | Overview of the crowdsourcing process | 0.5, 1, 0, 1 | 2.5
Javadi Khasraghi and Aghaie (21) | Crowdsourcing contests | 1, 1, 0.5, 1 | 3.5
Zheng et al (58) | Task design in crowdsourcing | 0.5, 0.5, 0.5, 0.5 | 2
Sales Fonteles et al (59) | Trajectory recommendation of tasks | 1, 0.5, 0.5, 0.5 | 2.5
Burnap et al (30) | Identifying experts in the crowd for evaluation of engineering designs | 0.5, 0.5, 0.5, 0.5 | 2
Ali-Hassan and Allam (60) | Comparing crowdsourcing initiatives toward a typology development | 0, 0, 0, 1 | 1
Wang et al (61) | Mobile crowdsourcing framework, challenges, and solutions | 0, 0, 0, 1 | 1
Ghezzi et al (62) | Crowdsourcing review | 0, 0.5, 0, 1 | 1.5
Xintong et al (63) | Brief survey of crowdsourcing for data mining | 0, 0, 0, 1 | 1
Pournajaf et al (38) | Crowd sensing task assignment | 0, 0, 0, 1 | 1
Tarasov et al (64) | Worker reliability in crowdsourcing | 0, 0, 0, 1 | 1
Baba et al (39) | Improper task detection in crowdsourcing | 0.5, 0.5, 0.5, 1 | 2.5
Geiger and Schader (22) | Task recommendation in crowdsourcing | 0.5, 0.5, 0, 1 | 2
Hosseini et al (16) | Crowdsourcing: A taxonomy and systematic mapping study | 0.5, 0.5, 0, 0.5 | 1.5
Ye and Kankanhalli (65) | Organizational task crowdsourcing | 0.5, 0, 0, 1 | 1.5
Ellero et al (66) | Real-time crowdsourcing | 0, 0, 0.5, 1 | 1.5
Mao et al (1) | Crowdsourcing survey in software engineering | 0.5, 0.5, 0.5, 1 | 2.5
Morschheuser et al (67) | Gamified crowdsourcing conceptualization, literature review, and future agenda | 0, 0, 0, 1 | 1
Safran and Che (29) | Real-time recommendation algorithms for crowdsourcing systems | 1, 1, 1, 1 | 4
Younas et al (68) | Optimal task assignment | 0.5, 0, 1, 0.5 | 2
Harman and Azzam (69) | Crowdsourcing criteria and standards | 0.5, 0, 0, 0.5 | 1
Moayedikia et al (31) | Task assignment in crowdsourcing platforms | 0.5, 0, 1, 0 | 1.5
Wu et al (70) | Task allocation in crowdsourcing | 0.5, 0.5, 1, 0 | 2
Sarı et al…”
Section: What Are the Task Assignment Frameworks/Models Available For Effective Crowdsourcing?
Mentioning; confidence: 99%
“…There is little consensus in the literature concerning categorising various crowdsourcing activities (Ali-Hassan & Allam, 2016). Howe (2008), for example, proposed four types of crowdsourcing: crowd wisdom or collective intelligence, crowd creation or user-generated content, crowd voting, and crowdfunding (Brabham, 2013).…”
Section: An Overview of the Literature
Mentioning; confidence: 99%
“…When it comes to intermediaries, it involves problems connected to information technology, digitalisation, and idea management. At the same time, from the crowd's point of view, it relates to the problem of motivating the public to participate (Ali-Hassan & Allam, 2016). Moreover, crowdsourcing is also a social, spatially embedded, and interactive process of learning that cannot be understood outside of its institutional and cultural context (Ye et al, 2012).…”
Section: An Overview of the Literature
Mentioning; confidence: 99%