2018
DOI: 10.1287/isre.2018.0775

Salience Bias in Crowdsourcing Contests

Abstract: Crowdsourcing relies on online platforms to connect a community of users to perform specific tasks. However, without appropriate controls, the behavior of the online community may not align with the platform's designed objective, leading to inferior platform performance. This paper investigates how feedback information on a crowdsourcing platform and the systematic biases of crowdsourcing workers can affect crowdsourcing outcomes. Specifically, using archival data from the online crowdsourcing platform…

Citation Types: 0 supporting, 35 mentioning, 0 contrasting

Year Published: 2018–2024

Cited by 60 publications (36 citation statements)
References 45 publications (44 reference statements)

“…For instance, the research conducted by Lee, Ba, Li, and Stallaert (2018) in this issue examines the performance of crowdsourcing contests on Kaggle. The authors show that despite the well-reported merits of using open contests for research and development tasks previously performed within internal business units (Boudreau and Lakhani 2013), the performance of these contests is very much dependent on in-progress feedback to the contestants.…”
Section: Online Labor Platforms (mentioning)
Confidence: 99%
“…To identify and examine projects that may exhibit attributes of both communities and crowds, we used an in-depth analysis of archival data obtained through the GitHub platform. Similar archival methods have been used in other open innovation studies of both communities (Roberts, Hann, & Slaughter, 2006) and crowds (Lee, Ba, Li, & Stallaert, 2018). More broadly, open source has been a popular setting for the study of open innovation (Dahlander & Magnusson, 2008; West & Lakhani, 2008).…”
Section: Methods (mentioning)
Confidence: 99%
“…Archival data also helps in understanding the changing needs and requirements posted on the portals across different geographies. However, studies have also reported that data validation is key for archival data to improve the robustness of the results (Lee et al., 2018). We then contacted website owners seeking permission to send a questionnaire to the participants in these tasks.…”
Section: Data Collection (mentioning)
Confidence: 99%