2019 ACM/IEEE 14th International Conference on Global Software Engineering (ICGSE)
DOI: 10.1109/icgse.2019.00041
An Empirical Study on Task Documentation in Software Crowdsourcing on TopCoder

Abstract: In the Software Crowdsourcing competitive model, crowd members seek tasks on a platform and submit solutions in pursuit of rewards. In this model, the task description is important for supporting both the choice and the execution of a task. Despite its importance, little is known about the role the task description plays in these processes. To fill this gap, this paper presents a study that explores the role of documentation on the TopCoder platform, focusing on task selection and execution. We conducted a two-…



Cited by 14 publications (7 citation statements)
References 11 publications
“…Crowd participation was found to be influenced by the clarity of the description of the associated tasks [35]. Tasks with unclear objective description, without specifying required technologies or environment setup instructions, discourage developers from selecting them.…”
Section: F. How To Crowdsource? (mentioning)
confidence: 99%
“…In particular, the context of the case study is crowdsourced software development, whereas as cases and units of analysis we consider crowdsourced competitions hosted by the TopCoder platform. TopCoder platform is a pioneer for practicing Crowdsourced Software Engineering (CSE), systematically used for empirical research the last years [2], [39], [35].…”
Section: B. Case Selection (mentioning)
confidence: 99%
“…These platforms support variety of tasks and provide different facilities to their users in terms of remuneration, social recognition, bonuses and e-gifts [9], [10]. Few platforms support the specific tasks (Quicktate and iDictate for call auditing and Topcoder for programming), while most of the platforms facilitate their clients with a variety of tasks related to designing, programming and development, testing and quality assurance, interpretation and analysis and content writing [11], [12].…”
Section: Introduction (mentioning)
confidence: 99%
“…This model is suitable when high quality and diversified results are required by the client. Different crowdsourcing platforms implement competitions model e.g., Topcoder, 99designs, testbirds and uTests to crowdsource the development, designing, usability and system testing related tasks respectively [9], [12], [34].…”
Section: Introduction (mentioning)
confidence: 99%
“…Finding tasks to contribute to in Open Source projects is challenging [1,2,3,4,5]. Open tasks vary in complexity and required skills, which can be difficult to determine solely by reading the task descriptions alone, especially for new contributors [6,7,8]. Adding labels to the issues (a.k.a tasks, bug reports) help new contributors when they are choosing their tasks [9].…”
Section: Introduction (mentioning)
confidence: 99%