2017
DOI: 10.3390/su9112019
Crowdsourcing Analysis of Twitter Data on Climate Change: Paid Workers vs. Volunteers

Abstract: Web-based crowdsourcing has become an important method of environmental data processing. Two alternatives are widely used today by researchers in various fields: paid data processing mediated by for-profit businesses such as Amazon's Mechanical Turk, and volunteer data processing conducted by amateur citizen-scientists. While the first option delivers results much faster, it is not quite clear how it compares with volunteer processing in terms of quality. This study compares volunteer and paid processing of so…

Cited by 11 publications (7 citation statements)
References 34 publications (43 reference statements)
“…Sections 5.4 and 5.5 exploit the experts' consensus using it as the ground truth for the analysis of the quality of data collected from the volunteer and paid worker communities. This analysis allows us to compare the performances of these two communities, which is increasingly being discussed within the citizen science research community (e.g., [31]). Furthermore, in Section 5.6, we show how our probabilistic model can also be leveraged for a predictive analysis to estimate the number of annotations required to reach the desired accuracy for each of the three communities we worked with.…”
Section: Evaluating Data Quality In Highly Uncertain Scenarios (mentioning)
confidence: 99%
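The predictive analysis mentioned above can be illustrated with a minimal sketch: assuming independent annotators with a fixed per-label accuracy p and simple majority-vote aggregation (a deliberately simplified stand-in for the citing paper's probabilistic model, which is not reproduced here), the number of annotations needed to reach a target accuracy can be found by scanning odd crowd sizes:

```python
from math import comb

def majority_vote_accuracy(p: float, n: int) -> float:
    """Probability that a majority of n independent annotators, each
    correct with probability p, yields the right label (binary task,
    odd n so there are no ties)."""
    k_min = n // 2 + 1  # smallest winning majority
    return sum(comb(n, k) * p**k * (1 - p) ** (n - k) for k in range(k_min, n + 1))

def annotations_needed(p: float, target: float, n_max: int = 99) -> int | None:
    """Smallest odd number of annotations whose majority vote reaches
    the target accuracy, or None if n_max is not enough."""
    for n in range(1, n_max + 1, 2):
        if majority_vote_accuracy(p, n) >= target:
            return n
    return None

# Example: annotators correct 80% of the time, target accuracy 95%.
print(annotations_needed(0.80, 0.95))  # -> 7
```

Under these assumptions, annotators who are individually 80% accurate need about seven labels per item to push majority-vote accuracy past 95%; less accurate crowds need substantially more redundancy.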
“…Availability of a committed crowd within the given time [33] and the vast volume of data [34] are often challenges and limiting factors for crowdsourcing initiatives. While federating existing communities [35] can ensure commitment, the volume of tasks could make it difficult for the community to sustain interest.…”
Section: Limitations Of the Proposed Approach (mentioning)
confidence: 99%
“…Paid crowdsourcing platforms like Amazon Mechanical Turk are designed to handle equally complex tasks and provide a timely, scalable workforce. Studies show that higher redundancy and limiting the focus to a specific geographic area can achieve similar output quality [33], [36] in much less time than a volunteer crowd. Future work will address the quality-control mechanisms used in designing the experiment on a paid crowdsourcing platform.…”
Section: Limitations Of the Proposed Approach (mentioning)
confidence: 99%
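In practice, the redundancy described above means collecting several labels per tweet and aggregating them. A minimal majority-vote sketch, with hypothetical tweet IDs and label names rather than data from the cited studies:

```python
from collections import Counter

# Hypothetical redundant labels: each tweet ID maps to the labels
# returned by five independent paid workers.
labels = {
    "tweet_1": ["relevant", "relevant", "irrelevant", "relevant", "relevant"],
    "tweet_2": ["irrelevant", "irrelevant", "relevant", "irrelevant", "irrelevant"],
}

def aggregate(worker_labels: list[str]) -> tuple[str, float]:
    """Majority label plus its agreement ratio, a simple confidence proxy."""
    (label, count), = Counter(worker_labels).most_common(1)
    return label, count / len(worker_labels)

for tweet_id, worker_labels in labels.items():
    label, agreement = aggregate(worker_labels)
    print(f"{tweet_id}: {label} (agreement {agreement:.0%})")
```

The agreement ratio gives a cheap per-item quality signal: items with low agreement can be routed to additional workers or to experts.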
“…First, it should be noted that despite its broad adoption, Twitter is not representative of the general population (Mellon & Prosser, 2017; but see Kirilenko et al., 2017 for evidence that it is more diverse than typical study populations such as undergraduate students or MTurk workers). Furthermore, even within the population of Twitter users, individuals who feel uncertain might feel less of an urge to communicate this on social media; in contrast, individuals who decide to cope with threats in an active way might use Twitter to communicate messages of certainty.…”
Section: Potential Limitations and Future Directions (mentioning)
confidence: 99%