2013
DOI: 10.1371/journal.pone.0069958
Comparing the Quality of Crowdsourced Data Contributed by Expert and Non-Experts

Abstract: There is currently a lack of in-situ environmental data for the calibration and validation of remotely sensed products and for the development and verification of models. Crowdsourcing is increasingly being seen as one potentially powerful way of increasing the supply of in-situ data, but there are a number of concerns over the subsequent use of the data, in particular over data quality. This paper examined crowdsourced data from the Geo-Wiki crowdsourcing tool for land cover validation to determine whether the…

Cited by 159 publications (145 citation statements)
References: 21 publications
“…The quality of citizen-derived data can be viewed in a variety of ways [57]. Many comparative studies have shown that crowdsourced geographic information can be as good as, if not better than, data from authoritative sources [58,59]. A comprehensive literature overview of the latest developments in crowdsourced geographic information research is presented in Reference [60], with a focus on trends related to OpenStreetMap, while many others have discussed the quality of this volunteer data source [61][62][63].…”
Section: Quality and Use of the Data for Research (mentioning)
Confidence: 99%
“…The majority of sites reviewed fell into the first two categories, which implies that very little analysis of the crowdsourced geographic information can be undertaken in relation to the background of individuals. Some exceptions include research on contributors to OpenStreetMap [73] and Geo-Wiki [58].…”
Section: Information About Participants (mentioning)
Confidence: 99%
“…This paper extends the research described in See et al. (2013) by considering the impact of distance between the volunteer's location and the location being analysed. It shows that distance has a minor effect on the reliability of the labelling performed by volunteers, and that expertise matters in general but not for all classes, such as Shrub Cover.…”
Section: Discussion (mentioning)
Confidence: 82%
“…When simple and clear instructions are provided to the crowds, See et al. (2013) found that the crowds can improve the quality of information faster than the experts. This indicates the importance of designing effective means to aggregate information from the crowds.…”
Section: Discussion (mentioning)
Confidence: 99%