2020
DOI: 10.1038/s41598-020-67658-3
Data-derived metrics describing the behaviour of field-based citizen scientists provide insights for project design and modelling bias

Abstract: Around the world, volunteers and non-professionals collect data as part of environmental citizen science projects, recording wildlife observations, measures of water quality and much more. However, where projects allow flexibility in how, where, and when data are collected, there will be variation in the behaviour of participants, which results in biases in the datasets collected. We develop a method to quantify this behavioural variation, describing the key drivers and providing a tool to account for biases in …

Cited by 47 publications (40 citation statements) | References 52 publications
“…Bias in citizen science projects can be introduced by allowing individuals flexibility in how, when and where to collect data 47 . We reduced the risk of bias prior to UAV operations by (1) assigning groups to fixed locations, (2) targeting low tide for surveys, (3) providing standardised protocols and training on simple-to-use, highly automated equipment and (4) assisting citizen scientists at each site for the first three flights.…”
Section: Results (mentioning)
confidence: 99%
“…Research can do much to inform the standardisation of methods and to improve data analysis. Where observers are numerous and provide repeat observations, methods from species distribution modelling can be applied to understand bias and improve model accuracy (August et al. 2020; Isaac et al. 2014; Renner et al. 2015). For plant health issues, it remains particularly important to understand the components that affect detection (namely the probability to detect, the probability to identify and the probability to report the issue), so that inferences can be made quickly when new threats are discovered.…”
Section: Discussion (mentioning)
confidence: 99%
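The detection components named in the statement above amount to a simple product decomposition. The sketch below is illustrative only; the probability values are hypothetical placeholders and are not taken from the cited paper or from the works it references.

```python
# Minimal sketch of the detection decomposition mentioned above.
# All probabilities are hypothetical, not values from any cited study.
p_detect = 0.6    # probability an observer encounters and notices the issue
p_identify = 0.7  # probability the issue is correctly identified, given detection
p_report = 0.5    # probability the observation is reported, given identification

# Probability that the issue ends up in the citizen-science dataset
p_recorded = p_detect * p_identify * p_report
print(f"P(issue appears in the data) = {p_recorded:.2f}")  # 0.21
```

Keeping the components separate is what allows inference to distinguish "issue absent" from "issue present but undetected, unidentified or unreported".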
“…These reports occur at times and places where it is convenient for the public to participate and, as such, are unstructured and potentially biased in distribution (Isaac and Pocock 2015; Baker et al. 2018; Johnston et al. 2020). This systematic error contravenes a fundamental assumption of most statistical approaches, namely that the data are a representative random sample of the wider population (Dobson et al. 2020; August et al. 2020). For analysts and policy makers there are clear limitations to how these data can be used, but there is no denying the practical usefulness of additional detections to plant health officials (Ryan et al. 2018).…”
Section: Introduction (mentioning)
confidence: 93%
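The "representative random sample" assumption flagged in the statement above can be checked directly when survey effort per spatial unit is known or approximated. The following is a minimal sketch under stated assumptions: per-grid-cell record counts are hypothetical and the cells are assumed equal in area; none of the numbers come from the cited studies.

```python
# Minimal sketch: testing whether opportunistic records depart from a
# spatially representative sample. All counts below are hypothetical.
import numpy as np
from scipy.stats import chisquare

observed = np.array([120, 85, 40, 12, 8, 5])       # records per grid cell
cell_area = np.ones_like(observed, dtype=float)    # equal-area cells (assumption)

# Under representative random sampling, records should be proportional to area,
# which for equal-area cells means a uniform expectation.
expected = observed.sum() * cell_area / cell_area.sum()

stat, p_value = chisquare(f_obs=observed, f_exp=expected)
print(f"chi-square = {stat:.1f}, p = {p_value:.2g}")
# A large statistic and small p-value indicate spatial sampling bias that
# downstream analyses (e.g. species distribution models with effort covariates)
# would need to account for.
```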
“…The complexity of emulating human intelligence about natural phenomena in data-driven systems highlights the need to acquire knowledge by extracting conceptual schemes and complex data correlations directly from the primary source, which is the observed data of the studied phenomenon [39][40][41].…”
mentioning
confidence: 99%