2018
DOI: 10.1002/bes2.1388
Biased Assumptions and Oversimplifications in Evaluations of Citizen Science Data Quality

Cited by 25 publications (19 citation statements). References 19 publications.
“…Despite data quality concerns (Kosmala et al 2016; Aceves-Bueno et al 2017; Specht and Lewandowski 2018), citizen science has great potential to address pressing matters in biodiversity monitoring, conservation and research (Theobald et al 2015; Chandler et al 2017; Pocock et al 2018). Open access to citizen science data would maximise this potential through increased reuse and the application of new 'big data' techniques and cross-disciplinary studies (Culina et al 2018b; Farley et al 2018; Ma et al 2018; Tulloch et al 2018), as well as yielding benefits of increased transparency and public trust in science (Soranno et al 2015).…”
Section: Citizen Scientist Support for Open Access
confidence: 99%
“…Through this canvassing, they can also update eligibility information within the tool to provide better program estimates and validate the modeling. Issues with residents sharing personal information and validating any responses are still expected, but are not different than those encountered with professional data (Specht and Lewandowski, 2018).…”
Section: Community Eligibility and Methods Validation Tool
confidence: 99%
“…The consequences of this trade-off are a topic of active research (Aceves-Bueno et al 2017; Bayraktarov et al 2018; Kelling et al 2018; Specht & Lewandowski 2018; Boersch-Supan, Trask & Baillie 2019; Johnston et al 2020; Robinson et al 2020), and there is a growing set of modelling approaches to address the challenges of unstructured datasets using auxiliary structured biodiversity data and/or observation models that account for preferential sampling, usually at the cost of increased model complexity and computational demands (van Strien, van Swaay & Termaat 2013; Fithian et al 2015; Robinson, Ruiz-Gutierrez & Fink 2018; Isaac et al 2019; Johnston et al 2019, 2020).…”
Section: Boersch-Supan and Robinson 2021
confidence: 99%