2019
DOI: 10.1371/journal.pone.0218614

Evaluating the ability of citizen scientists to identify bumblebee (Bombus) species

Abstract: Citizen science is an increasingly popular way of engaging volunteers in the collection of scientific data. Despite this, data quality remains a concern and there is little published evidence about the accuracy of records generated by citizen scientists. Here we compare data generated by two British citizen science projects, Blooms for Bees and BeeWatch, to determine the ability of volunteer recorders to identify bumblebee (Bombus) species. We assessed recorders’ identification ability…

Cited by 55 publications (65 citation statements); references 43 publications.
“…Sharing unusual sightings with all volunteers or creating reporting incentives, such as prizes for the most sightings (Hochachka et al, 2012), may help maintain reporting levels for future projects. Species identification is of critical importance to this study and identification ability may not be consistent across all volunteers (Falk et al, 2019); mistakes may be more prevalent for species of similar appearance (e.g. spinner and spotted dolphins).…”
Section: Discussion
confidence: 99%
“…This suggests that although more can be done, such as customized regional bee identification workshops or pollinator-friendly gardening guides, program organizers are on the right track with the provided features. We do not discuss the accuracy or perception of user and expert identification in this paper as has been done for other bumble bee community science programs (Lye et al, 2011; Suzuki-Ohno et al, 2017; Falk et al, 2019), but this information (including commonly mis-identified species) would also help to improve the program and data quality (but see MacPhail et al (unpublished data) for this).…”
Section: Percent Of Respondents
confidence: 99%
“…It was launched in 2014 with a web-based platform (http://www.bumblebeewatch.org), with mobile applications for iOS and Android launched in 2017 and 2018 respectively. Based on the idea that bumble bees may, in many cases, be identifiable to species from photographs (Lye et al, 2011; Suzuki-Ohno et al, 2017; Falk et al, 2019), participants take photos of bumble bees and submit through the web or app, along with a date observed and location information. They have the option to use an interactive key or smart filter to assign species name (pre-limited by location), and regional experts verify the identifications.…”
Section: Introduction
confidence: 99%
“…Community science, which is also known as citizen science, is a popular tool in conservation biology that involves public participation in scientific data collection (Silvertown, 2009; Crall et al, 2011; Kremen, Ullmann & Thorp, 2011; Lye et al, 2011; Dickinson et al, 2012; Lebuhn et al, 2012; Roy et al, 2012; Birkin & Goulson, 2015; Kobori et al, 2016; Kosmala et al, 2016; McKinley et al, 2017). The main strengths of community science include increasing the spatial and temporal scale and magnitude of sampling efforts; reducing the cost of sampling; and creating educational and recreational benefits for participants (Bonney et al, 2009; Kremen, Ullmann & Thorp, 2011; Dickinson et al, 2012; Lebuhn et al, 2012; Birkin & Goulson, 2015; McKinley et al, 2017; Falk et al, 2019).…”
Section: Introduction
confidence: 99%
“…Previous research has found that community scientist data, when unreviewed or unverified by experts, can contain errors that significantly influence the interpretation of results by experts (Gardiner et al, 2012; Comont & Ashbrook, 2017). Problematic errors include the overestimation of rare or at-risk species, as well as underestimation of common species, inflated species richness and significant increases in species diversity (Dickinson, Zuckerberg & Bonter, 2010; Gardiner et al, 2012; Silvertown et al, 2013; Comont & Ashbrook, 2017; Falk et al, 2019).…”
Section: Introduction
confidence: 99%