In a scenario of worldwide honey bee decline, assessing colony strength is becoming increasingly important for sustainable beekeeping. Temporal counts of comb cells containing brood and food reserves offer researchers data for multiple applications, such as modelling colony dynamics, and give beekeepers information on colony strength, an indicator of colony health and honey yield. Counting cells manually in comb images is labour-intensive, tedious, and prone to error. Herein, we developed free software, named DeepBee©, capable of automatically detecting cells in comb images and classifying their contents into seven classes. By distinguishing cells occupied by eggs, larvae, capped brood, pollen, nectar, honey, and other contents, DeepBee© allows an unprecedented level of accuracy in cell classification. Using the Circle Hough Transform and semantic segmentation, we obtained a cell detection rate of 98.7%, which is 16.2% higher than the best result reported in the literature. For classification of comb cells, we trained and evaluated thirteen convolutional neural network (CNN) architectures: DenseNet (121, 169 and 201), InceptionResNetV2, InceptionV3, MobileNet, MobileNetV2, NasNet, NasNetMobile, ResNet50, VGG (16 and 19) and Xception. MobileNet proved to be the best compromise between computational cost, with ~9 s to process all cells in a comb image, and accuracy, with an F1-score of 94.3%. We describe the technical details needed to build a complete pipeline for classifying and counting comb cells, and we have made the CNN models, source code, and datasets publicly available. With this effort, we hope to have expanded the frontier of apicultural precision analysis by providing a high-performance tool with open source code to foster improvement by third parties (https://github.com/AvsThiago/DeepBeesource).
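The detection step named in the abstract, the Circle Hough Transform, locates comb cells by letting edge pixels vote for candidate circle centres. The following is a minimal NumPy sketch of that voting scheme for a single known radius; all names and parameter values here are illustrative assumptions, not the DeepBee implementation.

```python
import numpy as np

def hough_circles_fixed_r(edges, radius, n_thetas=100):
    """Circle Hough Transform for one fixed radius.

    edges  : 2-D boolean array marking edge pixels.
    Returns an accumulator array; peaks mark likely circle centres.
    """
    h, w = edges.shape
    acc = np.zeros((h, w), dtype=np.int32)
    thetas = np.linspace(0.0, 2.0 * np.pi, n_thetas, endpoint=False)
    ys, xs = np.nonzero(edges)
    for y, x in zip(ys, xs):
        # Each edge pixel votes for every centre lying `radius` away.
        cx = np.round(x - radius * np.cos(thetas)).astype(int)
        cy = np.round(y - radius * np.sin(thetas)).astype(int)
        ok = (cx >= 0) & (cx < w) & (cy >= 0) & (cy < h)
        np.add.at(acc, (cy[ok], cx[ok]), 1)
    return acc

# Synthetic check: one circle of radius 10 centred at (row=30, col=40).
h = w = 64
yy, xx = np.mgrid[0:h, 0:w]
edges = np.abs(np.hypot(xx - 40, yy - 30) - 10) < 0.5
acc = hough_circles_fixed_r(edges, 10)
cy, cx = np.unravel_index(acc.argmax(), acc.shape)
print(cy, cx)  # accumulator peak falls near the true centre
```

In practice a library routine such as OpenCV's `cv2.HoughCircles` would be used instead, scanning a range of radii to cover the variation in cell size across a comb image; the segmentation step the abstract mentions would then filter the candidate circles.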
Honey bees are key insect pollinators, providing important economic and ecological value to human beings and ecosystems. This has triggered the development of several monitoring methods for assessing the temporal development of colony size, food storage, brood and pathogens. Nonetheless, most of these methods are based on visual assessments that are observer-dependent and prone to bias. Furthermore, the impact on colony development (invasiveness), as well as accuracy, were rarely considered when implementing new methods. In this study, we present and test a novel, accurate and observer-independent method for honey bee colony assessment, capable of being fully standardized. Honey bee colony size is quantified by assessing the weight of adult bees, while brood and food provisions are assessed by photographing the combs and analysing the images with the software DeepBee©. The invasiveness and accuracy of the method were investigated using field data from two experimental apiaries in Portugal, comparing results from test and control colonies. At the end of each field experiment, most of the tested colonies had the same colony size, brood levels and honey production as the control colonies. Nonetheless, continuous weight data indicated some disturbance in tested colonies in the first year of monitoring. The overall accuracy of the image analysis software was improved by additional training, indicating that it is possible to adapt the software to local conditions. We conclude that the use of this fully quantitative method offers a more accurate alternative to classic visual colony assessments, with negligible impact on colony development.