Tracking insect movement in a social group (such as an ant colony) is challenging because individuals are not only visually identical but also frequently engage in intensive body contact and abrupt movement changes (starting, stopping, and turning). To address this challenge, we introduce an online multi-object tracking framework that combines the motion and appearance information of ants. We obtain appearance descriptors with a ResNet model trained offline on a small (N = 50) sample dataset. For online association, a cosine similarity metric computes the matching degree between a trajectory's historical appearance sequence and the current detection. We validated our method on both indoor (laboratory) and outdoor video sequences. The results show that the accuracy and precision of the model are 99.22% ± 0.37% and 91.93% ± 1.46% across 46,041 testing samples, with real-time tracking performance. Additionally, we release a public ant-tracking dataset of 46,091 samples for future research in related domains.
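The appearance-based association step described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the embedding dimensionality, the greedy matching order, and the `max_dist` gating threshold are assumptions, and the real system also fuses motion cues that are omitted here.

```python
import numpy as np

def cosine_distance(a, b):
    """Pairwise cosine distance between rows of a and rows of b."""
    a = a / np.linalg.norm(a, axis=1, keepdims=True)
    b = b / np.linalg.norm(b, axis=1, keepdims=True)
    return 1.0 - a @ b.T

def match_detections(track_histories, detections, max_dist=0.2):
    """Greedy association: each track keeps a history of appearance
    embeddings; a detection is matched to the track whose closest
    historical embedding has the smallest cosine distance, subject to
    a gating threshold. Returns a list of (track_idx, detection_idx)."""
    # cost[i, j] = min over track i's history of distance to detection j
    cost = np.array([cosine_distance(hist, detections).min(axis=0)
                     for hist in track_histories])
    matches, used = [], set()
    for i in np.argsort(cost.min(axis=1)):   # best-matching tracks first
        row = cost[i].copy()
        row[list(used)] = np.inf             # mask already-claimed detections
        j = int(np.argmin(row))
        if row[j] <= max_dist:
            matches.append((int(i), j))
            used.add(j)
    return matches
```

A production tracker would typically replace the greedy loop with optimal assignment (e.g. the Hungarian algorithm) and cascade the appearance cost with a motion-based gate.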
Background
The motion and interactions of social insects (such as ants) have been studied by many researchers to understand clustering mechanisms. Most studies of ant behavior have focused on indoor environments (laboratory setups), while outdoor (natural) environments remain underexplored.
Findings
In this article, we collect 10 videos of ant colonies spanning 3 species and different scenes, including 5 indoor and 5 outdoor scenes. We develop an image-sequence marking software named VisualMarkData, which enables us to annotate the ants in the videos. (i) It offers comprehensive annotations of states at both the individual-target and colony-target levels. (ii) It provides a simple matrix format to represent multiple targets and multiple groups of annotations (along with their IDs and behavior labels). (iii) During annotation, we propose a simple and effective visualization that takes the annotation information of the previous frame as a reference; the user then simply clicks on the center point of each target to complete the annotation task. (iv) We develop a user-friendly window-based GUI to minimize labor and maximize annotation quality. Across all 5,354 frames, the location and identification number of each ant are recorded, for a total of 712 ants and 114,112 annotations. Moreover, we provide visual analysis tools to assess and validate the technical quality and reproducibility of our data.
Conclusions
We provide a large-scale ant dataset with accompanying annotation software. We hope that our work will contribute to a deeper exploration of the behavior of ant colonies.
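The matrix annotation format and the previous-frame-as-reference workflow can be illustrated with a small sketch. The column layout below (frame, ant ID, x, y) is a hypothetical example of such a matrix; the actual VisualMarkData format and its behavior-label columns may differ.

```python
import numpy as np

# Hypothetical annotation matrix (the real VisualMarkData layout may
# differ): one row per annotation, columns = frame, ant_id, x, y.
annotations = np.array([
    [0, 1, 12.0, 30.0],
    [0, 2, 45.0, 18.0],
    [1, 1, 13.5, 31.0],
    [1, 2, 44.0, 19.5],
])

def annotations_for_frame(ann, frame):
    """Return the {ant_id: (x, y)} mapping for a single frame."""
    rows = ann[ann[:, 0] == frame]
    return {int(r[1]): (r[2], r[3]) for r in rows}

# The previous frame's positions can be drawn as a reference overlay,
# so the annotator only clicks each ant's new center point -- the
# click-based workflow the abstract describes.
prev_positions = annotations_for_frame(annotations, 0)
```

Storing annotations as one flat matrix keeps the data trivially loadable in any numerical environment, at the cost of repeating the frame index on every row.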