Manual observation and classification of animal behaviors is laborious, time-consuming, and limited in its ability to process large amounts of data. A computer vision-based system was developed that automatically recognizes sow behaviors (lying, sitting, standing, kneeling, feeding, drinking, and shifting) in a farrowing crate. The system consisted of a low-cost 3D camera that simultaneously acquires digital and depth images and a software program that detects and identifies the sow's behaviors. This paper describes the computational algorithm for the analysis of depth images and presents its performance in recognizing the sow's behaviors as compared with manual recognition. The images were acquired at 6 s intervals on three days of a 21-day lactation period. Based on analysis of the 6 s interval images, the algorithm had the following behavioral classification accuracies: 99.9% for lying, 96.4% for sitting, 99.2% for standing, 78.1% for kneeling, 97.4% for feeding, 92.7% for drinking, and 63.9% for transitioning between behaviors. The lower classification accuracy for the transitioning category presumably stemmed from the insufficient frequency of image acquisition, which can be readily improved. Hence, the reported system provides an effective way to automatically process and classify the sow's behavioral images. This tool is conducive to investigating the behavioral responses and time budgets of lactating sows and their litters with respect to farrowing crate designs and management practices.
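The abstract does not detail the depth-image algorithm itself, but a common approach to posture classification from overhead depth maps is to segment the animal from the floor and threshold its height profile. Below is a minimal, hypothetical sketch of such a height-threshold classifier; the thresholds, the floor-calibration step, and the function names are assumptions for illustration, not the authors' method.

```python
import numpy as np

# Hypothetical height thresholds (metres above the crate floor); the
# published algorithm's actual features and cut-offs are not given here.
LYING_MAX = 0.35
SITTING_MAX = 0.60

def classify_posture(depth_img, floor_depth):
    """Classify sow posture from a single overhead depth image.

    depth_img   -- 2D array of camera-to-surface distances (m)
    floor_depth -- calibrated camera-to-floor distance (m)
    """
    height = floor_depth - depth_img           # height above floor per pixel
    sow_mask = height > 0.10                   # crude floor/sow segmentation
    if not sow_mask.any():
        return "no sow detected"
    top = np.percentile(height[sow_mask], 95)  # robust "highest point" of sow
    if top < LYING_MAX:
        return "lying"
    if top < SITTING_MAX:
        return "sitting"
    return "standing"
```

A real system would add temporal smoothing across the 6 s frames and region checks (e.g., head near the feeder or drinker) to separate feeding and drinking from plain standing.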
As farms grow in scale, it becomes increasingly difficult for farmers to monitor their animals manually, so automated monitoring is needed. To address this problem, we focused on a sound-based technique for monitoring laying hens. Sound analysis has become an important tool for studying the behaviour, health and welfare of animals in recent years. A surveillance system using the microphone arrays of Kinect sensors was developed to automatically monitor birds' abnormal vocalisations during the night. Using the time-difference-of-arrival (TDOA) method of sound source localisation (SSL), the Kinect sensors' direction estimates were very accurate. The system had an accuracy of 74.7% in laboratory tests and 73.6% in small poultry-group tests for recognising sounds from different areas. Additionally, flocks produced an average of 40 sounds per bird during feeding time in the small-group tests. It was found that, on average, each normal chicken produced more than 53 sounds during the daytime (noon to 6:00 p.m.) and fewer than one sound at night (11:00 p.m.–3:00 a.m.). This system can be used to detect anomalous poultry status at night by monitoring the number of vocalisations and their area distributions, which provides a practical and feasible method for the study of animal behaviour and welfare.
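For reference, TDOA-based SSL typically estimates the delay between two microphone signals with a generalized cross-correlation such as GCC-PHAT and converts it to a bearing using the microphone spacing. The sketch below illustrates that general technique; it is not the authors' Kinect implementation, and the microphone spacing and speed of sound are assumed values.

```python
import numpy as np

def gcc_phat(sig, ref, fs):
    """Estimate the time delay of `sig` relative to `ref` (seconds)
    using the generalized cross-correlation with PHAT weighting."""
    n = len(sig) + len(ref)
    R = np.fft.rfft(sig, n) * np.conj(np.fft.rfft(ref, n))
    cc = np.fft.irfft(R / (np.abs(R) + 1e-12), n)   # whitened correlation
    max_shift = n // 2
    cc = np.concatenate((cc[-max_shift:], cc[:max_shift + 1]))
    return (np.argmax(np.abs(cc)) - max_shift) / fs

def bearing_deg(tau, mic_spacing=0.1, c=343.0):
    """Convert a delay to a direction-of-arrival angle for one mic pair
    (far-field assumption; spacing and sound speed are illustrative)."""
    s = np.clip(tau * c / mic_spacing, -1.0, 1.0)
    return np.degrees(np.arcsin(s))
```

With a four-microphone array such as the Kinect's, bearings from several mic pairs can be combined to localise a vocalising bird to an area of the pen.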
The accurate and rapid detection of objects in videos facilitates the identification of abnormal behaviors in pigs and the introduction of preventive measures to reduce morbidity. In addition, accurate and effective pig detection algorithms provide a basis for pig behavior analysis and management decision-making. Monitoring the posture of pigs enables the precursors of pig diseases to be detected in a timely manner and the factors that affect pigs' health to be identified, which helps to evaluate their health status and comfort. Excessive sitting is an abnormal behavior shown when pigs are frustrated in a restricted environment. Existing studies have focused on the automatic recognition of standing and lying postures in grouped pigs but lack recognition of the sitting posture. The main contributions of this paper are as follows: a human-annotated dataset of standing, lying, and sitting postures captured by 2D cameras during the day and night in a pig barn was established, and a simplified copy-paste and label-smoothing strategy was applied to solve the class imbalance caused by the scarcity of sitting postures in the dataset. The improved YOLOX achieves an average precision at an intersection-over-union threshold of 0.5 (AP0.5) of 99.5% and an average precision over thresholds of 0.5–0.95 (AP0.5–0.95) of 91% in pig position detection; an AP0.5 of 90.9% and an AP0.5–0.95 of 82.8% in sitting-posture recognition; and a mean average precision at a threshold of 0.5 (mAP0.5) of 95.7% and over thresholds of 0.5–0.95 (mAP0.5–0.95) of 87.2% across all posture recognition. The proposed method effectively improves the position detection and posture recognition of grouped pigs, especially the recognition of the sitting posture, and can meet the needs of practical application on pig farms.
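The paper's copy-paste and label-smoothing strategy for the scarce sitting class is not specified in detail here, but both underlying ideas are standard: copy-paste augmentation duplicates crops of the rare class into other training images, and label smoothing softens one-hot targets. A minimal illustrative sketch follows; the epsilon value, array shapes, and helper names are assumptions.

```python
import numpy as np

def smooth_labels(one_hot, eps=0.1):
    """Label smoothing: y -> y * (1 - eps) + eps / K for K classes."""
    k = one_hot.shape[-1]
    return one_hot * (1.0 - eps) + eps / k

def copy_paste(src_img, src_box, dst_img, dst_xy):
    """Paste a rare-class crop (e.g. a sitting pig) into another image.

    src_box -- (x1, y1, x2, y2) of the object in src_img
    dst_xy  -- (x, y) top-left target position in dst_img; the caller
               must ensure the patch fits within dst_img's bounds
    """
    x1, y1, x2, y2 = src_box
    patch = src_img[y1:y2, x1:x2]
    x, y = dst_xy
    h, w = patch.shape[:2]
    out = dst_img.copy()
    out[y:y + h, x:x + w] = patch  # the pasted box also gets a "sitting" label
    return out
```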
A seamless 3G (third-generation) access system was designed and implemented to meet the remote video monitoring and remote equipment control requirements of facility agriculture without a wired connection to the Internet. The system consists of a wireless communication module, including a 3G SIM card and wireless router, and a VPN (virtual private network) module. It uses L2TP (Layer 2 Tunneling Protocol) with IPsec (IP security) to tunnel traffic through a VPN server, solving the problem that the 3G network assigns only a private NAT (network address translation) address that cannot be reached directly. The system provides sufficient video bandwidth and an inexpensive wireless network access solution; it is low-cost, widely applicable, easily extended, and easily implemented. It has been put into operation at the school's experimental base, where we can remotely monitor environmental parameters, control the base's server equipment, and watch smooth, clear, real-time video (the video server uses H.264 compression with a preset bit rate of 256–800 kbps).
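The core difficulty the VPN solves is that a 3G modem behind carrier-grade NAT has no publicly reachable address, so the field site must dial out and hold a tunnel open rather than accept inbound connections. The toy sketch below illustrates only that dial-out principle with a plain TCP socket; it is not L2TP/IPsec, and the server host and port are placeholders.

```python
import socket
import time

# Placeholder endpoint, not a real server; a production system would
# instead bring up an L2TP/IPsec tunnel to the VPN concentrator.
VPN_SERVER = ("vpn.example.com", 1701)

def keep_tunnel_open():
    """Dial out from behind 3G NAT and hold the connection open so the
    server side can reach the field equipment over the established link."""
    while True:
        try:
            with socket.create_connection(VPN_SERVER, timeout=10) as conn:
                conn.sendall(b"HELLO monitoring-node\n")
                while True:
                    data = conn.recv(4096)   # commands relayed by the server
                    if not data:
                        break                # tunnel dropped; redial
                    # ... dispatch remote-control command to local device ...
        except OSError:
            time.sleep(5)                    # back off, then reconnect
```

Because the connection is initiated from inside the NAT, return traffic (monitoring data, video) flows over the same established tunnel without requiring a public IP at the farm.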