Manual observation and classification of animal behaviors is laborious, time-consuming, and limited in its ability to process large amounts of data. A computer vision-based system was developed that automatically recognizes sow behaviors (lying, sitting, standing, kneeling, feeding, drinking, and shifting) in a farrowing crate. The system consisted of a low-cost 3D camera that simultaneously acquires digital and depth images and a software program that detects and identifies the sow's behaviors. This paper describes the computational algorithm for the analysis of depth images and presents its performance in recognizing the sow's behaviors as compared to manual recognition. The images were acquired at 6 s intervals on three days of a 21-day lactation period. Based on analysis of the 6 s interval images, the algorithm classified behaviors with the following accuracies: 99.9% for lying, 96.4% for sitting, 99.2% for standing, 78.1% for kneeling, 97.4% for feeding, 92.7% for drinking, and 63.9% for transitioning between behaviors. The lower classification accuracy for the transitioning category presumably stemmed from the insufficient frequency of image acquisition, which can be readily improved. Hence, the reported system provides an effective way to automatically process and classify the sow's behavioral images. This tool is conducive to investigating the behavioral responses and time budgets of lactating sows and their litters under different farrowing crate designs and management practices.
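The abstract does not spell out how the depth images are turned into behavior labels. A minimal sketch of the general idea — segmenting the sow's body pixels and thresholding their height above the crate floor — is shown below. The function name and the threshold values are illustrative assumptions, not the authors' published algorithm, and only the three postures separable by overall height are handled.

```python
# Hedged sketch: classify a sow's posture from depth data using simple
# height thresholds. Thresholds and function names are illustrative
# assumptions, not the paper's actual algorithm.

def classify_posture(body_heights_mm):
    """Classify posture from the heights (mm) of pixels segmented as
    the sow's body above the floor plane.

    Returns "lying", "sitting", or "standing"; kneeling, feeding, and
    drinking would need additional head/snout-position cues.
    """
    mean_h = sum(body_heights_mm) / len(body_heights_mm)
    # Illustrative thresholds: a lying sow sits low overall; sitting
    # raises the forequarters, so the mean is intermediate; standing
    # lifts the whole body.
    if mean_h < 250:
        return "lying"
    elif mean_h < 500:
        return "sitting"
    return "standing"

print(classify_posture([120, 150, 180, 200]))  # low profile -> "lying"
print(classify_posture([600, 650, 700, 680]))  # tall profile -> "standing"
```

In practice the floor plane would first be estimated from the depth image itself, and the 6 s sampling interval means each frame is classified independently.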
Heat stress is one of the most important environmental stressors facing poultry production and welfare worldwide. The detrimental effects of heat stress on poultry range from reduced growth and egg production to impaired health. Animal vocalisations are associated with different animal responses and can be used as useful indicators of the state of animal welfare. It is already known that specific chicken vocalisations such as alarm, squawk, and gakel calls are correlated with stressful events and could therefore be used as stress indicators in poultry monitoring systems. In this study, we focused on developing a machine learning-based hen vocalisation detection method to assess the birds' thermal comfort. For extraction of the vocalisations, nine source-filter theory related temporal and spectral features were chosen, and a support vector machine (SVM) based classifier was developed. The classification performance of the optimal SVM model was 95.1 ± 4.3% (sensitivity) and 97.6 ± 1.9% (precision). Based on the developed algorithm, the study showed that a significant correlation existed between specific vocalisations (alarm and squawk calls) and a thermal comfort index (temperature-humidity index, THI) (alarm-THI, R = −0.414, P = 0.01; squawk-THI, R = 0.594, P = 0.01). This work represents the first step towards the further development of technology to monitor flock vocalisations with the intent of providing producers an additional tool to help them actively manage the welfare of their flock.
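The correlation analysis described above can be sketched end to end: compute a THI value per observation window and correlate it with call counts. The THI formula below is one commonly used livestock form, chosen as an assumption since the abstract does not give the exact formula, and the sample counts are fabricated purely for illustration.

```python
# Hedged sketch: Pearson correlation between call counts and the
# temperature-humidity index (THI). The THI formula is a commonly used
# livestock form (an assumption; the paper's exact formula is not
# stated here), and all sample data are fabricated for illustration.
import math

def thi(temp_c, rel_humidity):
    """One common THI form: 0.8*T + RH*(T - 14.4) + 46.4, RH in [0, 1]."""
    return 0.8 * temp_c + rel_humidity * (temp_c - 14.4) + 46.4

def pearson_r(xs, ys):
    """Plain Pearson correlation coefficient of two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

temps = [22, 25, 28, 31, 34]
humidities = [0.50, 0.55, 0.60, 0.65, 0.70]
thi_values = [thi(t, h) for t, h in zip(temps, humidities)]
squawk_counts = [12, 18, 25, 33, 41]  # fabricated counts rising with heat
r = pearson_r(thi_values, squawk_counts)
print(round(r, 3))  # strongly positive for this toy series
```

In the study itself the counts come from the SVM call detector, and significance (the reported P values) would be tested against the null of zero correlation.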
Due to the increasing scale of farms, it is increasingly difficult for farmers to monitor their animals manually. To address this problem, we focused on a sound-based technique to monitor laying hens. Sound analysis has become an important tool for studying the behaviour, health and welfare of animals in recent years. A surveillance system using the microphone arrays of Kinect sensors was developed for automatically monitoring birds' abnormal vocalisations during the night. Using the time-difference of arrival (TDOA) method for sound source localisation (SSL), the Kinect sensors' direction estimates were very accurate. The system had an accuracy of 74.7% in laboratory tests and 73.6% in small poultry group tests for recognising sounds from different areas. Additionally, flocks produced an average of 40 sounds per bird during feeding time in the small group tests. It was found that, on average, each normal chicken produced more than 53 sounds during the daytime (noon to 6:00 p.m.) and less than one sound at night (11:00 p.m.–3:00 a.m.). This system can be used to detect anomalous poultry status at night by monitoring the number and area distribution of vocalisations, which provides a practical and feasible method for the study of animal behaviour and welfare.
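The geometric principle behind TDOA localisation can be shown for the simplest case of a two-microphone pair: under a far-field assumption, the bearing follows from the measured inter-microphone delay. The microphone spacing and delay values below are illustrative assumptions, not the Kinect array's actual geometry.

```python
# Hedged sketch: far-field bearing estimation from the time-difference
# of arrival (TDOA) at a two-microphone pair. Spacing and delays are
# illustrative assumptions, not the Kinect array's actual geometry.
import math

SPEED_OF_SOUND = 343.0  # m/s in air at ~20 degrees C

def bearing_from_tdoa(delta_t_s, mic_spacing_m):
    """Far-field model: sin(theta) = c * dt / d, theta measured from
    broadside (the perpendicular bisector of the microphone pair)."""
    s = SPEED_OF_SOUND * delta_t_s / mic_spacing_m
    s = max(-1.0, min(1.0, s))  # clamp numerical overshoot at endfire
    return math.degrees(math.asin(s))

# A 0.1 ms delay across a 10 cm pair puts the source ~20 degrees off axis.
angle = bearing_from_tdoa(1e-4, 0.10)
print(round(angle, 1))
```

A real array (the Kinect has four microphones) cross-correlates channel pairs to estimate the delays and combines several pairwise bearings; this sketch covers only the final geometric step.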
Abstract. In this paper, we adopt the QGWA-03 plant audio apparatus to investigate the effects of sound on strawberry leaf area, photosynthetic characteristics, and other physiological indexes. It was found that, with no significant environmental differences between the two sunlight greenhouses, the sound-stimulated strawberries grew more vigorously than the controls, their leaves were a deeper green, and they blossomed and bore fruit about one week earlier. The strawberries' resistance to disease and insect pests was also enhanced. The experimental results show that sound wave stimulation can promote the growth of plants.
Pig weight and body size are important indicators for producers. Due to the increasing scale of pig farms, it is increasingly difficult for farmers to obtain pig weight and body size quickly and automatically. To address this problem, we focused on a multiple-output regression convolutional neural network (CNN) to estimate pig weight and body size. DenseNet201, ResNet152 V2, Xception and MobileNet V2 were modified into multiple-output regression CNNs and trained on modeling data. By comparing the estimation performance of each model on test data, the modified Xception was selected as the optimal estimation model. Based on pig height, body shape, and contour, the mean absolute errors (MAE) of the model in estimating body weight (BW), shoulder width (SW), shoulder height (SH), hip width (HW), hip height (HH), and body length (BL) were 1.16 kg, 0.33 cm, 1.23 cm, 0.38 cm, 0.66 cm, and 0.75 cm, respectively. The coefficient of determination (R2) between the estimated and measured results was in the range of 0.9879–0.9973. Combined with the LabVIEW software development platform, this method can estimate pig weight and body size accurately, quickly, and automatically. This work contributes to the automatic management of pig farms.
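The key design idea — one shared feature extractor feeding several regression targets that are trained jointly — can be reduced to its simplest form: a single linear layer with multiple outputs fit by gradient descent. The CNN backbone replaces the handcrafted inputs used here; all data, dimensions, and the training schedule below are illustrative assumptions.

```python
# Hedged sketch: the multiple-output regression idea behind the
# modified Xception model, reduced to one linear layer trained by
# stochastic gradient descent on toy data. The real model maps image
# features to six targets (BW, SW, SH, HW, HH, BL); here two inputs
# map to two jointly trained outputs. All numbers are illustrative.

def train_multi_output(xs, ys, lr=0.01, epochs=2000):
    """Fit y = W x (no bias) for multi-output targets via SGD on a
    summed per-output squared-error loss."""
    n_out, n_in = len(ys[0]), len(xs[0])
    W = [[0.0] * n_in for _ in range(n_out)]
    for _ in range(epochs):
        for x, y in zip(xs, ys):
            pred = [sum(W[o][i] * x[i] for i in range(n_in))
                    for o in range(n_out)]
            for o in range(n_out):          # each output gets its own
                err = pred[o] - y[o]        # error, but the update uses
                for i in range(n_in):       # the shared input features
                    W[o][i] -= lr * err * x[i]
    return W

# Toy mapping: output0 = 2*x0, output1 = x0 + x1 (learned jointly).
xs = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0], [2.0, 1.0]]
ys = [[2.0, 1.0], [0.0, 1.0], [2.0, 2.0], [4.0, 3.0]]
W = train_multi_output(xs, ys)
print([[round(w, 2) for w in row] for row in W])
```

In the paper's setting, the shared-backbone/multi-head structure lets the six correlated body measurements regularise one another, which is the usual motivation for multi-output over six independent models.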
In pig production, the feed conversion ratio and profit can be evaluated by real-time detection of pig live weight. Traditional pig weighing usually requires direct contact with the pigs, which is limited by its low efficiency and causes considerable stress, sometimes even death. Non-contact detection of pig body weight has been a challenge in pig production for decades. Digital image analysis and machine vision methods enable real-time estimation of pig live weight by measuring critical body dimensions without any contact. This article elucidates the advantages and limitations of each detection method for pig body weight by comparing their system frameworks and estimation models. Research trends in contactless pig weight estimation are analyzed as well.
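A concrete example of estimating weight from body dimensions is the traditional girth-and-length heuristic that image-based methods are typically benchmarked against. The "girth² × length / 400" rule (inches to pounds) is a common farm heuristic and an assumption here, not a formula taken from the review itself.

```python
# Hedged sketch: a classic contactless weight estimate from two body
# dimensions, the kind of dimensional model the review compares with
# image-based methods. The girth^2 * length / 400 rule (inches -> lb)
# is a traditional farm heuristic, not a formula from the review.

def estimate_pig_weight_kg(heart_girth_cm, body_length_cm):
    """Rough live-weight estimate from heart girth and body length."""
    girth_in = heart_girth_cm / 2.54
    length_in = body_length_cm / 2.54
    weight_lb = girth_in ** 2 * length_in / 400.0
    return weight_lb * 0.45359237  # pounds -> kilograms

# A pig with 100 cm heart girth and 105 cm body length:
w = estimate_pig_weight_kg(100.0, 105.0)
print(round(w, 1))  # roughly 70-75 kg
```

Machine-vision systems automate exactly this kind of model: the camera extracts the dimensions (or richer shape features), and a fitted regression replaces the fixed divisor.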