Step counting is an effective method for assessing the activity level of grazing sheep. However, existing step-counting algorithms adapt poorly to sheep walking patterns and fail to eliminate false step counts caused by abnormal behaviors. This study therefore proposes a step-counting algorithm based on behavior classification, designed specifically for grazing sheep. The algorithm uses regional peak detection and peak-to-valley difference detection to identify running and leg-shaking behaviors, and distinguishes leg shaking from brisk walking through variance feature analysis. Different step-counting strategies are then applied according to the recognized behavior: when running is detected, the number of steps is calculated by dividing the sampling window by the baseline step frequency and multiplying by a scaling factor; no steps are counted during leg-shaking behavior; and for other behaviors, such as slow and brisk walking, a window peak detection algorithm is used. In experiments, the average calculation error of the proposed algorithm was 6.244%, compared with 17.556% for a peak detection-based step-counting algorithm, demonstrating a significant improvement in accuracy over the peak detection method.
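The behavior-dependent step-counting strategy described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the thresholds, baseline step frequency, scaling factor, and the peak-rate cutoff used to flag leg shaking are all hypothetical placeholders, since the abstract does not report the actual values.

```python
import statistics

# Hypothetical parameters -- the paper's actual values are not given.
PEAK_THRESHOLD = 1.2        # minimum magnitude for a candidate step peak (g)
RUN_PEAK_VALLEY_DIFF = 2.0  # peak-to-valley difference marking running (g)
SHAKE_VARIANCE = 0.8        # variance above which dense peaks mean leg shaking
BASELINE_STEP_HZ = 2.0      # assumed baseline step frequency for running
RUN_SCALE = 1.3             # assumed scaling factor for running cadence

def count_peaks(window, threshold=PEAK_THRESHOLD):
    """Count local maxima above a threshold (simple window peak detection)."""
    return sum(
        1
        for i in range(1, len(window) - 1)
        if window[i] > threshold and window[i - 1] < window[i] > window[i + 1]
    )

def count_steps(window, sample_rate_hz):
    """Classify the window's behavior, then apply the matching step rule."""
    duration_s = len(window) / sample_rate_hz
    peaks = count_peaks(window)
    peak_rate = peaks / duration_s

    if max(window) - min(window) > RUN_PEAK_VALLEY_DIFF:
        # Running: estimate steps from window duration, baseline step
        # frequency, and a scaling factor, rather than raw peak counts.
        return round(duration_s * BASELINE_STEP_HZ * RUN_SCALE)
    if peak_rate > 3.0 and statistics.variance(window) > SHAKE_VARIANCE:
        # Dense peaks with high variance: leg shaking, not counted as steps.
        return 0
    # Slow or brisk walking: one step per detected peak.
    return peaks
```

The key design point is that running and shaking are filtered out *before* peak counting, so a single peak detector never has to cover all gaits at once.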
Fundamental sheep behaviours, such as walking, standing, and lying, are closely associated with physiological health. However, monitoring sheep on grazing land is difficult: limited range, changeable weather, and diverse outdoor lighting conditions make accurate recognition of sheep behaviour in free-range situations a critical problem. This study proposes an enhanced sheep behaviour recognition algorithm based on the You Only Look Once version 5 (YOLOv5) model. It investigates the effect of different shooting methodologies on sheep behaviour recognition and the model's generalisation ability under different environmental conditions, and also outlines the design of a real-time recognition system. First, sheep behaviour datasets were constructed using two shooting methods. The YOLOv5 model was then trained on each dataset, achieving an average accuracy of over 90% for the three behaviour classes. Next, cross-validation was used to assess generalisation; the results indicated that the model trained on handheld-camera footage generalised better. Furthermore, adding an attention mechanism module before feature extraction raised mAP@0.5 to 91.8%, an increase of 1.7 percentage points. Lastly, a cloud-based architecture using the Real-Time Messaging Protocol (RTMP) to push the video stream was proposed so that the model can perform real-time behaviour recognition in practical settings. In conclusion, the improved YOLOv5 algorithm can effectively detect sheep's daily behaviour in pasture scenarios, supporting precision livestock management and the development of modern husbandry.
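In a real-time pipeline like the one outlined above, the detector's raw output must be reduced to per-frame behaviour counts before it is useful for monitoring. The sketch below shows one plausible post-processing step, assuming detections arrive as (class_id, confidence) pairs after non-maximum suppression; the class mapping and confidence threshold are illustrative assumptions, not values from the paper.

```python
from collections import Counter

# Hypothetical mapping for the paper's three behaviour classes.
BEHAVIOUR_CLASSES = {0: "standing", 1: "walking", 2: "lying"}

def summarise_detections(detections, conf_threshold=0.5):
    """Tally per-frame behaviour counts from YOLO-style detections.

    `detections` is assumed to be an iterable of (class_id, confidence)
    pairs, as produced after non-maximum suppression. Detections below
    the confidence threshold or with unknown class ids are discarded.
    """
    counts = Counter()
    for class_id, confidence in detections:
        if confidence >= conf_threshold and class_id in BEHAVIOUR_CLASSES:
            counts[BEHAVIOUR_CLASSES[class_id]] += 1
    return dict(counts)
```

Aggregating such per-frame counts over time gives the daily behaviour profile the abstract describes as the target of precision livestock management.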
Static Random-Access Memory (SRAM) is increasingly widely used in aviation. However, large arrays of SRAM cells are highly vulnerable to radiation-induced single-event upsets (SEUs). Based on the requirements for detecting SEUs in SRAM, a detection circuit for SRAM SEUs was designed, and a redundancy-check method was then used to harden the memory against SEUs. Test results show that the detection circuit can identify the single-event-sensitive bits of SRAM storage chips, and that the proposed method improves the SRAM's resistance to single-event upsets severalfold, reaching as high as 1E6.
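The redundancy check mentioned above is, in spirit, a majority vote over redundant copies of each stored word. As a purely illustrative sketch (the paper describes a hardware circuit, not software), the standard 2-of-3 bitwise majority used in triple modular redundancy can be written as:

```python
def majority_vote(a, b, c):
    """Bitwise 2-of-3 majority vote across three redundant copies of a word.

    A single-event upset that flips bits in only one copy is out-voted by
    the other two copies, so the returned word matches the uncorrupted value.
    """
    return (a & b) | (a & c) | (b & c)
```

The same Boolean expression maps directly onto a gate-level voter, which is why this form is common in SEU-hardened memory designs.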