2018
DOI: 10.1016/j.biosystemseng.2018.09.011
Automatic recognition of sow nursing behaviour using deep learning-based segmentation and spatial and temporal features

Cited by 53 publications (31 citation statements) · References 35 publications
“…Big data and machine learning have been used for the analysis of animal behavior [ 133 ]. Computer vision-based methods have been employed on commercial farms for automatic recognition of animal nursing behavior, using spatial and temporal information, and for individual pig recognition, with accuracy rates of 96.7% [ 134 , 135 ].…”
Section: Limitations Of Sensing Technologies
confidence: 99%
“…From this, we can see the importance of temporal features for recognition. Sows were segmented from all frames using the FCN model to extract spatial features; temporal features were then designed and extracted, and finally a classifier was used to classify nursing behavior [23]. In contrast, the method in this paper extracts spatial and temporal features directly through training and is end-to-end.…”
Section: Feeding Scratching Mounting Lying Walking
confidence: 99%
“…Zheng et al [2] and Yang et al [22] used Faster Region-Convolutional Neural Networks (Faster-RCNN), which detect pigs effectively, to recognize pig postures and feeding behaviors. Sows were segmented from all frames using the Fully Convolutional Network (FCN) model, which helped recognize sows' nursing behavior with an accuracy of 97.6% [23]. Nasirahmadi et al [16] proposed three detector methods, Faster R-CNN, single shot multibox detector (SSD), and region-based fully convolutional network (R-FCN), to recognize the standing, lying-on-side, and lying-on-belly postures of pigs with a mean average precision (mAP) of over 0.93.…”
Section: Introduction
confidence: 99%
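The pipeline described in these excerpts (per-frame sow segmentation, spatial features from the masks, temporal features across frames, then a classifier) can be sketched at a high level. The sketch below is an illustrative stand-in, not the paper's implementation: segmentation is assumed to have already produced binary sow masks (the FCN step is omitted), the spatial and temporal features are simplified to mask area and centroid motion, and the names `spatial_features`, `temporal_features`, and `classify_nursing` are hypothetical.

```python
import numpy as np

def spatial_features(mask):
    """Simplified spatial features from a binary sow mask: area and centroid.

    In the cited work these would come from an FCN segmentation of each frame;
    here the mask is assumed to be given.
    """
    ys, xs = np.nonzero(mask)
    if xs.size == 0:
        return np.zeros(3)  # empty mask: no sow detected in this frame
    return np.array([float(xs.size), ys.mean(), xs.mean()])

def temporal_features(per_frame):
    """Temporal features: mean absolute frame-to-frame change of each
    spatial feature over the clip."""
    return np.abs(np.diff(per_frame, axis=0)).mean(axis=0)

def classify_nursing(masks, motion_threshold=1.0):
    """Toy rule-based stand-in for the final classifier: a nursing sow is
    largely stationary, so low centroid motion is labelled 'nursing'."""
    feats = np.stack([spatial_features(m) for m in masks])
    centroid_motion = temporal_features(feats)[1:3].mean()  # y/x motion only
    return "nursing" if centroid_motion < motion_threshold else "not nursing"
```

A clip of identical masks yields zero centroid motion and is labelled "nursing", while masks that shift between frames are labelled "not nursing"; the real system replaces the threshold rule with a trained classifier over richer features.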
“…Previous research exists on identification and detection [11,16-18, 20-22,24-26,29,31,38,44-48] and on tracking [10,19,37,39,45]. In addition, previous studies on the early detection of abnormalities cover various topics, including the movement of pigs [17,62], aggressive behavior of pigs [63,64], posture change [16,22,23,31,32,34,35,40,46], mounting behavior [21], the behavior of low-growth pigs [49], pig weight [29,33,38], and the density of pigs [9,11].…”
Section: Introduction
confidence: 99%