Facial expression is a common channel for the communication of emotion. However, in the case of non-human animals, the analytical methods used to quantify facial expressions can be subjective, relying heavily on extrapolation from human-based systems. Here, we demonstrate how geometric morphometrics can be applied to overcome these problems. We used this approach to identify and quantify changes in facial shape associated with pain in a non-human animal species. Our method accommodates individual variability, species-specific facial anatomy, and postural effects. Facial images were captured at four different time points during ovariohysterectomy of domestic short-haired cats (n = 29), with time points corresponding to varying intensities of pain. Images were annotated using landmarks specifically chosen for their relationship with the underlying musculature and their relevance to cat-specific facial action units. Landmark data were normalised before Principal Components (PCs) were extracted to identify key sources of facial shape variation relative to pain intensity. A significant relationship between PC scores and a well-validated composite measure of post-operative pain in cats (the UNESP-Botucatu MCPS tool) was evident, demonstrating good convergent validity between our geometric face model and other metrics of pain detection. This study lays the foundation for the automatic, objective detection of emotional expressions in a range of non-human animal species.
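The abstract's pipeline (landmark annotation, normalisation, then PCA on the aligned coordinates) follows the standard geometric-morphometrics recipe. The sketch below is a minimal illustration of that recipe, not the authors' code: it aligns toy 2-D landmark configurations to a reference by removing translation, scale, and rotation (a simplified Procrustes superimposition), then extracts PC scores from the aligned coordinates. All variable names and the toy data are hypothetical.

```python
import numpy as np

def procrustes_align(shapes, reference):
    """Align each landmark configuration to a reference shape:
    translate to centroid origin, scale to unit centroid size,
    then apply the optimal rotation found via SVD
    (a simplified, reference-based Procrustes superimposition)."""
    ref = reference - reference.mean(axis=0)
    ref = ref / np.linalg.norm(ref)
    aligned = []
    for s in shapes:
        x = s - s.mean(axis=0)           # remove translation
        x = x / np.linalg.norm(x)        # remove scale (centroid size)
        u, _, vt = np.linalg.svd(x.T @ ref)
        r = u @ vt                       # optimal rotation matrix
        aligned.append(x @ r)
    return np.array(aligned)

# toy example: 3 "faces", 5 landmarks each, in 2-D; each face is a
# translated/scaled copy of the same base shape, so after alignment
# the configurations should coincide
rng = np.random.default_rng(0)
base = rng.normal(size=(5, 2))
shapes = [base * s + t for s, t in [(1.0, 0.0), (2.0, 3.0), (0.5, -1.0)]]
aligned = procrustes_align(shapes, base)

# PCA on the flattened aligned coordinates: each row of pc_scores is
# one face's position in shape space
flat = aligned.reshape(len(aligned), -1)
centered = flat - flat.mean(axis=0)
_, _, vt = np.linalg.svd(centered, full_matrices=False)
pc_scores = centered @ vt.T
```

In a real analysis the reference would itself be iterated (generalised Procrustes analysis) and the PC scores would then be regressed against the pain measure, as the abstract describes.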
The aim of this study was to investigate the use of existing image-recognition techniques to predict the behavior of dairy cows. A total of 46 individual dairy cows were monitored continuously under 24 h video surveillance prior to calving. The video was annotated for the behaviors of standing, lying, walking, shuffling, eating, drinking, and contractions for each cow from 10 h prior to calving. A total of 19,191 behavior records were obtained, and a non-local neural network was trained and validated on video clips of each behavior. This study showed that the non-local network correctly classified the seven behaviors 80% or more of the time on the validation dataset. In particular, birth contractions were correctly detected 83% of the time, which can in itself serve as an early-warning calving alert, as all cows start contractions several hours prior to giving birth. This approach to behavior recognition using video cameras can assist livestock management.
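The non-local network named above builds on the non-local (self-attention) operation, in which every spatio-temporal position in a clip attends to every other position, rather than only a local convolutional window. The abstract gives no implementation details, so the following is a minimal numpy sketch of an embedded-Gaussian non-local block on toy features; the array sizes, weight shapes, and names are all hypothetical.

```python
import numpy as np

def softmax(a, axis=-1):
    """Numerically stable softmax."""
    e = np.exp(a - a.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def non_local_block(x, w_theta, w_phi, w_g, w_z):
    """Embedded-Gaussian non-local operation: each position's
    response is a similarity-weighted sum over ALL positions,
    added back to the input via a residual connection."""
    theta, phi, g = x @ w_theta, x @ w_phi, x @ w_g
    attn = softmax(theta @ phi.T)    # pairwise similarity, rows sum to 1
    y = attn @ g                     # aggregate features from every position
    return x + y @ w_z               # project back and add residual

# toy input: 6 positions (e.g. a flattened T*H*W grid), 8 channels,
# with a 4-channel embedding inside the block
rng = np.random.default_rng(1)
n, c, c_emb = 6, 8, 4
x = rng.normal(size=(n, c))
w_theta, w_phi, w_g = (rng.normal(scale=0.1, size=(c, c_emb)) for _ in range(3))
w_z = rng.normal(scale=0.1, size=(c_emb, c))
z = non_local_block(x, w_theta, w_phi, w_g, w_z)
```

Because the attention spans the whole clip, such a block can relate an early frame to a much later one, which is plausibly useful for behaviors like contractions that unfold over time.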