Videos of animal behavior are used to quantify researcher-defined behaviors of interest to study neural function, gene mutations, and pharmacological therapies. Behaviors of interest are often scored manually, which is time-consuming, limited to few behaviors, and variable across researchers. We created DeepEthogram: software that uses supervised machine learning to convert raw video pixels into an ethogram, the behaviors of interest present in each video frame. DeepEthogram is designed to be general-purpose and applicable across species, behaviors, and video-recording hardware. It uses convolutional neural networks to compute motion, extract features from motion and images, and classify features into behaviors. Behaviors are classified with above 90% accuracy on single frames in videos of mice and flies, matching expert-level human performance. DeepEthogram accurately predicts rare behaviors, requires little training data, and generalizes across subjects. A graphical interface allows beginning-to-end analysis without end-user programming. DeepEthogram's rapid, automatic, and reproducible labeling of researcher-defined behaviors of interest may accelerate and enhance supervised behavior analysis.
Researchers commonly acquire videos of animal behavior and quantify the prevalence of behaviors of interest to study nervous system function, the effects of gene mutations, and the efficacy of pharmacological therapies. This analysis is typically performed manually and is therefore immensely time consuming, often limited to a small number of behaviors, and variable across researchers. Here, we created DeepEthogram: software that takes raw pixel values of videos as input and uses machine learning to output an ethogram, the set of user-defined behaviors of interest present in each frame of a video. We used convolutional neural network models that compute motion in a video, extract features from motion and single frames, and classify these features into behaviors. These models classified behaviors with greater than 90% accuracy on single frames in videos of flies and mice, matching expert-level human performance. The models accurately predicted even extremely rare behaviors, required little training data, and generalized to new videos and subjects. DeepEthogram runs rapidly on common scientific computer hardware and has a graphical user interface that does not require programming by the end-user. We anticipate DeepEthogram will enable the rapid, automated, and reproducible assignment of behavior labels to every frame of a video, thus accelerating studies that quantify behaviors of interest. Code is available at: https://github.com/jbohnslav/deepethogram
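The abstract describes a pipeline whose final stage assigns, to every video frame, a probability for each user-defined behavior; because behaviors can co-occur, this is a multi-label problem, and per-frame probabilities are turned into an ethogram by thresholding each behavior independently. The sketch below illustrates only that last thresholding step on toy data; the function names and the 0.5 threshold are illustrative assumptions, not DeepEthogram's actual API.

```python
import numpy as np

def probabilities_to_ethogram(probs, threshold=0.5):
    """Threshold per-frame behavior probabilities into a boolean ethogram.

    probs: (n_frames, n_behaviors) array of classifier outputs in [0, 1].
    Each behavior is thresholded independently, so multiple behaviors
    may be marked present on the same frame (multi-label).
    """
    return np.asarray(probs, dtype=float) >= threshold

def frames_with_behavior(ethogram, behaviors, name):
    """Return the frame indices on which the named behavior is present."""
    return np.flatnonzero(ethogram[:, behaviors.index(name)])

# Toy example: 4 frames scored for two hypothetical behaviors.
probs = [[0.9, 0.1],
         [0.8, 0.6],
         [0.2, 0.7],
         [0.1, 0.2]]
eth = probabilities_to_ethogram(probs)
walk_frames = frames_with_behavior(eth, ["groom", "walk"], "walk")
```

In this toy case, "walk" exceeds the threshold on frames 1 and 2 only, while frame 1 also carries "groom", showing why the output is a per-frame set of behaviors rather than a single label.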
The lack of sensitive and robust behavioral assessments of pain in preclinical models has been a major limitation for both pain research and the development of novel analgesics. Here, we demonstrate a novel data acquisition and analysis platform that provides automated, quantitative, and objective measures of naturalistic rodent behavior in an observer-independent and unbiased fashion. The technology records freely behaving mice, in the dark, over extended periods for continuous acquisition of 2 parallel video data streams: (1) near-infrared frustrated total internal reflection for detecting the degree, force, and timing of surface contact and (2) simultaneous ongoing videography of whole-body pose. Using machine vision and machine learning, we automatically extract and quantify behavioral features from these data to reveal moment-by-moment changes that capture the internal pain state of rodents in multiple pain models. We show that these voluntary pain-related behaviors are reversible by analgesics and that analgesia can be automatically and objectively differentiated from sedation. Finally, we used this approach to generate a paw luminance ratio measure that is sensitive in capturing dynamic mechanical hypersensitivity over time and scalable for high-throughput preclinical analgesic efficacy assessment.
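The "paw luminance ratio" described above derives from the frustrated total internal reflection (FTIR) stream, where the brightness of a paw's footprint scales with surface contact: a mouse guarding an injured paw presses it down less, lowering its luminance relative to the unaffected paw. The sketch below shows one plausible way such a ratio could be computed from per-frame luminance traces; the function name, the affected/unaffected framing, and the exact formula are assumptions for illustration, not the platform's published definition.

```python
import numpy as np

def paw_luminance_ratio(affected, unaffected, eps=1e-9):
    """Illustrative contact-luminance ratio between two hind paws.

    affected, unaffected: per-frame FTIR luminance values for each paw.
    Returns the ratio of mean luminances; values below 1 would indicate
    reduced weight-bearing (guarding) on the affected paw. `eps` avoids
    division by zero when the unaffected paw makes no contact.
    """
    affected = np.asarray(affected, dtype=float)
    unaffected = np.asarray(unaffected, dtype=float)
    return affected.mean() / (unaffected.mean() + eps)

# Toy traces: the affected paw contacts the floor with half the luminance.
ratio = paw_luminance_ratio([10.0, 12.0], [20.0, 24.0])
```

Because the measure is a within-animal ratio, it is insensitive to overall illumination or camera gain, which is one reason a ratio (rather than raw luminance) suits longitudinal and high-throughput comparisons.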
scite is a Brooklyn-based organization that helps researchers better discover and understand research articles through Smart Citations: citations that display the context of the citation and describe whether the article provides supporting or contrasting evidence. scite is used by students and researchers from around the world and is funded in part by the National Science Foundation and the National Institute on Drug Abuse of the National Institutes of Health.
Copyright © 2024 scite LLC. All rights reserved.