Before the onset of locomotion, the hippocampus transitions into an activity state specialized for processing spatially related input. This brain-state transition is associated with increased firing rates of CA1 pyramidal neurons and the occurrence of theta oscillations, both of which correlate with locomotion velocity. However, the neural circuit by which locomotor activity is linked to hippocampal oscillations and neuronal firing rates is unresolved. Here we reveal a septo-hippocampal circuit mediated by glutamatergic (VGluT2+) neurons that is activated before locomotion onset and that controls the initiation and velocity of locomotion as well as the entrainment of theta oscillations. Moreover, via septo-hippocampal projections onto alveus/oriens interneurons, this circuit regulates feedforward inhibition of Schaffer collateral and perforant path input to CA1 pyramidal neurons in a locomotion-dependent manner. With higher locomotion speed, the increased activity of medial septal VGluT2 neurons is translated into increased axo-somatic depolarization and higher firing rates of CA1 pyramidal neurons.
Quantification and detection of the hierarchical organization of behavior is a major challenge in neuroscience. Recent advances in markerless pose estimation enable the visualization of high-dimensional spatiotemporal behavioral dynamics of animal motion. However, robust and reliable technical approaches are needed to uncover the underlying structure in these data and to segment behavior into discrete, hierarchically organized motifs. Here, we present an unsupervised probabilistic deep learning framework that identifies behavioral structure from deep variational embeddings of animal motion (VAME). Using a mouse model of beta amyloidosis as a use case, we show that VAME not only identifies discrete behavioral motifs but also captures a hierarchical representation of motif usage. The approach allows motifs to be grouped into communities and detects differences in community-specific motif usage between mouse cohorts that were undetectable by human visual observation. Thus, we present a robust approach for the segmentation of animal motion that is applicable to a wide range of experimental setups, models and conditions without requiring supervision or a priori human intervention.
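The grouping of motifs into communities can be illustrated with a simplified sketch: treat each motif's outgoing-transition profile as a feature vector and cluster those profiles agglomeratively. This is only a conceptual stand-in under assumed names and parameters (the function `motif_communities`, the single-linkage scheme, and the toy transition matrix are not VAME's actual community-detection procedure):

```python
import numpy as np

def motif_communities(transitions, n_communities=2):
    """Group motifs into communities by single-linkage agglomerative
    clustering of their outgoing-transition profiles.
    transitions: (n_motifs, n_motifs) row-normalized transition matrix."""
    n = transitions.shape[0]
    # cosine distance between transition profiles
    norms = np.linalg.norm(transitions, axis=1, keepdims=True)
    unit = transitions / np.clip(norms, 1e-12, None)
    dist = 1.0 - unit @ unit.T
    clusters = [{i} for i in range(n)]
    while len(clusters) > n_communities:
        # find and merge the closest pair of clusters (single linkage)
        best = (None, None, np.inf)
        for a in range(len(clusters)):
            for b in range(a + 1, len(clusters)):
                d = min(dist[i, j] for i in clusters[a] for j in clusters[b])
                if d < best[2]:
                    best = (a, b, d)
        a, b, _ = best
        clusters[a] |= clusters[b]
        del clusters[b]
    labels = np.empty(n, dtype=int)
    for c, members in enumerate(clusters):
        for i in members:
            labels[i] = c
    return labels
```

Motifs whose transition profiles are similar (e.g., two grooming-related motifs that tend to lead into the same successors) end up in the same community, which is the intuition behind community-level comparisons between cohorts.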
The medial septum and diagonal band of Broca (MSDB) send glutamatergic axons to the medial entorhinal cortex (MEC). We found that this pathway provides speed-correlated input to several MEC cell types in layer 2/3. The speed signal is integrated most effectively by pyramidal cells but also excites stellate cells and interneurons. Thus, the MSDB conveys speed information that can be used by MEC neurons for spatial representation of self-location.
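A "speed-correlated input" of this kind is commonly quantified as the Pearson correlation between binned running speed and a neuron's firing rate, often called a speed score. The sketch below is a generic illustration of that measure, not the paper's analysis pipeline; the function name `speed_score` is an assumption:

```python
import numpy as np

def speed_score(speed, firing_rate):
    """Pearson correlation between running speed and firing rate,
    computed over matched time bins (a common 'speed score')."""
    speed = np.asarray(speed, dtype=float)
    firing_rate = np.asarray(firing_rate, dtype=float)
    s = speed - speed.mean()
    f = firing_rate - firing_rate.mean()
    return float((s * f).sum() / np.sqrt((s ** 2).sum() * (f ** 2).sum()))
```

A cell whose rate rises linearly with running speed scores near +1; a cell suppressed by movement scores negative.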
Naturalistic behavior is highly complex and dynamic. Approaches aimed at understanding how neuronal ensembles generate behavior require robust behavioral quantification in order to correlate neural activity patterns with behavioral motifs. Here, we present Variational Animal Motion Embedding (VAME), a probabilistic machine learning framework for discovering the latent structure of animal behavior from an input time series obtained with markerless pose estimation tools. To demonstrate our framework, we perform unsupervised behavior phenotyping of APP/PS1 mice, an animal model of Alzheimer's disease. Using markerless pose estimates from open-field exploration as input, VAME uncovers the distribution of detailed and clearly segmented behavioral motifs. Moreover, we show that the recovered distribution of phenotype-specific motifs can be used to reliably distinguish between APP/PS1 and wildtype mice, while human experts fail to classify the phenotype based on the same video observations. We propose VAME as a versatile and robust tool for unsupervised quantification of behavior across organisms and experimental settings.

Keywords: Neuroscience · Behavior Quantification · Machine Learning · Variational Bayes · Manifold
* Equal contribution of last and second-last author.

Identifying Behavioral Structure from Deep Variational Embeddings of Animal Motion

Several computational approaches for unsupervised behavior quantification have been introduced (Berman, Choi, Bialek, & Shaevitz, 2014; Wiltschko et al., 2015; Batty et al., 2019). These methods advanced the field of unsupervised behavior quantification and established an increasing awareness of the need for objectivity. Most approaches operate on a dimensionality-reduced signal extracted directly from the tracking video.
The signal is then learned by a machine learning model in the time domain (Wiltschko et al., 2015; Batty et al., 2019) or frequency-time domain (Berman, 2018) and segmented into discrete blocks containing similar chunks of input data. Recently, pose estimation tools such as DeepLabCut (Mathis et al., 2018) and LEAP (T. D. Pereira et al., 2019) enabled efficient tracking of animal body parts via supervised deep learning. The robustness of deep neural networks allows these tools to be applied to pose estimation in many model systems, such as mice, zebrafish and flies, and supports strong generalization between datasets (Mathis et al., 2018). However, while such tools provide a continuous representation of animal body motion, the extraction of underlying discrete states as a basis for classification (Tinbergen, 1951) remains a challenge.
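The gap between continuous pose trajectories and discrete behavioral states can be sketched with a deliberately simple stand-in: embed each time point as a sliding window of poses and cluster the windows with k-means. VAME replaces this fixed windowed embedding with a recurrent variational autoencoder, so treat the following (including the name `segment_motifs` and all parameter values) as an illustrative assumption only:

```python
import numpy as np

def segment_motifs(pose, window=10, n_motifs=2, n_iter=50):
    """Toy segmentation of a pose time series into discrete motifs:
    embed each time point as a sliding window of poses, then cluster
    the windows with k-means.
    pose: (T, D) array of pose-estimate coordinates.
    Returns one motif label per window start (length T - window + 1)."""
    T = len(pose)
    X = np.stack([pose[t:t + window].ravel() for t in range(T - window + 1)])
    # deterministic init: pick centers spread evenly across the recording
    centers = X[np.linspace(0, len(X) - 1, n_motifs).astype(int)].copy()
    for _ in range(n_iter):
        # assign each window to its nearest center, then update centers
        d = ((X[:, None, :] - centers[None]) ** 2).sum(-1)
        labels = d.argmin(1)
        for k in range(n_motifs):
            if (labels == k).any():
                centers[k] = X[labels == k].mean(0)
    return labels
```

Even this crude embedding separates grossly different movement regimes; the point of the learned variational embedding is to recover far subtler motif boundaries than fixed windows and Euclidean distance can.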