We investigated how landmarks influence the brain’s computation of head direction and found that in a bi-directionally symmetrical environment, some neurons in dysgranular retrosplenial cortex showed bi-directional firing patterns. This indicates that neural activity can be dominated by local environmental cues even when these conflict with the global head direction signal. It suggests a mechanism for associating landmarks with, or dissociating them from, the head direction signal according to their directional stability and utility.
Entorhinal grid cells integrate sensory and self-motion inputs to provide a spatial metric of a characteristic scale. One function of this metric may be to help localize the firing fields of hippocampal place cells during formation and use of the hippocampal spatial representation (“cognitive map”). Of theoretical importance is the question of how this metric, and the resulting map, is configured in 3D space. We find here that when the body plane is vertical as rats climb a wall, grid cells produce stable, almost-circular grid-cell firing fields. This contrasts with previous findings when the body was aligned horizontally during vertical exploration, suggesting a role for the body plane in orienting the plane of the grid cell map. However, in the present experiment, the fields on the wall were fewer and larger, suggesting an altered or absent odometric (distance-measuring) process. Several physiological indices of running speed in the entorhinal cortex showed reduced gain, which may explain the enlarged grid pattern. Hippocampal place fields were found to be sparser but unchanged in size/shape. Together, these observations suggest that the orientation and scale of the grid cell map, at least on a surface, are determined by an interaction between egocentric information (the body plane) and allocentric information (the gravity axis). This may be mediated by the different sensory or locomotor information available on a vertical surface and means that the resulting map has different properties on a vertical plane than a horizontal plane (i.e., is anisotropic).
Neural encoding of navigable space involves a network of structures centered on the hippocampus, whose neurons (place cells) encode current location. Input to the place cells includes afferents from the entorhinal cortex, which contains grid cells. These are neurons expressing spatially localized activity patches, or firing fields, that are evenly spaced across the floor in a hexagonal close-packed array called a grid. It is thought that grids function to enable the calculation of distances. The question arises as to whether this odometry process operates in three dimensions, and so we queried whether grids permeate three-dimensional (3D) space, that is, form a lattice, or whether they simply follow the environment surface. If grids form a 3D lattice, then this lattice would ordinarily be aligned horizontally (to explain the usual hexagonal pattern observed). A tilted floor would transect several layers of this putative lattice, resulting in interruption of the hexagonal pattern. We model this prediction with simulated grid lattices and show that the firing of a grid cell on a 40°-tilted surface should cover proportionally less of the surface, with smaller field size, fewer fields, and reduced hexagonal symmetry. However, recordings of real grid cells as animals foraged on a 40°-tilted surface found that their firing was almost indistinguishable, in pattern or rate, from that on the horizontal surface, with, if anything, increased coverage and field number, and preserved field size. It thus appears unlikely that the sloping surface transected a lattice. However, grid cells on the slope displayed slightly degraded firing patterns, with reduced coherence and slightly reduced symmetry.
These findings collectively suggest that the grid cell component of the metric representation of space is not fixed in absolute 3D space but is influenced both by the surface the animal is on and by the relationship of this surface to the horizontal, supporting the hypothesis that the neural map of space is “multi-planar” rather than fully volumetric.
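The lattice-transection prediction tested above is purely geometric, and its intuition can be illustrated with a toy simulation (a hypothetical sketch, not the authors' actual model): if spherical firing fields sit at the nodes of horizontally stacked hexagonal layers, a horizontal plane through one layer slices every field in that layer at its equator, while a 40°-tilted plane cuts fields at varying offsets, yielding less covered surface and smaller field cross-sections.

```python
import numpy as np

def hcp_like_lattice(spacing=1.0, n=4):
    """Field centres on stacked hexagonal layers (a simplified 3D grid lattice)."""
    centers = []
    dz = spacing * np.sqrt(2.0 / 3.0)  # inter-layer distance for close packing
    for k in range(-n, n + 1):
        # alternate layers are offset (simplified ABAB stacking)
        off = (spacing / 2.0, spacing * np.sqrt(3) / 6.0) if k % 2 else (0.0, 0.0)
        for i in range(-n, n + 1):
            for j in range(-n, n + 1):
                x = (i + 0.5 * j) * spacing + off[0]
                y = j * spacing * np.sqrt(3) / 2.0 + off[1]
                centers.append((x, y, k * dz))
    return np.array(centers)

def plane_coverage(centers, tilt_deg, field_radius=0.3, extent=3.0, res=64):
    """Fraction of a square patch of a plane lying inside any spherical field.

    The plane passes through the origin (a lattice layer) and is tilted
    about the x-axis by tilt_deg.
    """
    t = np.radians(tilt_deg)
    u = np.linspace(-extent, extent, res)
    U, V = np.meshgrid(u, u)
    # 3D coordinates of sample points on the (possibly tilted) plane
    pts = np.stack([U, V * np.cos(t), V * np.sin(t)], axis=-1).reshape(-1, 3)
    # distance from every sample point to every field centre
    d = np.linalg.norm(pts[:, None, :] - centers[None, :, :], axis=-1)
    return float(np.mean(d.min(axis=1) < field_radius))

centers = hcp_like_lattice()
flat = plane_coverage(centers, 0.0)     # horizontal floor through a layer
tilted = plane_coverage(centers, 40.0)  # 40°-tilted surface
print(f"coverage flat: {flat:.3f}, tilted 40 deg: {tilted:.3f}")
```

With these (arbitrary) parameters the tilted plane covers roughly half the area the horizontal plane does, consistent with the modeled prediction of reduced coverage on a slope; the empirical result reported above is what makes the lattice hypothesis unlikely.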
How the brain represents large-scale, navigable space has been the topic of intensive investigation for several decades, resulting in the discovery that neurons in a complex network of cortical and subcortical brain regions co-operatively encode distance, direction, place, movement, and other variables using a variety of different sensory inputs. However, such studies have mainly been conducted in simple laboratory settings in which animals explore small, two-dimensional (i.e., flat) arenas. The real world, by contrast, is complex and three-dimensional, with hills, valleys, tunnels, branches, and, for species that can swim or fly, large volumetric spaces. Adding a dimension to space adds coding challenges, primarily because several basic geometric properties differ in three dimensions. This article will explore the consequences of these challenges for the establishment of a functional three-dimensional metric map of space, one of which is that the brains of some species might have evolved to reduce the dimensionality of the representational space and thus sidestep some of these problems.
The regular firing pattern exhibited by medial entorhinal (mEC) grid cells of locomoting rodents is hypothesized to provide spatial metric information relevant for navigation. The development of virtual reality (VR) for head-fixed mice confers a number of experimental advantages and has become increasingly popular as a method for investigating spatially selective cells. Recent experiments using 1D VR linear tracks have shown that some mEC cells have multiple fields in virtual space, analogous to grid cells on real linear tracks. We recorded from the mEC as mice traversed virtual tracks featuring regularly spaced repetitive cues and identified a population of cells with multiple firing fields, resembling the regular firing of grid cells. However, further analyses indicated that many of these were not, in fact, grid cells because: (1) when recorded in the open field they did not display discrete firing fields with six-fold symmetry; and (2) in different VR environments their firing fields were found to match the spatial frequency of repetitive environmental cues. In contrast, cells identified as grid cells based on their open field firing patterns did not exhibit cue locking. In light of these results, we highlight the importance of controlling the periodicity of the visual cues in VR and the necessity of identifying grid cells from real open field environments in order to correctly characterize spatially modulated neurons in VR experiments.