Sound localization plays a critical role in animal survival. Three cues can be used to compute sound direction: interaural timing differences (ITDs), interaural level differences (ILDs), and the direction-dependent spectral filtering by the head and pinnae (spectral cues). Little is known about how spectral cues contribute to the neural encoding of auditory space. Here we report on auditory space encoding in the mouse superior colliculus (SC). We show that the mouse SC contains neurons with spatially restricted receptive fields (RFs) that form an azimuthal topographic map. We found that frontal RFs require spectral cues and lateral RFs require ILDs. The neurons with frontal RFs have frequency tuning that matches the spectral structure of the specific head and pinna filter for sound coming from the front. These results demonstrate that patterned spectral cues, in combination with ILDs, give rise to the topographic map of azimuthal auditory space.
Sound localization plays a critical role in animal survival. To compute the incident sound direction, an animal can use three cues: interaural timing differences (ITDs), interaural level differences (ILDs), and the direction-dependent spectral filtering of the sound by the head and pinnae (spectral cues). Compared to ITDs and ILDs, little is known about how spectral cues contribute to the neural encoding of auditory space. Here we report on auditory space encoding in the mouse superior colliculus (SC). We show that the mouse SC contains neurons with spatially restricted receptive fields (RFs) that form a topographic map of azimuthal auditory space. By eliminating each sound localization cue from the stimuli, we found that nasal RFs require spectral cues and temporal RFs require ILDs; the lack of either cue therefore disrupts the azimuthal topographic map. These results demonstrate an unexpected role of spectral cues in azimuthal sound localization.
Sensory information from different modalities is processed in parallel and then integrated in associative brain areas to improve object identification and the interpretation of sensory experiences. The superior colliculus (SC) is a midbrain structure that plays a critical role in integrating visual, auditory, and somatosensory input to assess saliency and promote action. Although the responses of individual SC neurons to visuoauditory stimuli have been characterized, little is known about the spatial and temporal dynamics of this integration at the population level. Here we recorded the responses of SC neurons to spatially restricted visual and auditory stimuli using large-scale electrophysiology. We then created a general, population-level model that explains the spatial, temporal, and intensity requirements of stimuli needed for sensory integration. We found that the mouse SC contains topographically organized visual and auditory neurons that exhibit nonlinear multisensory integration. We show that nonlinear integration depends on properties of the auditory, but not the visual, stimuli. We also find that a heuristically derived nonlinear modulation function reveals conditions required for sensory integration that are consistent with previously proposed models, such as spatial matching and the principle of inverse effectiveness.
A topographic map of auditory space is a feature found in the superior colliculus (SC) of many species, including CBA/CaJ mice. In this genetic background, high-frequency monaural spectral cues and interaural level differences (ILDs) are used to compute spatial receptive fields (RFs) that form a topographic map along the azimuth. Unfortunately, C57BL/6 mice, a strain widely used for transgenic manipulation, display age-related hearing loss (AHL) because of an inbred mutation in the Cadherin 23 gene (Cdh23) that affects hair cell mechanotransduction. To overcome this problem, researchers have used young C57BL/6 mice in their studies, as these have been shown to have normal hearing thresholds. However, important details of the auditory response characteristics of the SC, such as spectral responses and spatial localization, have not been characterized in young C57BL/6 mice. Here, we show that two- to four-month-old C57BL/6 mice lack neurons with frontal auditory RFs and therefore lack a topographic representation of auditory space in the SC. Analysis of the spectrotemporal RFs (STRFs) of SC auditory neurons shows that C57BL/6 mouse SC neurons lack the ability to detect the high-frequency (>40 kHz) spectral cues that are needed to compute frontal RFs. We also show that crossing C57BL/6 mice with CBA/CaJ mice, or introducing one copy of the wild-type Cdh23 to C57BL/6 mice, rescues the high-frequency hearing deficit and improves the topographic map of auditory space. Taken together, these results demonstrate the importance of high-frequency hearing in computing a topographic map of auditory space.