Directional properties of the sound transformation at the ear of four intact echolocating bats, Eptesicus fuscus, were investigated via measurements of the head-related transfer function (HRTF). Contributions of external ear structures to directional features of the transfer functions were examined by remeasuring the HRTF in the absence of the pinna and tragus. The investigation focused mainly on the interactions between spatial and spectral features in the bat HRTF. The pinna provides gain and shapes these features over a large frequency band (20–90 kHz), and the tragus contributes gain and directionality at high frequencies (60–90 kHz). Analysis of the spatial and spectral characteristics of the bat HRTF reveals that both interaural level differences (ILD) and monaural spectral features vary with sound source azimuth and elevation. Consequently, localization cues for the horizontal and vertical components of sound source location interact. The availability of multiple cues about sound source azimuth and elevation should enhance the information supporting reliable sound localization. These findings stress the importance of the acoustic information received at the two ears for localizing sonar targets in both azimuth and elevation.
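The core ILD computation described in this abstract can be sketched compactly: given the complex HRTF spectra measured at the two ears for one source direction, the ILD per frequency bin is the difference of the magnitude spectra in dB. The sketch below is a minimal illustration, not the authors' measurement pipeline; the function name, the synthetic spectra, and the 20–90 kHz band are assumptions for demonstration.

```python
import numpy as np

def interaural_level_difference(hrtf_left, hrtf_right, freqs, band=(20e3, 90e3)):
    """ILD in dB per frequency bin within a band, given complex HRTF
    spectra measured at a single source direction for each ear."""
    mask = (freqs >= band[0]) & (freqs <= band[1])
    left_db = 20 * np.log10(np.abs(hrtf_left[mask]) + 1e-12)
    right_db = 20 * np.log10(np.abs(hrtf_right[mask]) + 1e-12)
    return left_db - right_db

# Synthetic stand-ins for measured HRTFs (illustrative only)
freqs = np.linspace(0, 100e3, 1024)
rng = np.random.default_rng(0)
h_left = rng.standard_normal(1024) + 1j * rng.standard_normal(1024)
h_right = rng.standard_normal(1024) + 1j * rng.standard_normal(1024)
ild = interaural_level_difference(h_left, h_right, freqs)
print(f"mean ILD in the 20-90 kHz band: {ild.mean():.1f} dB")
```

In the study, this quantity would be evaluated across a grid of azimuths and elevations to map how ILD covaries with both spatial dimensions.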
Head and eye movements incessantly modulate the luminance signals impinging on the retina during natural intersaccadic fixation. Yet little is known about how these fixational movements influence the statistics of retinal stimulation. Here, we provide the first detailed characterization of the visual input to the human retina during normal head-free fixation. We used high-resolution recordings of head and eye movements in a natural viewing task to examine how they jointly transform spatial information into temporal modulations. In agreement with previous studies, we report that both the head and the eyes move considerably during fixation. However, we show that fixational head and eye movements mostly compensate for each other, yielding a spatiotemporal redistribution of the input power to the retina similar to that previously observed under head immobilization. The resulting retinal image motion counterbalances the spectral distribution of natural scenes, producing temporal modulations that are equalized in power over a broad range of spatial frequencies. These findings support the proposal that "ocular drift," the smooth fixational motion of the eye, is under motor control, and indicate that the spatiotemporal reformatting caused by fixational behavior is an important computational element in the encoding of visual information.
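The kind of analysis this abstract describes, converting spatial structure into temporal modulations via gaze motion, can be illustrated with a short sketch: sample an image's luminance along a recorded gaze trajectory and estimate the temporal power spectrum of the resulting signal. Everything below is a stand-in, not the authors' pipeline: the image is a synthetic correlated random field, the trajectory is a Brownian surrogate for measured drift, and the 1 kHz sampling rate is assumed.

```python
import numpy as np
from scipy.signal import welch

def retinal_temporal_psd(image, gaze_xy, fs):
    """Sample image luminance along a gaze trajectory (pixel coordinates)
    and estimate the temporal power spectrum of the modulation it creates."""
    rows = np.clip(np.round(gaze_xy[:, 1]).astype(int), 0, image.shape[0] - 1)
    cols = np.clip(np.round(gaze_xy[:, 0]).astype(int), 0, image.shape[1] - 1)
    trace = image[rows, cols].astype(float)
    trace -= trace.mean()
    return welch(trace, fs=fs, nperseg=min(256, len(trace)))

# Synthetic example: spatially correlated field, Brownian gaze walk at 1 kHz
rng = np.random.default_rng(1)
image = rng.standard_normal((512, 512)).cumsum(0).cumsum(1)
steps = rng.standard_normal((2000, 2)) * 0.5   # small drift-like steps
gaze = 256 + np.cumsum(steps, axis=0)
f, pxx = retinal_temporal_psd(image, gaze, fs=1000.0)
```

Comparing such spectra for head-free versus head-fixed trajectories is the kind of test that would reveal the equalization of power across spatial frequencies reported above.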
Humans explore static visual scenes by alternating rapid eye movements (saccades) with periods of slow and incessant eye drifts [1–3]. These drifts are commonly believed to be the consequence of physiological limits in maintaining steady gaze, resulting in Brownian-like trajectories [4–7], which are almost independent in the two eyes [8–10]. However, because of the technical difficulty of recording minute eye movements, most knowledge of ocular drift comes from artificial laboratory conditions, in which the head of the observer is strictly immobilized. Little is known about eye drift during natural head-free fixation, when microscopic head movements are also continually present [11–13]. We have recently observed that the power spectrum of the visual input to the retina during ocular drift is largely unaffected by fixational head movements [14]. Here we elucidate the mechanism responsible for this invariance. We show that, contrary to common assumption, ocular drift does not move the eyes randomly, but compensates for microscopic head movements, thereby yielding highly correlated movements in the two eyes. This compensatory behavior is extremely fast, persists with one eye patched, and results in image motion trajectories that are only partially correlated on the two retinas. These findings challenge established views of how humans acquire visual information. They show that ocular drift is precisely controlled, as long speculated [15], and imply the existence of neural mechanisms that integrate minute multimodal signals.
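Two quantities at the heart of this finding lend themselves to a simple sketch: the compensation gain between head and eye velocity (a value near −1 means the eye cancels the head) and the correlation between the drift velocities of the two eyes. The code below is illustrative only, with synthetic velocity traces standing in for the recordings; the function names and the gain of −0.9 are assumptions.

```python
import numpy as np

def compensation_gain(head_vel, eye_vel):
    """Least-squares gain of eye velocity against head velocity;
    a value near -1 indicates the eye counteracts head movement."""
    return np.dot(head_vel, eye_vel) / np.dot(head_vel, head_vel)

def interocular_correlation(left_vel, right_vel):
    """Pearson correlation between drift velocities of the two eyes."""
    return np.corrcoef(left_vel, right_vel)[0, 1]

# Synthetic example: both eyes counter-rotate against a shared head signal
rng = np.random.default_rng(2)
head = rng.standard_normal(5000)
left = -0.9 * head + 0.3 * rng.standard_normal(5000)
right = -0.9 * head + 0.3 * rng.standard_normal(5000)
print(compensation_gain(head, left))         # ~ -0.9 (compensation)
print(interocular_correlation(left, right))  # high, since the head drive is shared
```

The toy example reproduces the logic of the result: a shared compensatory drive makes the two eyes' drifts correlated even though each also carries independent noise.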
Sound localization is known to be a complex phenomenon, combining multisensory information processing, experience-dependent plasticity and movement. Here we present a sensorimotor model that addresses the question of how an organism could learn to localize sound sources without any a priori neural representation of its head-related transfer function (HRTF) or prior experience with auditory spatial information. We demonstrate quantitatively that experiencing the sensory consequences of its voluntary motor actions allows an organism to learn the spatial location of any sound source. Using examples from humans and echolocating bats, our model shows that a naive organism can learn auditory space based solely on acoustic inputs and their relation to motor states.
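To make the sensorimotor idea concrete, here is a deliberately simplistic sketch: an agent stores the acoustic feature vector it hears at each motor state (head azimuth) it visits, then localizes a new sound by nearest-neighbor lookup in that experience. This is a toy caricature of the learning principle, not the paper's model; the class, the feature encoding, and the azimuth grid are all hypothetical.

```python
import numpy as np

class SensorimotorLocalizer:
    """Toy sketch: associate acoustic features with the motor states at
    which they were experienced; localize by nearest-neighbor lookup."""
    def __init__(self):
        self.features, self.motor_states = [], []

    def experience(self, feature, azimuth):
        self.features.append(np.asarray(feature, float))
        self.motor_states.append(float(azimuth))

    def localize(self, feature):
        d = [np.linalg.norm(f - np.asarray(feature, float)) for f in self.features]
        return self.motor_states[int(np.argmin(d))]

# The agent "turns its head and listens"; [sin, cos] stands in for binaural cues
loc = SensorimotorLocalizer()
for az in np.linspace(-90, 90, 37):
    loc.experience([np.sin(np.radians(az)), np.cos(np.radians(az))], az)
print(loc.localize([np.sin(np.radians(30)), np.cos(np.radians(30))]))  # ~30
```

The point of the sketch is that no HRTF representation is built in: spatial knowledge emerges purely from the pairing of acoustic inputs with self-generated motor states.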
This study examined behavioral strategies for texture discrimination by echolocation in free-flying bats. Big brown bats, Eptesicus fuscus, were trained to discriminate a smooth 16 mm diameter object (S+) from a size-matched textured object (S−), both of which were tethered in random locations in a flight room. The bat's three-dimensional flight path was reconstructed using stereo images from high-speed video recordings, and the bat's sonar vocalizations were recorded for each trial and analyzed offline. A microphone array permitted reconstruction of the sonar beam pattern, allowing us to study the bat's directional gaze and inspection of the objects. Bats learned the discrimination, but performance varied with S−. In acoustic studies of the objects, the S+ and S− stimuli were ensonified with frequency-modulated sonar pulses. Mean intensity differences between S+ and S− were within 4 dB. Performance data, combined with analyses of echo recordings, suggest that the big brown bat listens to changes in sound spectra from echo to echo to discriminate between objects. Bats adapted their sonar calls as they inspected the stimuli, and their sonar behavior resembled that of animals foraging for insects. Analysis of sonar beam-directing behavior in certain trials clearly showed that the bat sequentially inspected S+ and S−.
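The proposed cue, echo-to-echo spectral change, is easy to express as a metric: compute the magnitude spectrum of each successive echo in the sonar band and take the RMS dB difference between consecutive spectra; a textured target should produce larger changes than a smooth one as the ensonification angle shifts. The sketch below is a minimal illustration under assumed parameters (500 kHz sampling, a 20–100 kHz band, random noise standing in for recorded echoes), not the authors' analysis.

```python
import numpy as np

def echo_spectral_change(echoes, fs, band=(20e3, 100e3)):
    """RMS dB difference between magnitude spectra of successive echoes,
    restricted to the sonar band; larger values suggest a textured target."""
    n = max(len(e) for e in echoes)
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    mask = (freqs >= band[0]) & (freqs <= band[1])
    spectra = [20 * np.log10(np.abs(np.fft.rfft(e, n))[mask] + 1e-12)
               for e in echoes]
    return [float(np.sqrt(np.mean((b - a) ** 2)))
            for a, b in zip(spectra, spectra[1:])]

fs = 500e3                                   # assumed sampling rate
rng = np.random.default_rng(3)
echoes = [rng.standard_normal(256) for _ in range(4)]  # stand-ins for recordings
print(echo_spectral_change(echoes, fs))
```

Applied to real recordings, such a metric would let the S+ and S− echo trains be compared trial by trial against the bat's choices.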