Thaler L, Milne JL, Arnott SR, Kish D, Goodale MA. Neural correlates of motion processing through echolocation, source hearing, and vision in blind echolocation experts and sighted echolocation novices. J Neurophysiol 111: 112-127, 2014. First published October 16, 2013; doi:10.1152/jn.00501.2013.

We have shown in previous research (Thaler L, Arnott SR, Goodale MA. PLoS One 6: e20162, 2011) that motion processing through echolocation activates temporal-occipital cortex in blind echolocation experts. Here we investigated how the neural substrates of echo-motion relate to the neural substrates of auditory source-motion and visual motion. Three blind echolocation experts and twelve sighted echolocation novices underwent functional MRI scanning while they listened to binaural recordings of moving or stationary echolocation or auditory source sounds located in either left or right space. Sighted participants' brain activity was also measured while they viewed moving or stationary visual stimuli. For each of the three modalities separately (echo, source, vision), we identified motion-sensitive areas in temporal-occipital cortex and in the planum temporale. We then used a region-of-interest (ROI) analysis to investigate cross-modal responses as well as laterality effects. In both sighted novices and blind experts, we found that temporal-occipital source-motion ROIs did not respond to echo-motion, and echo-motion ROIs did not respond to source-motion. This double dissociation was absent in planum temporale ROIs. Furthermore, temporal-occipital echo-motion ROIs in blind, but not sighted, participants showed evidence of a contralateral motion preference. Temporal-occipital source-motion ROIs showed no evidence of a contralateral preference in either blind or sighted participants. Our data suggest a functional segregation of the processing of auditory source-motion and echo-motion in human temporal-occipital cortex.
Furthermore, the data suggest that the echo-motion response in blind experts may represent a reorganization, rather than an exaggeration, of the response observed in sighted novices. It is possible that this reorganization involves the recruitment of "visual" cortical areas.

Keywords: fMRI; human; cortex; neuroplasticity; audition

Some people, just like certain bats and marine mammals, can echolocate by making mouth clicks and listening to the returning echoes (Schenkman and Nilsson 2010; Stoffregen and Pittenger 1995; Teng and Whitney 2011). Echolocation can be learned by both blind and sighted people with normal hearing (Ammons et al. 1953; Teng and Whitney 2011; Worchel and Mauney 1951). Previous behavioral research has shown that people are sensitive to echo-motion (Rosenblum et al. 2000; Thaler et al. 2011), and in a previous functional magnetic resonance imaging (fMRI) study, we found that echoes from moving, compared with stationary, surfaces elicited an increase in activation in temporal-occipital cortex in blind echolocation experts (Thaler et al. 2011). Human temporal-occipital cortex harbors visual-motion area MT+, as...
(2013) 'Shape-specific activation of occipital cortex in an early blind echolocation expert.' Neuropsychologia, 51 (5), pp. 938-949. https://doi.org/10.1016/j.neuropsychologia.2013.01.024

Abstract: We have previously reported that an early-blind echolocating individual (EB) showed robust occipital activation when he identified distant, silent objects based on echoes from his tongue clicks (Thaler, Arnott & Goodale, 2011). In the present study we investigated the extent to which echolocation activation in EB's occipital cortex reflected general echolocation processing per se versus feature-specific processing.
In the first experiment, echolocation audio sessions were captured with in-ear microphones in an anechoic chamber or hallway alcove as EB produced tongue clicks in front of a concave or flat object covered in aluminum foil or a cotton towel. All eight echolocation sessions (2 shapes × 2 surface materials × 2 environments) were then randomly presented to him during a sparse-temporal scanning fMRI session. […] (Thaler et al., 2011). Specifically, when lying in a magnetic resonance imaging (MRI) machine and listening to binaural in-ear audio recordings of their pre-recorded echolocation mouth-click sessions that included echo information, both participants were not only able to identify the silent objects present in the recordings, but their corresponding blood-oxygen-level-dependent (BOLD) activity was also found to increase in auditory and occipital cortices. Most impressively, when this brain activity was contrasted with that related to listening to the same sounds but with the very faint echoes removed, activity in occipital but not auditory cortex remained. The results indicated that the processing of the echo information was being carried out in occipital cortex. In the present study we wished to further explore EB's echo-related brain activity […] Huttenlocher & de Courten...
Echolocation is the ability to use sound echoes to infer spatial information about the environment. Some blind people have developed extraordinary proficiency in echolocation using mouth clicks. The first step of human biosonar is the transmission (mouth click) and subsequent reception of the resultant sound through the ear. Existing head-related transfer function (HRTF) databases provide descriptions of reception of the resultant sound. For the current report, we collected a large database of click emissions from three blind people expertly trained in echolocation, which allowed us to perform unprecedented analyses. Specifically, the current report provides the first-ever description of the spatial distribution (i.e. beam pattern) of human expert echolocation transmissions, as well as spectro-temporal descriptions at a level of detail not available before. Our data show that transmission levels are fairly constant within a 60° cone emanating from the mouth, but levels drop gradually at further angles, more so than for speech. In terms of spectro-temporal features, our data show that emissions are consistently very brief (~3 ms duration) with peak frequencies of 2-4 kHz, but with energy also at 10 kHz. This differs from previous reports of durations of 3-15 ms and peak frequencies of 2-8 kHz, which were based on less detailed measurements. Based on our measurements we propose to model transmissions as a sum of monotones modulated by a decaying exponential, with angular attenuation by a modified cardioid. We provide model parameters for each echolocator. These results are a step towards developing computational models of human biosonar. For example, in bats, spatial and spectro-temporal features of emissions have been used to derive and test model-based hypotheses about behaviour. The data we present here suggest similar research opportunities within the context of human echolocation.
Relatedly, the data are a basis for developing synthetic models of human echolocation that could be virtual (i.e. simulated) or real (i.e. loudspeakers and microphones), and which will help in understanding the link between physical principles and human behaviour.
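As a rough illustration, the proposed emission model (a sum of monotones under a decaying exponential envelope, with modified-cardioid angular attenuation) can be sketched as follows. The frequency, amplitude, decay, and directivity values below are illustrative assumptions chosen to match the qualitative description above, not the fitted per-echolocator parameters reported in the study.

```python
import numpy as np

def click_waveform(t, freqs_hz, amps, tau_s):
    """Synthesize a click as a sum of monotones (sine tones) multiplied
    by a decaying exponential envelope exp(-t / tau_s)."""
    tones = sum(a * np.sin(2 * np.pi * f * t) for f, a in zip(freqs_hz, amps))
    return tones * np.exp(-t / tau_s)

def cardioid_gain(theta_rad, a=0.5):
    """Modified-cardioid angular attenuation: unity on-axis (theta = 0),
    rolling off gradually at larger angles. 'a' controls directivity
    (a = 0.5 gives a standard cardioid shape; value is illustrative)."""
    return (1 - a) + a * np.cos(theta_rad)

fs = 96_000                           # sample rate in Hz (assumed)
t = np.arange(int(0.003 * fs)) / fs   # ~3 ms click duration, as reported

# Peak energy at 2-4 kHz plus a weaker component at 10 kHz
click = click_waveform(t, freqs_hz=[3_000, 10_000], amps=[1.0, 0.3],
                       tau_s=0.001)

# Emission heard on-axis vs. 60 degrees off-axis: the off-axis copy
# is attenuated by the cardioid gain, mimicking the measured beam pattern.
on_axis = click * cardioid_gain(0.0)
off_axis = click * cardioid_gain(np.deg2rad(60))
```

Per-echolocator parameters from the report could be substituted for the placeholder frequencies, amplitudes, and decay constant to produce individualized synthetic emissions for virtual-echolocation experiments.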
People use sensory, in particular visual, information to guide actions such as walking around obstacles, grasping, or reaching. However, it is presently unclear how malleable the sensorimotor system is. The present study investigated this by measuring how click-based echolocation may be used to avoid obstacles while walking. We tested 7 blind echolocation experts, 14 sighted echolocation beginners, and 10 blind echolocation beginners. For comparison, we also tested 10 sighted participants, who used vision. To maximize the relevance of our research for people with vision impairments, we also included a condition where the long cane was used, and we considered obstacles at different elevations. Motion capture and sound data were acquired simultaneously. We found that echolocation experts walked just as fast as sighted participants using vision, and faster than either sighted or blind echolocation beginners. Walking paths of echolocation experts indicated early and smooth adjustments, similar to those shown by sighted people using vision and different from the later and more abrupt adjustments of beginners. Further, for all participants, the use of echolocation significantly decreased collision frequency with obstacles at head, but not ground, level. Further analyses showed that participants who made clicks with higher spectral frequency content walked faster, and that for experts higher clicking rates were associated with faster walking. The results highlight that people can use novel sensory information (here, echolocation) to guide actions, demonstrating the action system's ability to adapt to changes in sensory input. They also highlight that regular use of echolocation enhances sensory-motor coordination for walking in blind people.