Some blind people have developed a unique technique, called echolocation, to orient themselves in unfamiliar environments. By self-generating clicking noises with the tongue, echolocators gain knowledge about the external environment and can perceive detailed object features. Whether sighted individuals can also develop this extremely useful technique is not yet clear. To investigate this, here we test the ability of novice sighted participants to perform a depth echolocation task. Moreover, to evaluate whether the type of room (anechoic or reverberant) and the type of clicking sound (produced with the tongue or with the hands) influences the learning of this technique, we divided the sample into four groups: half of the participants produced the clicking sound with their tongue and the other half with their hands, while half performed the task in an anechoic chamber and the other half in a reverberant room. Participants stood in front of five bars, each of a different size and placed at one of five different distances. The dimensions of the bars ensured a constant subtended angle across the five distances considered. The task was to identify the correct distance of the bar. We found that, even by the second session, participants judged the depth of the bar at a rate greater than chance, and improvements in both precision and accuracy were observed across all experimental sessions. More interestingly, performance was significantly better in the reverberant room than in the anechoic chamber, whereas the type of clicking did not modulate the results. This suggests that the echolocation technique can also be learned by sighted individuals and that room reverberation can influence this learning process. More generally, this study shows that total loss of sight is not a prerequisite for echolocation skills, which suggests important potential implications for rehabilitation settings for persons with residual vision.
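To make the constant-subtended-angle constraint concrete, the following short Python sketch (with a hypothetical angle and hypothetical distances, not values taken from the study) shows how bar height must scale linearly with distance so that every bar covers the same angle from the listener's position.

    import math

    # Hypothetical values: a 15-degree subtended angle and five test distances (metres).
    subtended_angle_deg = 15.0
    distances_m = [0.5, 1.0, 1.5, 2.0, 2.5]

    # For a constant subtended angle, bar height must grow linearly with distance:
    # height = 2 * d * tan(angle / 2)
    for d in distances_m:
        height = 2 * d * math.tan(math.radians(subtended_angle_deg / 2))
        print(f"distance {d:.1f} m -> bar height {height:.2f} m")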
Vision loss has severe impacts on physical, social and emotional well-being. The education of blind children poses challenges, as many academic subjects (e.g., geometry, mathematics) are normally taught by relying heavily on vision. Touch-based assistive technologies are potential tools for providing graphical content to blind users, improving learning possibilities and social inclusion. Raised-line drawings are still the gold standard, but their stimuli cannot be reconfigured or adapted, and the blind person constantly requires assistance. Although much research concerns technological development, little work has addressed the assessment of programmable tactile graphics in educational and rehabilitative contexts. Here we designed, on programmable tactile displays, tests aimed at assessing spatial memory skills and shape recognition abilities. The tests involved a group of blind and a group of low-vision children and adolescents in a four-week longitudinal schedule. After establishing subject-specific difficulty levels, we observed a significant enhancement of performance across sessions for both groups. Learning effects were comparable to those in raised-paper control tests; however, our setup required minimal external assistance. Overall, our results demonstrate that programmable maps are an effective way to display graphical content in educational and rehabilitative contexts. They can be at least as effective as traditional paper tests while providing superior flexibility and versatility.
Visual information is paramount to space perception, and vision influences auditory space estimation. Many studies show that simultaneous visual and auditory cues improve the precision of the final multisensory estimate. However, the amount or temporal extent of visual information that is sufficient to influence auditory perception is still unknown. It is therefore interesting to know whether vision can improve auditory precision through a short-term observation of the environment preceding the audio task, and whether this influence is task-specific, environment-specific, or both. To test these issues, we investigated possible improvements of acoustic precision in sighted blindfolded participants in two audio tasks [minimum audible angle (MAA) and space bisection] and two acoustically different environments (a normal room and an anechoic room). With respect to a baseline of auditory precision, we found an improvement of precision in the space bisection task, but not in the MAA task, after observation of the normal room. No improvement was found when performing the same task in the anechoic chamber. In addition, no difference was found between a condition of short environment observation and a condition of full vision during the whole experimental session. Our results suggest that even short-term environmental observation can calibrate auditory spatial performance. They also suggest that echoes can be the cue that underpins visual calibration: echoes may mediate the transfer of information from the visual to the auditory system.
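As an illustration of how precision is commonly quantified in psychophysical tasks such as MAA and space bisection (a minimal sketch with made-up data, not the authors' analysis code), a cumulative Gaussian psychometric function can be fitted to the responses and its standard deviation taken as the just-noticeable difference (JND):

    import numpy as np
    from scipy.optimize import curve_fit
    from scipy.stats import norm

    def psychometric(x, mu, sigma):
        # Cumulative Gaussian: mu is the bias (PSE), sigma the precision (JND).
        return norm.cdf(x, loc=mu, scale=sigma)

    # Hypothetical data: stimulus offsets (deg) and proportion of one response category.
    offsets = np.array([-8.0, -4.0, -2.0, 0.0, 2.0, 4.0, 8.0])
    p_resp = np.array([0.05, 0.20, 0.35, 0.50, 0.70, 0.85, 0.95])

    (mu, sigma), _ = curve_fit(psychometric, offsets, p_resp, p0=[0.0, 2.0])
    print(f"bias (PSE) = {mu:.2f} deg, precision (JND) = {sigma:.2f} deg")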
The neural correlates of exploration and cognitive mapping in blindness remain elusive, and the role of visuo-spatial pathways in blind vs. sighted subjects is still under debate. In this preliminary study, as a possible estimate of activity in the visuo-spatial pathways, we investigate the EEG patterns of blind and blindfolded sighted subjects during the active tactile construction of cognitive maps from virtual objects, compared with rest and passive tactile stimulation. Ten blind and ten matched, blindfolded sighted subjects participated in the study. Events were defined as moments when the finger was only stimulated (passive stimulation) or when the contour of a virtual object was touched (during active exploration). Event-related spectral power and coherence perturbations were evaluated within the beta 1 band (14–18 Hz) and then related to a subjective estimate of the cognitive load required by the explorations [namely, perceived levels of difficulty (PLD)]. We found complementary cues for sensory substitution and spatial processing in both groups: while exploring, both blind and sighted subjects showed late power decreases and early power increases, potentially associated with motor programming and touch, respectively. The latter involved occipital areas only in blind subjects (long-term plasticity) and only during active exploration, supporting tactile-to-visual sensory substitution. In both groups, coherences emerged among the fronto-central, centro-parietal, and occipito-temporal derivations associated with visuo-spatial processing. This is consistent with mental map construction involving spatial processing, sensory-motor processing, and working memory. The observed involvement of the occipital regions suggests that a substitution process also occurs in sighted subjects. Only during explorations did coherence correlate positively with PLD, for both groups and in derivations that can be related to visuo-spatial processing, supporting the existence of supramodal spatial processing independent of visual capabilities.
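For readers unfamiliar with these measures, the sketch below shows one conventional way to compute band power and magnitude-squared coherence in the beta 1 range (14–18 Hz) between two derivations; it uses synthetic signals and standard SciPy routines and is not the pipeline used in the study.

    import numpy as np
    from scipy.signal import welch, coherence

    fs = 256  # hypothetical sampling rate (Hz)
    rng = np.random.default_rng(0)
    # Synthetic stand-ins for two EEG derivations (e.g., centro-parietal and occipital).
    chan_a = rng.standard_normal(4 * fs)
    chan_b = rng.standard_normal(4 * fs)

    # Mean spectral power in the beta 1 band (14-18 Hz) from a Welch periodogram.
    f, pxx = welch(chan_a, fs=fs, nperseg=fs)
    beta1 = (f >= 14) & (f <= 18)
    beta1_power = pxx[beta1].mean()

    # Magnitude-squared coherence between the two derivations in the same band.
    f_c, cxy = coherence(chan_a, chan_b, fs=fs, nperseg=fs)
    beta1_coh = cxy[(f_c >= 14) & (f_c <= 18)].mean()

    print(f"beta-1 power: {beta1_power:.4f}, beta-1 coherence: {beta1_coh:.3f}")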
Objective: To investigate whether training with tactile matrices displayed on a programmable tactile display improves recall of spatial images in blind, low-vision and sighted youngsters, and to code and understand the behavioral underpinnings of learning two-dimensional tactile dispositions in terms of spontaneous exploration strategies.
Methods: Three groups of blind, low-vision and sighted youngsters between 6 and 18 years old performed four training sessions on a weekly schedule in which they were asked to memorize single or double spatial layouts presented as two-dimensional matrices.
Results: All groups of participants significantly improved their recall performance compared to the first-session baseline in the single-matrix task, with no statistical difference in performance between groups. In contrast, the learning effect in visually impaired participants was reduced in the double-matrix task, whereas it remained robust in blindfolded sighted controls. We also coded tactile exploration strategies in both tasks and their correlation with performance; sighted youngsters, in particular, favored a proprioceptive exploration strategy. Finally, performance in the double-matrix task correlated negatively with the use of one hand and positively with a proprioceptive strategy.
Conclusion: The results of our study indicate that blind persons do not easily process two separate spatial layouts. However, rehabilitation programs promoting bi-manual and proprioceptive approaches to tactile exploration might help improve spatial abilities. Finally, programmable tactile displays are an effective way to make spatial and graphical configurations accessible to visually impaired youngsters, and they can be profitably exploited in rehabilitation.
We present a fully latching and scalable 4 × 4 haptic display with 4 mm pitch, 5 s refresh time, 400 mN holding force, and 650 μm displacement per taxel. The display serves to convey dynamic graphical information to blind and visually impaired users. Combining significant holding force with high taxel density and large-amplitude motion in a very compact overall form factor was made possible by exploiting the reversible, fast, hundred-fold change in the stiffness of a thin shape memory polymer (SMP) membrane when heated above its glass transition temperature. Local heating is produced using an addressable array of stretchable microheaters, 3 mm in diameter, patterned on the SMP. Each taxel is selectively and independently actuated by synchronizing the local Joule heating with a single pressure supply. Switching off the heating locks each taxel into its position (up or down), enabling any array configuration to be held with zero power consumption. A 3D-printed pin array is mounted over the SMP membrane, providing the user with a smooth, room-temperature array of movable pins to explore by touch. Perception tests were carried out with 24 blind users, resulting in 70 percent correct pattern recognition over a 12-word tactile dictionary.
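The actuation sequence can be summarized by the following control sketch, where the driver object and its set_heater/set_pressure methods are assumed names for illustration and not the actual hardware interface: only heated taxels are soft enough to move, so a shared pressure supply can address them individually, and cooling with the heater off latches each pin in place.

    import time

    def refresh(display, pattern, heat_s=1.0, cool_s=1.5):
        """Set each taxel in `pattern` ({(row, col): True for up, False for down})."""
        for want_up in (True, False):                  # one phase raises pins, one lowers them
            taxels = [t for t, up in pattern.items() if up is want_up]
            for row, col in taxels:
                display.set_heater(row, col, on=True)  # local Joule heating softens the SMP
            time.sleep(heat_s)
            display.set_pressure(positive=want_up)     # shared supply moves only the softened taxels
            for row, col in taxels:
                display.set_heater(row, col, on=False) # stop heating so the SMP stiffens again
            time.sleep(cool_s)                         # taxels latch; holding requires zero power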
In this work in progress, we propose a new method for objectively evaluating the process of performing tactile exploration with a visuo-tactile sensory substitution system. Both behavioral and neurophysiological cues are considered to evaluate the identification of virtual objects and surrounding environments. Our experiments suggest that both sighted and visually impaired users integrated spatial information and developed similar behavioral and neurophysiological patterns. The proposed method could also serve as a tool for evaluating touch-based interfaces for application in orientation and mobility programs.