There is increasing interest in non-visual interfaces for HCI that take advantage of the information-processing capability of other sensory modalities. The BrainPort is a vision-to-tactile sensory substitution device that conveys information through electro-stimulation of the tongue. As the tongue is a horizontal surface, it makes for an interesting platform to study the brain's representation of space. But which way is up on the tongue? We presented participants with perceptually ambiguous stimuli and measured how often different perspectives were adopted, and whether camera orientation and gender had an effect. Additionally, we examined whether personality (trait extraversion and openness) could predict the perspective taken. We found that self-centered perspectives were predominantly adopted, and that trait openness may predict perspective. This research demonstrates how individual differences can affect the usability of sensory substitution devices, and highlights the need for flexible and customisable interfaces.
Sensory substitution devices (SSDs) can convey visuospatial information through spatialised auditory or tactile stimulation using wearable technology. However, the level of information loss associated with this transformation is unknown. In this study, novice users discriminated the location of two objects at 1.2 m using devices that transformed a 16 × 8 depth map into spatially
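The depth-map transformation described above can be sketched in code. The following is a minimal illustration, not the study's actual pipeline: it assumes a dense depth image is average-pooled down to a 16 × 8 grid and that each cell's depth is inverted into a stimulation intensity (nearer objects stimulate more strongly). The function name, grid orientation, and `max_depth` cutoff are all hypothetical choices for illustration.

```python
import numpy as np

def depth_map_to_stimuli(depth, grid=(8, 16), max_depth=2.0):
    """Downsample a dense depth map (metres) to a coarse grid and map each
    cell to a stimulation intensity in [0, 1]; nearer objects -> stronger.
    Hypothetical sketch of a depth-to-stimulation transform."""
    h, w = depth.shape
    gh, gw = grid
    # Average-pool the depth map into gh x gw cells (crop any remainder).
    cells = depth[:h - h % gh, :w - w % gw]
    cells = cells.reshape(gh, h // gh, gw, w // gw).mean(axis=(1, 3))
    # Invert and normalise: depth 0 -> intensity 1, depth >= max_depth -> 0.
    return np.clip(1.0 - cells / max_depth, 0.0, 1.0)

# Example: a 240x320 depth map with a near object (0.5 m) in the left half.
depth = np.full((240, 320), 2.0)
depth[:, :160] = 0.5
stim = depth_map_to_stimuli(depth)
print(stim.shape)  # (8, 16)
```

In this toy example, the left half of the grid receives intensity 0.75 (object at 0.5 m) and the right half 0.0 (background at the 2 m cutoff), illustrating how coarse pooling is one source of the information loss the study measures.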
The landscape of digital games is segregated by player ability. For example, sighted players have a multitude of highly visual games at their disposal, while blind players may choose from a variety of audio games. Attempts at improving cross-ability access to any of those are often limited in the experience they provide, or disregard multiplayer experiences. We explore ability-based asymmetric roles as a design approach to create engaging and challenging mixed-ability play. Our team designed and developed two collaborative testbed games exploring asymmetric interdependent roles. In a remote study with 13 mixed-visual-ability pairs we assessed how roles affected perceptions of engagement, competence, and autonomy, using a mixed-methods approach. The games provided an engaging and challenging experience, in which differences in visual
People with blindness and visual impairments have reduced access to exercise compared to the general population during typical societal functioning. The Coronavirus-19 pandemic completely disrupted daily life for most individuals worldwide, and in the United Kingdom, a stay-at-home order was enforced. One of the few permitted reasons an individual could leave their home was for daily exercise. Here, we examined how the UK national lockdown impacted access to exercise for people with blindness and visual impairment. We used a mixed-methods design, collecting quantitative data from two established measures (the Exercise Barriers and Benefits Scale and the International Physical Activity Questionnaire), and qualitative data from open-ended questions. We found that, during the initial stages of the lockdown, perceived barriers to exercise increased compared to pre-pandemic levels, driven by factors such as the closure of exercise facilities and additional difficulties posed by social distancing. Interestingly, during the later stages of the UK Coronavirus-19 response, perceived barriers decreased to lower than pre-pandemic levels. Thematic analysis indicated that this may have been due to participants finding new online methods to exercise at home, in combination with the tentative reopening of facilities.
We sought to understand how the perception of personal space is influenced by different levels of social density, spatial density, and type of view in South Korea and in the United Kingdom. We employed virtual reality (VR) technology to simulate shared and single occupancy offices and tested a sample of 20 British and 24 Korean participants. Uniquely, we obtained personal space estimations using a virtual disc around the participant which could be extended and retracted to indicate perceived amount of personal space. A more traditional personal space satisfaction score was also determined. We found that in both cultures participants experienced greater perceived personal space 1) when in a sparse rather than dense office and 2) when having a view of the city outside the office. Both British and Korean participants had higher personal space satisfaction in the single occupancy office than in shared offices. However, British, but not Korean, participants had significantly higher personal space estimations in single occupancy offices than in shared offices. These results suggest that there is some disparity between abstract scores of personal space satisfaction and more concrete personal space estimates; further, this may be linked to subtle cross-cultural differences in workplace experience.
The tongue is an incredibly complex sensory organ, yet little is known about its tactile capacities compared to the hands. In particular, the tongue receives almost no visual input during development and so may be calibrated differently compared to other tactile senses for spatial tasks. Using a cueing task, via an electro-tactile display, we examined how a tactile cue (to the tongue) or an auditory cue can affect the orientation of attention to electro-tactile targets presented to one of four regions on the tongue. We observed that response accuracy was generally low for the same-modality condition, especially at the back of the tongue. This implies that spatial localization ability is diminished either because the tongue is less calibrated by the visual modality or because of its position and orientation inside the body. However, when cues were provided cross-modally, target identification at the back of the tongue seemed to improve. Our findings suggest that, while the brain relies on a general mechanism for spatial (and tactile) attention, the surface of the tongue may not have clear access to these representations of space when solely provided via electro-tactile feedback but can be directed by other sensory modalities.

Public Significance Statement: This study suggests that, while the tongue is an incredibly sensitive sensory organ to touch sensations, it does not process tactile attention in a uniform way. This has some implications for accessibility devices that use the tongue as a method for interacting with technology. Very little is known about the touch capabilities of the tongue, and these results begin to explore the tongue's attentional processing on its surface.
We sought to understand how the perception of personal space is influenced by different levels of social density, spatial density, and type of window-view in South Korean and United Kingdom workplaces. We employed virtual reality to simulate shared and single occupancy offices. We obtained personal space estimations using a virtual disc around the participant which could be extended and retracted, inside the simulation, to indicate perceived amount of personal space, and compared this measure to questionnaire-based estimations. We found that in both cultures participants experienced greater perceived personal space (1) when in a sparse rather than dense office and (2) when having a view of the city outside the office. However, British, but not Korean, participants had significantly higher personal space estimations in single occupancy offices than in shared offices. These results suggest subtle cross-cultural differences in workplace experience that could only be investigated using virtual reality.
The benefits of taking part in adventurous activities are many, particularly for people with visual impairments. Sports such as rock climbing can improve feelings of skillfulness, autonomy, and confidence for people with low or no vision as they strive to overcome environmental and personal challenges. In this late-breaking work we present Climb-o-Vision, a novel sensory substitution software that utilizes the YOLOv5 computer-vision object-detection architecture to aid navigation for rock climbers with visual impairments. Climb-o-Vision uses commercially available and cost-effective hardware to detect, track, and convert climbing hold spatial locations onto the surface of the tongue, via an electrotactile tongue interface. Preliminary testing of the device highlights the possibility of using sensory substitution as a sporting aid for people with visual impairments. Furthermore, it demonstrates the potential for adapting and improving current sensory substitution systems by employing computer vision techniques to filter useful task-specific information to users with visual impairments.
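The detect-track-convert pipeline described above can be illustrated with a small sketch. This is not the Climb-o-Vision implementation: it assumes hold detections arrive as bounding boxes (e.g. from a YOLOv5 model) and shows only the final step of mapping box centres onto a coarse electrode grid; the function name, grid size, and box format are hypothetical.

```python
def holds_to_electrodes(detections, frame_w, frame_h, grid=(10, 10)):
    """Map detected climbing-hold bounding boxes (x1, y1, x2, y2), in pixel
    coordinates, to active cells (row, col) on a coarse electrode grid.
    Hypothetical mapping; the real tongue-display layout may differ."""
    rows, cols = grid
    active = set()
    for x1, y1, x2, y2 in detections:
        cx = (x1 + x2) / 2 / frame_w   # normalised centre x in [0, 1)
        cy = (y1 + y2) / 2 / frame_h   # normalised centre y in [0, 1)
        col = min(int(cx * cols), cols - 1)
        row = min(int(cy * rows), rows - 1)
        active.add((row, col))
    return sorted(active)

# Example: two holds detected in a 640x480 camera frame.
dets = [(100, 100, 140, 140), (500, 300, 540, 340)]
print(holds_to_electrodes(dets, 640, 480))  # [(2, 1), (6, 8)]
```

A filtering step of this kind is what lets the system present only task-relevant information (hold positions) rather than a full scene, which is the adaptation the abstract highlights.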