Purpose: Sensory-substitution devices (SSDs) provide auditory or tactile representations of visual information. These devices often generate unpleasant sensations and mostly lack color information. We present here a novel SSD aimed at addressing these issues. Methods: We developed the EyeMusic, a novel visual-to-auditory SSD for the blind, providing both shape and color information. Our design uses musical notes on a pentatonic scale, generated by natural instruments, to convey the visual information in a pleasant manner. A short behavioral protocol was used to train blind participants to extract shape and color information, and to test their acquired abilities. Finally, we conducted a survey and a comparison task to assess the pleasantness of the generated auditory stimuli. Results: We show that basic shape and color information can be decoded from the generated auditory stimuli. High performance levels were achieved by all participants following as little as 2-3 hours of training. Furthermore, we show that users indeed found the stimuli pleasant and potentially tolerable for prolonged use. Conclusions: The novel EyeMusic algorithm provides an intuitive and relatively pleasant way for the blind to extract shape and color information. We suggest that this might help facilitate visual rehabilitation because of the added functionality and enhanced pleasantness.
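The abstract describes the sonification only at a high level. The sketch below illustrates the general column-scan principle it outlines: image columns map to time, rows to pitch on a pentatonic scale, and color to a timbre. This is a minimal illustrative sketch, not the published EyeMusic implementation; the function names, the MIDI note encoding, and the color-to-instrument table are assumptions for the example.

```python
# Illustrative sketch of column-scan sonification in the spirit of the EyeMusic:
# columns -> time, rows -> pentatonic pitch (higher rows sound higher),
# color -> (hypothetical) instrument label.

PENTATONIC = [0, 2, 4, 7, 9]  # semitone offsets of a major pentatonic scale

def row_to_midi(row, n_rows, base=48):
    """Map an image row (0 = top) to a MIDI note on a pentatonic scale."""
    idx = n_rows - 1 - row                      # invert so top rows sound higher
    octave, degree = divmod(idx, len(PENTATONIC))
    return base + 12 * octave + PENTATONIC[degree]

def sonify(image, instruments, dt=0.1):
    """image: 2D list of color labels (None = background).
    Returns (onset_time, midi_note, instrument) events, scanned left to right."""
    n_rows = len(image)
    events = []
    for col in range(len(image[0])):
        for row in range(n_rows):
            color = image[row][col]
            if color is not None:
                events.append((col * dt, row_to_midi(row, n_rows), instruments[color]))
    return events
```

For example, a two-pixel diagonal would produce two notes at successive onsets, the later (upper) pixel sounding at a higher pentatonic pitch.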
Purpose: Independent mobility is one of the most pressing problems facing people who are blind. We present the EyeCane, a new mobility aid aimed at increasing perception of the environment beyond what is provided by the traditional White Cane for tasks such as distance estimation, navigation and obstacle detection. Methods: The EyeCane enhances the traditional White Cane by using tactile and auditory output to increase the detectable distance and angles. It circumvents the technical pitfalls of other devices, such as weight, short battery life, complex interface schemes, and steep learning curves. It implements multiple beams to enable detection of obstacles at different heights, and narrow beams to provide active sensing that can potentially increase the user's spatial perception of the environment. Participants were tasked with using the EyeCane for several basic tasks with minimal training. Results: Blind and blindfolded-sighted participants were able to use the EyeCane successfully for distance estimation, simple navigation and simple obstacle detection after only several minutes of training. Conclusions: These results demonstrate the EyeCane's potential for mobility rehabilitation. The short training time is especially important since available mobility training resources are limited, not always available, and can be quite expensive and/or entail long waiting periods.
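The core interaction the abstract describes is mapping a narrow-beam distance reading to tactile or auditory output. A minimal sketch of one plausible such mapping is shown below, assuming closer obstacles produce faster pulses; the specific range, rate, and linear mapping are illustrative assumptions, not the EyeCane's actual parameters.

```python
def distance_to_pulse_rate(distance_m, max_range_m=5.0, max_rate_hz=25.0):
    """Map a narrow-beam distance reading to a tactile pulse rate:
    closer obstacles -> faster pulses; beyond range -> silence (0 Hz).
    Range and rate limits are illustrative, not the device's values."""
    if distance_m >= max_range_m:
        return 0.0
    proximity = 1.0 - distance_m / max_range_m   # 0 at max range, 1 at contact
    return max_rate_hz * proximity
```

With several beams at different heights, each beam would drive its own output channel, letting the user distinguish, say, a waist-height obstacle from a ground-level one.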
Background Socially assistive robots (SARs) have been proposed as a tool to help individuals who have had a stroke to perform their exercise during their rehabilitation process. Yet, to date, there are no data on the motivating benefit of SARs in a long-term interaction with post-stroke patients. Methods Here, we describe a robot-based gamified exercise platform, which we developed for long-term post-stroke rehabilitation. The platform uses the humanoid robot Pepper, and also has a computer-based configuration (with no robot). It includes seven gamified sets of exercises, which are based on functional tasks from the everyday life of the patients. The platform gives the patients instructions, as well as feedback on their performance, and can track their performance over time. We performed a long-term patient-usability study, where 24 post-stroke patients were randomly allocated to exercise with this platform—either with the robot or the computer configuration—over a 5–7 week period, 3 times per week, for a total of 306 sessions. Results The participants in both groups reported that this rehabilitation platform addressed their arm rehabilitation needs, and they expressed their desire to continue training with it even after the study ended. We found a trend for higher acceptance of the system by the participants in the robot group on all parameters; however, this difference was not significant. We found that system failures did not affect the long-term trust that users felt towards the system. Conclusions We demonstrated the usability of this platform for long-term rehabilitation with post-stroke patients in a clinical setting. We found high levels of acceptance of both platform configurations by patients following this interaction, with higher ratings given to the SAR configuration.
We show that it is not the mere use of technology that increases the motivation of the person to practice, but rather it is the appreciation of the technology’s effectiveness and its perceived contribution to the rehabilitation process. In addition, we provide a list of guidelines that can be used when designing and implementing other technological tools for rehabilitation. Trial registration: This trial is registered in the NIH ClinicalTrials.gov database. Registration number NCT03651063, registration date 21.08.2018. https://clinicaltrials.gov/ct2/show/NCT03651063.
Virtual worlds and environments are becoming an increasingly central part of our lives, yet they are still far from accessible to the blind. This is especially unfortunate, as such environments hold great potential for uses such as social interaction, online education and, in particular, for familiarizing visually impaired users with a real environment virtually, from the comfort and safety of their own home, before visiting it in the real world. We have implemented a simple algorithm to improve this situation using single-point depth information, enabling the blind to use a virtual cane, modeled on the “EyeCane” electronic travel aid, within any virtual environment with minimal pre-processing. Use of the Virtual-EyeCane enables this experience to potentially be applied later in real-world environments with stimuli identical to those from the virtual environment. We show the quickly learned practical use of this algorithm for navigation in simple environments.
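The "single-point depth information" the abstract mentions amounts to casting one ray from the user's avatar and reporting the distance to the first obstacle, which can then be fed into the same distance-to-output mapping a physical cane device would use. Below is a minimal sketch of that idea on a 2D occupancy grid; the grid representation, function name, and step-based distance are illustrative assumptions, not the published implementation.

```python
def virtual_cane_distance(grid, pos, direction, max_steps=50):
    """Cast a single ray through a 2D occupancy grid (1 = obstacle).
    Returns the distance in cells to the first obstacle along `direction`
    (an axis-aligned unit step like (1, 0)), or None if nothing is hit."""
    x, y = pos
    dx, dy = direction
    for step in range(1, max_steps + 1):
        cx, cy = x + dx * step, y + dy * step
        if not (0 <= cy < len(grid) and 0 <= cx < len(grid[0])):
            return None                 # ray left the environment
        if grid[cy][cx] == 1:
            return step                 # first obstacle along the beam
    return None
```

Because the returned distance can drive the same stimuli as a physical device, skills practiced in the virtual environment could, in principle, transfer directly to the real one.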
Human-human social touch improves mood and alleviates pain. No studies have so far tested the effect of human-robot emotional touch on experimentally induced pain ratings, on mood and on oxytocin levels in healthy young adults. Here, we assessed the effect of touching the robot PARO on pain perception, on mood and on salivary oxytocin levels, in 83 young adults. We measured their perceived pain, happiness state, and salivary oxytocin. For the 63 participants in the PARO group, pain was assessed in three conditions: Baseline, Touch (touching PARO) and No-Touch (PARO present). The control group (20 participants) underwent the same measurements without ever encountering PARO. There was a decrease in pain ratings and in oxytocin levels and an increase in happiness ratings compared to baseline only in the PARO group. The Touch condition yielded a larger decrease in pain ratings compared to No-Touch. These effects correlated with the participants' positive perceptions of the interaction with PARO. Participants with higher perceived ability to communicate with PARO experienced a greater hypoalgesic effect when touching PARO. We show that human-robot social touch is effective in reducing pain ratings, improving mood and, surprisingly, reducing salivary oxytocin levels in adults.
We tested 23 healthy participants who performed rhythmic horizontal movements of the elbow. The required amplitude and frequency ranges of the movements were specified to the participants using a closed shape on a phase-plane display, showing angular velocity versus angular position, such that participants had to continuously control both the speed and the displacement of their forearm. We found that the combined accuracy in velocity and position throughout the movement was not a monotonic function of movement speed. Our findings suggest that specific combinations of required movement frequency and amplitude give rise to two distinct types of movements: one of a more rhythmic nature, and the other of a more discrete nature.
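For sinusoidal movement, the phase-plane trajectory the participants had to match is an ellipse: position x(t) = A sin(2πft) pairs with angular velocity v(t) = 2πfA cos(2πft). The sketch below samples such a target trajectory; it is a generic illustration of a phase-plane target, with parameter names assumed, not the study's actual display code.

```python
import math

def phase_plane_target(amplitude_deg, freq_hz, n=100):
    """Sample one period of the phase-plane ellipse for the sinusoid
    x(t) = A*sin(2*pi*f*t), returning (position, angular velocity) pairs.
    Peak velocity is A * 2*pi*f, so higher frequency or amplitude
    stretches the ellipse along the velocity axis."""
    w = 2 * math.pi * freq_hz
    period = 1.0 / freq_hz
    return [(amplitude_deg * math.sin(w * t),
             amplitude_deg * w * math.cos(w * t))
            for t in (i * period / n for i in range(n))]
```

Plotting velocity against position for the participant's actual movement on the same axes lets them see, at a glance, whether both displacement and speed stay inside the required closed shape.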