Background: Shedding light on the neuroscientific mechanisms of human upper limb motor control, in both healthy and pathological conditions (e.g., after a stroke), can help devise effective tools for quantitative assessment of impairment and properly inform the rehabilitation process. The design and control of mechatronic devices can also benefit from such neuroscientific findings, with important implications for assistive and rehabilitation robotics and advanced human-machine interaction. To reach these goals, we believe that an extensive collection of data on human behavior is a necessary step. For this reason, we release U-Limb, a large, multi-modal, multi-center data collection on human upper limb movements, with the aim of fostering trans-disciplinary cross-fertilization.

Contribution: This collection consists of data from 91 able-bodied and 65 post-stroke participants and is organized into three levels: (i) upper limb daily living activities, during which kinematic and physiological signals (electromyography, electroencephalography, and electrocardiography) were recorded; (ii) force-kinematic behavior during precise manipulation tasks with a haptic device; and (iii) brain activity during hand control, measured with functional magnetic resonance imaging.
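To make the three-level organization concrete, the following is a minimal sketch of how such multi-modal recordings could be represented in code. The class, field names, and sampling rates are illustrative assumptions, not the actual U-Limb release format.

```python
from dataclasses import dataclass, field
from typing import Dict
import numpy as np

@dataclass
class Recording:
    """One multi-modal recording: signal name -> (samples x channels) array."""
    participant_id: str
    group: str                                   # e.g. "able-bodied" or "post-stroke"
    level: int                                   # 1 = daily living, 2 = haptic, 3 = fMRI
    signals: Dict[str, np.ndarray] = field(default_factory=dict)
    fs_hz: Dict[str, float] = field(default_factory=dict)

# A hypothetical level-1 (daily living) recording with kinematic and EMG placeholders.
rec = Recording(participant_id="S001", group="able-bodied", level=1)
rec.signals["kinematics"] = np.zeros((10_000, 21))   # dummy joint trajectories
rec.signals["emg"] = np.zeros((20_000, 8))           # dummy surface EMG channels
rec.fs_hz.update({"kinematics": 100.0, "emg": 2000.0})
```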
Recently, in an attempt to increase the autonomy of blind people and improve their quality of life, considerable effort has been devoted to developing technological travel aids. These systems encode spatial information about the environment and deliver it to end users through sensory substitution (auditory, haptic). However, despite promising research outcomes, these solutions have found limited acceptance in real-world use, often due in part to the limited involvement of real end users in the concept and design phases. In this manuscript, we propose a novel indoor navigation system based on wearable haptic technologies. All development phases were driven by continuous feedback from visually impaired users. The proposed travel aid consists of an RGB-D camera, a processing unit that computes visual information for obstacle avoidance, and a wearable device that provides normal and tangential force cues to guide the user through an unknown indoor environment. Experiments with blindfolded subjects and visually impaired participants show that our system can effectively support indoor navigation and serve as a viable tool for training blind people in the use of travel aids.
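As a rough illustration of the pipeline described above (depth sensing, obstacle avoidance, haptic cue generation), the sketch below maps a single depth frame to a normal-force intensity and a tangential steering cue. The function, parameters, and sign conventions are hypothetical and do not reproduce the actual algorithms of the proposed system.

```python
import numpy as np

def depth_to_haptic_cue(depth_m: np.ndarray, stop_distance_m: float = 0.8):
    """Map a depth frame (metres, rows x cols) to a simple haptic guidance cue.

    Returns (normal_intensity, steer):
      normal_intensity in [0, 1] -- grows as the nearest obstacle comes closer
        than stop_distance_m (could drive a normal-force actuator);
      steer in {-1, 0, +1}       -- points away from the image half holding the
        closer obstacle (could drive a tangential / skin-stretch cue).
    """
    # Mask out invalid pixels (zeros, NaNs) so they never count as obstacles.
    d = np.where(np.isfinite(depth_m) & (depth_m > 0), depth_m, np.inf)
    nearest = d.min()
    if not np.isfinite(nearest):
        return 0.0, 0                      # no valid depth data in this frame
    normal_intensity = float(np.clip(1.0 - nearest / stop_distance_m, 0.0, 1.0))

    # Steer away from whichever half of the image contains the closer obstacle.
    mid = d.shape[1] // 2
    left, right = d[:, :mid].min(), d[:, mid:].min()
    steer = 0 if left == right else (+1 if left < right else -1)
    return normal_intensity, steer

# Usage: a synthetic 3 m background with a 0.5 m obstacle on the right-hand side.
frame = np.full((480, 640), 3.0)
frame[200:300, 500:600] = 0.5
print(depth_to_haptic_cue(frame))          # -> (0.375, -1): push cue, steer left
```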