This letter introduces a new inductive tongue–computer interface for environmental control by people with disabilities. The interface demands little effort from the user, provides the basis for an invisible interface, and has the potential to support a large number of commands.
Background: For an individual with tetraplegia, assistive robotic arms provide a potentially invaluable opportunity for rehabilitation. However, there is a lack of available control methods that allow these individuals to fully control the assistive arms.

Methods: Here we show that it is possible for an individual with tetraplegia to use the tongue to fully control all 14 movements of an assistive robotic arm in three-dimensional space using a wireless intraoral control system, thus allowing for numerous activities of daily living. We developed a tongue-based robotic control method incorporating a multi-sensor inductive tongue interface. One able-bodied individual and one individual with tetraplegia performed a proof-of-concept study by controlling the robot with their tongue, using direct actuator control and endpoint control, respectively.

Results: After 30 min of training, the able-bodied participant tongue-controlled the assistive robot to pick up a roll of tape in 80% of the attempts. Further, the individual with tetraplegia succeeded in fully tongue-controlling the assistive robot to reach for and touch a roll of tape in 100% of the attempts and to pick up the roll in 50% of the attempts. Furthermore, she controlled the robot to grasp a bottle of water and pour its contents into a cup; her first such functional action in 19 years.

Conclusion: To our knowledge, this is the first time that an individual with tetraplegia has been able to fully control an assistive robotic arm using a wireless intraoral tongue interface. The tongue interface used to control the robot is currently available for control of computers and powered wheelchairs, and the robot employed in this study is also commercially available. Therefore, the presented results may translate into available solutions within a reasonable time.
Alternative and effective methods for controlling powered wheelchairs are important to individuals with tetraplegia and similar impairments who are unable to use the standard joystick. This paper describes a system in which tongue movements are used to control a powered wheelchair, providing users with high-level spinal cord injuries full control of their wheelchair. The system is based on an inductive tongue control system developed at the Center for Sensory-Motor Interaction (SMI), Aalborg University. The system emulates a standard analog joystick to interface with the wheelchair, ensuring that it works with almost any wheelchair. The total embedment of the tongue interface in the mouth makes the control practically invisible. A fuzzy system combining 8 sensors for directional control allows for multidirectional control of the wheelchair. Preliminary test results show navigation abilities that are highly competitive compared with other tongue control systems.
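The mapping from 8 directional sensors to analog joystick axes can be illustrated with a minimal sketch. Note this is an assumption for illustration only: the sensor geometry, function names, and the simple weighted vector sum below stand in for the paper's actual fuzzy rules, which are not specified here.

```python
import math

# Hypothetical sketch: map 8 radially arranged tongue-sensor activations
# to analog joystick (x, y) axes via a weighted vector sum. The sensor
# angles and normalization are assumptions, not the system's actual
# fuzzy rules.

SENSOR_ANGLES = [i * math.pi / 4 for i in range(8)]  # 8 directions, 45 degrees apart

def sensors_to_joystick(activations):
    """activations: 8 values in [0, 1]; returns joystick (x, y) in the unit disc."""
    if len(activations) != 8:
        raise ValueError("expected 8 sensor activations")
    x = sum(a * math.cos(t) for a, t in zip(activations, SENSOR_ANGLES))
    y = sum(a * math.sin(t) for a, t in zip(activations, SENSOR_ANGLES))
    norm = max(1.0, math.hypot(x, y))  # clamp magnitude to the unit disc
    return x / norm, y / norm

# Activating only the sensor at angle 0 yields full deflection along +x
print(sensors_to_joystick([1, 0, 0, 0, 0, 0, 0, 0]))  # → (1.0, 0.0)
```

Blending neighboring sensor activations in this way yields intermediate directions, which is one plausible route to the multidirectional control the abstract describes.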
For transhumeral and especially bilateral amputees, the ITCS control scheme could have a significant impact on prosthesis control. In addition, the ITCS would provide bilateral amputees with the further advantage of environmental and computer control, for which the ITCS was originally developed.