Abstract: In this work, a Brain-Computer Interface (BCI) based on Steady-State Visual Evoked Potentials (SSVEP) is presented as an input device for the human-machine interface (HMI) of the semi-autonomous robot FRIEND II. The role of the BCI is to translate high-level requests from the user into control commands for the FRIEND II system. In the current application, the BCI is used to navigate a menu system and to select commands such as pouring a beverage into a glass. The low-level control of the test platform, the rehabilitation robot FRIEND II, is executed by the control architecture MASSiVE, which in turn is served by a planning instance, an environment model, and a set of sensors (e.g., machine vision) and actuators. The BCI is introduced as a step towards the goal of providing disabled users with at least 1.5 hours of independence from caregivers.
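The abstract above describes menu navigation via SSVEP, in which each menu item flickers at a distinct frequency and the BCI identifies which item the user is attending to. As a minimal illustrative sketch (not the authors' actual signal-processing pipeline), a common baseline approach compares EEG spectral power at the candidate stimulus frequencies and selects the strongest one:

```python
import numpy as np

def detect_ssvep_target(signal, fs, stim_freqs):
    """Return the stimulus frequency with the highest spectral power."""
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    powers = []
    for f in stim_freqs:
        idx = int(np.argmin(np.abs(freqs - f)))  # nearest FFT bin to f
        powers.append(spectrum[idx])
    return stim_freqs[int(np.argmax(powers))]

# Synthetic EEG: a 13 Hz SSVEP response buried in noise (4 s at 256 Hz).
fs = 256
t = np.arange(0, 4, 1.0 / fs)
rng = np.random.default_rng(0)
eeg = np.sin(2 * np.pi * 13 * t) + 0.5 * rng.standard_normal(t.size)

menu_freqs = [10.0, 13.0, 17.0]  # hypothetical flicker rates, one per menu item
print(detect_ssvep_target(eeg, fs, menu_freqs))  # → 13.0
```

The chosen frequency maps directly to a menu selection (e.g., "pour beverage"), which is then handed to the control architecture as a high-level command. Practical systems typically use more robust detectors (e.g., canonical correlation analysis over several harmonics), but the frequency-tagging principle is the same.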
SUMMARY: In this paper, a Brain–Computer Interface (BCI) control approach for the assistive robotic system FRIEND is presented. The objective of the robot is to assist elderly persons and persons with disabilities in their daily and professional life activities. FRIEND is presented here from an architectural point of view, that is, as an overall robotic device that spans many subareas of research, such as human–robot interaction, perception, object manipulation and path planning, and robotic safety. The integration of the hardware and software components is described with respect to the interconnections between the various elements of FRIEND and the approach used for human–machine interaction. Since the robotic system is intended to be used especially by patients with a high degree of disability (e.g., patients who are quadriplegic, have muscle diseases, or suffer serious paralysis due to strokes or other diseases with similar consequences for their independence), an alternative non-invasive BCI has been investigated. The FRIEND–BCI paradigm is explained within the overall structure of the robot. The capabilities of the robotic system are demonstrated in three support scenarios, one that deals with activities of daily living (ADL) and two that take place in a rehabilitation workshop. The proposed robot was clinically evaluated through different tests that directly measure task execution time and hardware performance, as well as the acceptance of the robot by end-users.
SUMMARY: This paper presents an approach to reduce the technical complexity of a service robotic system by means of systematic and well-balanced user involvement. By taking advantage of the user's cognitive capabilities during task execution, a technically manageable robotic system emerges, one that is able to execute tasks reliably and robustly at a high level of abstraction. To realise this approach, the control architecture MASSiVE has been implemented and is used to control the rehabilitation robot FRIEND II. It supports task execution on the basis of a priori defined and formally verified task-knowledge. This task-knowledge contains all possible sequences of operations as well as the symbolic representation of the objects required for the execution of a specific task. The seamless integration of user interactions into this task-knowledge, in combination with MASSiVE's user-adapted human–machine interface layer, enables the system to deliberately interact with the user during run-time.
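The summary above describes task-knowledge as pre-defined operation sequences with user-interaction points woven in. As a hypothetical sketch (the names `Operation`, `POUR_BEVERAGE`, and `interaction_points` are illustrative, not part of MASSiVE), such task-knowledge can be modelled as an ordered list of operations, some of which are flagged as requiring a decision from the user:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Operation:
    name: str
    user_input: bool = False  # True: step deliberately involves the user

# Hypothetical task-knowledge for a "pour beverage" task: a fixed, verifiable
# sequence of operations with explicit user-interaction points.
POUR_BEVERAGE = [
    Operation("locate_bottle"),
    Operation("select_glass", user_input=True),   # user picks the target glass
    Operation("grasp_bottle"),
    Operation("pour"),
    Operation("confirm_done", user_input=True),   # user confirms the result
]

def interaction_points(task):
    """Return the names of all operations that require user interaction."""
    return [op.name for op in task if op.user_input]

print(interaction_points(POUR_BEVERAGE))  # → ['select_glass', 'confirm_done']
```

Because the full sequence and its interaction points are fixed a priori, the control architecture can verify the task-knowledge offline and, at run-time, knows exactly where to hand control back to the user rather than solving the whole task autonomously.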