Socially assistive robots have the potential to become a powerful therapeutic tool for individuals affected by autism spectrum condition (ASC). However, to date, only a few studies have investigated the efficacy of robot-assisted training in structured protocols. The current study investigated the beneficial effects of a robot-assisted training embedded in the treatment plan provided by an Italian healthcare institution. In collaboration with the healthcare professionals of Piccolo Cottolengo Genovese di Don Orione, we designed a robot-mediated activity aimed at improving social skills in children with ASC. Twenty-four children with ASC (age 5.79 ± 1.02 years; 5 females) completed the activities with the robot in a cross-over design over a period of ten weeks. Their social skills were assessed before and after the robot-assisted intervention using the Early Social Communication Scale (ESCS). Results showed that the combination of robot-assisted training and standard therapy was more effective than standard therapy alone in improving social skills. Specifically, after the robot-assisted training, children with ASC appeared to improve in their ability to generate and respond to behavioral requests and in their tendency to initiate and maintain social interaction with the adult. Our results support the idea that robot-assisted interventions can be combined with the standard treatment plan to improve clinical outcomes.
This article reviews methods to investigate joint attention and highlights the benefits of new methodological approaches that make use of recent technological developments, such as humanoid robots, for studying social cognition. After reviewing classical approaches that address joint attention mechanisms with controlled screen-based stimuli, we describe recent accounts that have proposed the need for more natural and interactive experimental protocols. Although the recent approaches allow for greater ecological validity, they often face challenges of experimental control in more natural social interaction protocols. In this context, we propose that the use of humanoid robots in interactive protocols is a particularly promising avenue for targeting the mechanisms of joint attention. Using humanoid robots to interact with humans in naturalistic experimental setups offers both excellent experimental control and ecological validity. In clinical applications, it offers new techniques for both diagnosis and therapy, especially for children with autism spectrum disorder. The review concludes with indications for future research in the domains of healthcare applications and human-robot interaction in general.
Virtual guiding fixtures constrain the movements of a robot to task-relevant trajectories and have been successfully applied to, for instance, surgical and manufacturing tasks. Whereas previous work has considered guiding fixtures for single tasks, in this paper we propose a library of guiding fixtures for multiple tasks, and present methods for 1) creating and adding guides based on machine learning; 2) selecting guides on-line based on a probabilistic implementation of guiding fixtures; and 3) refining existing guides based on an incremental learning method. We demonstrate in an industrial task that a library of guiding fixtures provides an intuitive haptic interface for joint human-robot completion of tasks and improves performance in terms of task execution time, mental workload, and errors.
We design a personalized human-robot environment for social learning for individuals with autism spectrum disorders (ASD). In order to define an individual's profile, we posit that the individual's reliance on proprioceptive and kinematic visual cues should affect the way an individual with ASD interacts with a social agent (human/robot/virtual agent). In this paper, we assess the potential link between recognition performance for body/facial expressions of emotion of increasing complexity, emotion recognition on platforms with different visual features (two mini-humanoid robots, a virtual agent, and a human), and an individual's integration of proprioceptive and visual cues. First, we describe the design of the EMBODI-EMO database, which contains videos of controlled body/facial expressions of emotions from various platforms. We explain how we validated this database with typically developed (TD) individuals. Then, we investigate the relationship between emotion recognition and the proprioceptive and visual profiles of TD individuals and individuals with ASD. For TD individuals, our results indicate a relationship between profiles and emotion recognition. As expected, we show that TD individuals who rely more heavily on visual cues yield better recognition scores. However, we found that TD individuals relying on proprioception have better recognition scores, going against our hypothesis. Finally, participants with ASD relying more heavily on proprioceptive cues have lower emotion recognition scores in all conditions than participants relying on visual cues. (This is one of several papers published in Autonomous Robots comprising the "Special Issue on Assistive and Rehabilitation".)