For robots to coexist with humans in a social world like ours, it is crucial that they possess human-like social interaction skills. Programming a robot with such skills is a challenging task. In this paper, we propose a Multimodal Deep Q-Network (MDQN) that enables a robot to learn human-like interaction skills through trial and error. We aim to develop a robot that gathers data during its interactions with humans and learns human interaction behavior from high-dimensional sensory information using end-to-end reinforcement learning. We demonstrate that, after 14 days of interacting with people, the robot successfully learned basic interaction skills.
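To illustrate the Q-learning machinery behind an approach like MDQN, the sketch below shows a minimal two-stream Q-function in NumPy: one linear stream per sensory modality (e.g., flattened grayscale and depth frames), fused into a shared action-value head, with an epsilon-greedy policy and a one-step temporal-difference target. The dimensions, the four-action space, and the purely linear streams are illustrative assumptions for this sketch, not the paper's actual convolutional architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions: flattened grayscale and depth observations.
GRAY_DIM, DEPTH_DIM, HIDDEN, N_ACTIONS = 64, 64, 32, 4
GAMMA = 0.99  # discount factor for future rewards

# One linear "stream" per modality, fused into a shared Q-value head.
W_gray = rng.normal(scale=0.1, size=(HIDDEN, GRAY_DIM))
W_depth = rng.normal(scale=0.1, size=(HIDDEN, DEPTH_DIM))
W_q = rng.normal(scale=0.1, size=(N_ACTIONS, 2 * HIDDEN))


def q_values(gray, depth):
    """Fuse both modalities and return one Q-value per action."""
    h = np.concatenate([np.tanh(W_gray @ gray), np.tanh(W_depth @ depth)])
    return W_q @ h


def epsilon_greedy(gray, depth, eps=0.1):
    """Explore with probability eps, otherwise act greedily on Q-values."""
    if rng.random() < eps:
        return int(rng.integers(N_ACTIONS))
    return int(np.argmax(q_values(gray, depth)))


def td_target(reward, next_gray, next_depth, done):
    """One-step Q-learning target: r + gamma * max_a' Q(s', a')."""
    if done:
        return reward
    return reward + GAMMA * float(np.max(q_values(next_gray, next_depth)))
```

In a full training loop, the network weights would be updated toward `td_target` by gradient descent on the squared TD error, typically with an experience-replay buffer and a periodically frozen target network.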
This paper presents a new research platform, CB2, a child robot with a biomimetic body for cognitive developmental robotics [1], developed by the Socially-Synergistic Intelligence (hereafter, Socio-SI) group of the JST ERATO Asada Project. The Socio-SI group has focused on the design principles of communicative, intelligent machines and on human social development, through building a humanoid robot whose physical and perceptual structures are close to ours and which enables safe, close interactions with humans. For this purpose, CB2 was designed particularly to establish and maintain long-term social interaction between human and robot. Its most significant features are a whole-body soft skin (a silicone surface with many tactile sensors underneath) and flexible joints (51 pneumatic actuators). We present CB2's fundamental capabilities and preliminary experiments, and discuss future work.
A randomized study assessed the feasibility and preliminary efficacy of android robot-mediated mock job-interview training, in terms of both bolstering self-confidence and reducing biological stress, in comparison to a psycho-educational control approach. Young adults (ages 18–25 years) with autism spectrum disorder (ASD) were randomized to participate either in mock job-interview training with our android robot system (n = 7) or in a self-paced review of materials about job-interviewing skills (n = 8). Baseline and outcome measurements of self-reported performance/efficacy and salivary cortisol were obtained after a mock job interview with a human interviewer. After the training sessions, individuals with ASD in the android robot-mediated condition reported marginally improved self-confidence and demonstrated significantly lower levels of salivary cortisol than those in the control condition. These results provide preliminary support for the feasibility and efficacy of android robot-mediated learning.
Research suggests that many individuals with autism spectrum disorder (ASD) have difficulty providing appropriate levels of information during conversational interchanges. Considering the preferences of individuals with ASD and recent rapid technological advances, robotic systems may hold promise for promoting certain aspects of conversation and interaction, such as appropriate self-disclosure of personal information. In the current work, we evaluated personal disclosures of events with specific emotional content across interactions with two differing robotic systems (an android and a simplistic humanoid) and with a human. Nineteen participants were enrolled in this study: 11 adolescents with ASD (2 women and 9 men) and 8 typically developing (TD) adolescents (4 women and 4 men). Each participant completed the three interactions in a random order. Results indicated differences between adolescents with ASD and TD controls in comfort level and length of disclosures across the interaction types. Specifically, adolescents with ASD showed a greater preference than TD controls for interacting with the robotic systems and produced lengthier disclosures when interacting with the visually simple humanoid robot than with the human interviewer. These findings suggest that robotic systems may be useful for eliciting and promoting aspects of social communication, such as self-disclosure, for some individuals with ASD.
Job interviews are significant barriers for individuals with autism spectrum disorder (ASD) because these individuals often have difficulty with nonverbal communication. We developed a job-interview training program using an android robot, consisting of three stages: (1) tele-operating the android robot and conversing with others through it, (2) a face-to-face mock job interview with the android robot, and (3) feedback based on the mock interview, together with nonverbal communication exercises using the android robot. Participants were randomly assigned to two groups: one received interview guidance by teachers combined with the android robot training program (n = 13), and the other received interview guidance by teachers alone (n = 16). Before and after the intervention, participants in both groups underwent a mock job interview with a human interviewer, which provided outcome measurements of nonverbal communication, self-confidence, and salivary cortisol. After the training sessions, participants who received the combined intervention displayed improved nonverbal communication skills and self-confidence and had significantly lower levels of salivary cortisol than participants who received interview guidance by teachers alone. The android robot-based training program thus improved various measures of job-interview skills in individuals with autism spectrum disorder.
Social robots are being increasingly employed in service encounters at hotels. This study explored the possibility that social robots can engage in heartwarming interactions with hotel customers. We evaluated a collaboration design called 'Continuous Hospitality with Social Robots', in which social robots compensate for gaps in hospitality through heartwarming interaction. In a field test, social robots engaged in heartwarming interactions with customers in a public area of a hotel, and customers' impressions of the robots and of the overall service were then collected via a questionnaire and an interview. The results demonstrate the potential of social robots to engage in heartwarming interactions that enhance overall customer satisfaction under the 'Continuous Hospitality with Social Robots' design. An exploratory analysis suggests that perceived impressions of the interaction with social robots are influenced by customer gender and the duration of interactions. Furthermore, the results suggest that social robots could serve other roles at hotels, namely providing effective advertising through heartwarming interaction and offering mental support to employees who do not interact with customers.