Animallike robot companions such as the robotic seal Paro are increasingly used in dementia care because of the positive effects that interaction with these robots can have on patients' well-being. Touch is one of the most important interaction modalities for patients with dementia and can be a natural way to interact with animallike robots. To advance the development of animallike robots, we explored in what ways people with dementia could benefit from interaction with an animallike robot with more advanced touch recognition capabilities and which touch gestures would be important in their interaction with Paro. In addition, we explored which other target groups might benefit from interaction with animallike robots with more advanced interaction capabilities. In this study, we administered a questionnaire and conducted interviews with two groups of health-care providers who all worked in a geriatric psychiatry department. One group used Paro in their work (i.e., the expert group; n = 5) while the other group had no experience with the use of animallike robots (i.e., the layman group; n = 4). The results showed that health-care providers perceived Paro as an effective intervention to improve the well-being of people with dementia. Uses mentioned for Paro included providing distraction, interrupting problematic behaviors, and stimulating communication. Furthermore, the care providers indicated that people with dementia (would) use mostly positive forms of touch and speech to interact with Paro. Paro's auditory responses were criticized because they can overstimulate the patients. In addition, the care providers argued that social interactions with Paro are currently limited and that the robot therefore does not meet the needs of a broader audience, such as healthy elderly people who still live in their own homes.
The development of robot pets with more advanced social capabilities, such as touch and speech recognition, might result in more intelligent interactions, which could help to better adapt to the needs of people with dementia and could make interactions more interesting for a broader audience. Moreover, the robot's response modalities and its appearance should match the needs of the target group.
Touch behavior is of great importance during social interaction. To transfer the tactile modality from interpersonal interaction to other areas such as Human-Robot Interaction (HRI) and remote communication automatic recognition of social touch is necessary. This paper introduces CoST: Corpus of Social Touch, a collection containing 7805 instances of 14 different social touch gestures. The gestures were performed in three variations: gentle, normal and rough, on a sensor grid wrapped around a mannequin arm. Recognition of the rough variations of these 14 gesture classes using Bayesian classifiers and Support Vector Machines (SVMs) resulted in an overall accuracy of 54% and 53%, respectively. Furthermore, this paper provides more insight into the challenges of automatic recognition of social touch gestures, including which gestures can be recognized more easily and which are more difficult to recognize.
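The classification setup described above — mapping captures from a pressure sensor grid to one of 14 gesture classes with an SVM — can be sketched as follows. This is an illustrative sketch only, not the authors' code: the feature set and the synthetic data are assumptions standing in for the real CoST captures.

```python
# Sketch (assumptions, not the CoST authors' pipeline): classify touch
# gestures from per-capture features of a pressure grid with an RBF SVM.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
n_gestures = 14   # CoST distinguishes 14 social touch gesture classes
n_captures = 1400 # synthetic stand-in for the corpus
n_features = 6    # e.g., mean/max pressure, duration, contact area (assumed)

# Synthetic features: each gesture class gets a slightly shifted distribution.
y = rng.integers(0, n_gestures, size=n_captures)
X = rng.normal(size=(n_captures, n_features)) + y[:, None] * 0.5

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.35, random_state=0)
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
clf.fit(X_tr, y_tr)
print(f"test accuracy: {clf.score(X_te, y_te):.2f}")
```

Standardizing the features before the SVM matters in practice, since pressure magnitudes and durations live on very different scales.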
For an artifact such as a robot or a virtual agent to respond appropriately to human social touch behavior, it should be able to automatically detect and recognize touch. This paper describes the data collection of CoST: Corpus of Social Touch, a data set containing 7805 captures of 14 different social touch gestures. All touch gestures were performed in three variants: gentle, normal and rough, on a pressure sensor grid wrapped around a mannequin arm. Recognition of these 14 gesture classes using various classifiers yielded accuracies up to 60%; moreover, gentle gestures proved to be harder to classify than normal and rough gestures. We further investigated how different classifiers, interpersonal differences, gesture confusions and gesture variants affected the recognition accuracy. Finally, we present directions for further research to ensure proper transfer of the touch modality from interpersonal interaction to areas such as human-robot interaction (HRI).
Touch is an important modality in social interaction; for instance, touch can communicate emotions and can intensify emotions communicated by other modalities. In this paper we explore the use of Neural Networks (NNs) for the classification of touch. The exploration and assessment of NNs is based on the Corpus of Social Touch established by Jung et al. This corpus was split into a train set (65%) and a test set (35%); the train set was used to find the optimal parameters for the NN and to train the final model. Different feature sets were also investigated: the basic feature set included in the corpus, energy-histogram features, and dynamical features. Using all features led to the best performance of 64% on the test set, using an NN consisting of one hidden layer with 46 neurons. The confusion matrix showed the expected high confusion between pat-tap and grab-squeeze. A leave-one-subject-out approach led to a performance of 54%, which is comparable with the results of Jung et al.
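The setup described in this abstract — a single-hidden-layer network with 46 neurons trained on 65% of the data and tested on the remaining 35% — can be sketched as below. This is a hedged illustration under assumptions: the feature values are synthetic stand-ins, not the actual CoST features, and the reported 64% accuracy will not be reproduced by it.

```python
# Sketch (assumptions, not the original study code): a one-hidden-layer
# neural network with 46 neurons and a 65/35 train/test split, as in the
# abstract, trained here on synthetic stand-in features.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(1)
n_classes, n_samples, n_features = 14, 1400, 10
y = rng.integers(0, n_classes, size=n_samples)
X = rng.normal(size=(n_samples, n_features)) + y[:, None] * 0.4

# 65% train / 35% test, matching the split reported in the abstract.
X_tr, X_te, y_tr, y_te = train_test_split(X, y, train_size=0.65, random_state=1)
nn = make_pipeline(
    StandardScaler(),
    MLPClassifier(hidden_layer_sizes=(46,), max_iter=1000, random_state=1),
)
nn.fit(X_tr, y_tr)
print(f"test accuracy: {nn.score(X_te, y_te):.2f}")
```

A leave-one-subject-out evaluation, as reported in the abstract, would instead hold out all captures from one subject per fold, which is a stricter test of generalization across people.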
In this paper we outline the design and development of an embodied conversational agent setup that incorporates an augmented reality screen and a tactile sleeve. With this setup, the agent can visually and physically touch the user. We provide a literature overview of embodied conversational agents as well as haptic technologies, and argue for the importance of adding touch to an embodied conversational agent. Finally, we provide guidelines for studies involving the touching virtual agent (TVA) setup.