Smart environments offer valuable technologies for activity monitoring and health assessment. Here, we describe an integration of robots into smart environments to provide more interactive support for individuals with functional limitations. RAS, our Robot Activity Support system, combines smart environment sensing, object detection and mapping, and robot interaction to detect and assist with activity errors that may occur in everyday settings. We describe the components of the RAS system and demonstrate its use in a smart home testbed. To evaluate the usability of RAS, we also collected and analyzed feedback from participants who received assistance from RAS in a smart home setting as they performed routine activities.
This article introduces RAS, a cyber-physical system that supports individuals with memory limitations in performing daily activities in their own homes. RAS represents a partnership between a smart home, a robot, and software agents. When smart home residents perform activities, RAS senses their movement in the space and identifies the current activity. RAS tracks activity steps to detect omission errors. When an error is detected, the RAS robot finds and approaches the human with an offer of assistance. Assistance consists of playing a video recording of the entire activity, showing the omitted activity step, or guiding the resident to the object that is required for the current step. We evaluated RAS performance for 54 participants performing three scripted activities in a smart home testbed and for 2 participants using the system over multiple days in their own homes. In the testbed experiment, activity errors were detected with a sensitivity of 0.955 and a specificity of 0.992, and RAS assistance was delivered successfully at a rate of 0.600. In the in-home experiments, activity errors were detected with a combined sensitivity of 0.905 and a combined specificity of 0.988, and RAS assistance was delivered successfully at a rate of 0.830.
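The abstract above describes RAS tracking activity steps to detect omission errors before the robot offers help. As a rough illustration of that idea only (the paper's actual activity recognition and error-detection methods are not specified here), the following Python sketch compares a scripted step sequence against observed, sensor-derived step labels and reports the first missing step; all names and the example activity are hypothetical.

```python
# Minimal sketch (not the authors' implementation) of omission-error
# detection: compare observed activity steps from smart-home sensor events
# against a scripted step sequence and flag steps that were skipped.
# The Step type, detect_omissions, and the example activity are hypothetical.

from dataclasses import dataclass

@dataclass(frozen=True)
class Step:
    name: str             # label of the activity step
    required_object: str  # object the resident needs for this step

def detect_omissions(script: list[Step], observed: list[str]) -> list[Step]:
    """Return scripted steps whose labels never appear in the observed
    event stream, in script order, so a prompt can target the first one."""
    seen = set(observed)
    return [step for step in script if step.name not in seen]

if __name__ == "__main__":
    take_medication = [
        Step("fetch_water", "cup"),
        Step("fetch_pills", "pill_bottle"),
        Step("swallow_pills", "pill_bottle"),
    ]
    # Sensor-derived event labels for one activity instance.
    observed_events = ["fetch_water", "swallow_pills"]
    missed = detect_omissions(take_medication, observed_events)
    if missed:
        first = missed[0]
        print(f"Omission detected: '{first.name}'. "
              f"Offer to guide the resident to '{first.required_object}'.")
```

In a deployed system the observed labels would come from the smart home's activity recognition rather than a fixed list, and the assistance choice (full video, step video, or guidance to an object) would follow from the flagged step.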
Background: Older adults may require assistance completing activities of daily living (ADLs). Robotic assistance can offset healthcare costs and allow older adults to preserve their autonomy. Younger adults are often involved in the design and purchase of these robotic technologies and must take into account the needs and expectations of the target population (i.e., older adults) to create a robotic system that they will adopt. Objective: This study evaluated the opinions of younger and older adults regarding the design and performance of the Robot Activity Support (RAS) system. It is important to understand points of agreement and divergence between these populations' perspectives so that effective robotic aids are created for older adults. Methods: Fifty-two younger and older adults completed three scripted tasks with the RAS system in a smart home environment. Each participant made task errors to cue the robot to offer help via three prompt modalities (guide to object, video of forgotten step, and video of a full task). After interacting with the RAS system, participants completed questionnaires to report opinions of and satisfaction with the robot. Results: There were minimal differences between younger and older adults' perceptions of the RAS system across multiple factors (e.g., likability, cognitive demand), with both groups expressing generally neutral opinions. Both groups rated the Full Video prompt as least helpful, effective, and liked. Participants recommended the robotic system's response accuracy, movement speed, alerting style and system flexibility be improved. Younger adults overestimated how much older adults would want a robot like this. Conclusions: This study underscores the importance of testing technology with target populations, as older adults were less interested in having RAS or a similar robot in their home than younger counterparts expected. Future work with robotic aids should focus first on older adults' requirements for an adoptable product, and then on optimal design to increase its usability.
Algorithms for automated novelty detection and management are of growing interest but must address the inherent uncertainty from variations in non-novel environments while detecting the changes introduced by the novelty. This paper expands on a recent unified framework to develop an operational theory of novelty that includes multiple (sub)types of novelty. As an example, this paper explores the problem of multi-type novelty detection in a 3D version of CartPole, wherein the cart's Weibull-Open-World control agent (WOW-agent) is confronted by different sub-types/levels of novelty from multiple independent agents moving in the environment. The WOW-agent must balance the pole and detect and characterize the novelties while adapting to maintain that balance. The approach develops static, dynamic, and prediction-error measures of dissimilarity to address different signals/sources of novelty. The WOW-agent applies Extreme Value Theory per dimension of the dissimilarity measures to detect outliers and combines the different dimensions to characterize the novelty. In blind/sequestered testing, the system detects nearly 100% of the non-nuisance novelties, detects many nuisance novelties, and outperforms novelty detection using a Gaussian-based approach. We also show that the WOW-agent's lookahead collision-avoiding control is significantly better than a baseline Deep-Q-learning Network-trained controller.
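As a hedged illustration of the per-dimension Extreme Value Theory step described above (not the authors' code), the sketch below fits a Weibull model to baseline dissimilarity scores in each dimension and flags a new observation when it falls in the extreme tail of that dimension's model; the function names, synthetic data, and 0.995 tail threshold are assumptions made for demonstration.

```python
# Illustrative sketch (assumptions, not the paper's implementation) of
# per-dimension EVT-style outlier detection: fit a Weibull distribution to
# baseline (non-novel) dissimilarity scores in each dimension, then flag a
# new observation as a potential novelty if it lies in the extreme tail.

import numpy as np
from scipy.stats import weibull_min

def fit_tail_models(baseline: np.ndarray) -> list[tuple]:
    """Fit one Weibull model per dissimilarity dimension of `baseline`
    (shape: [num_samples, num_dimensions]); returns (shape, loc, scale)."""
    return [weibull_min.fit(baseline[:, d], floc=0.0)
            for d in range(baseline.shape[1])]

def novelty_flags(models: list[tuple], x: np.ndarray,
                  tail: float = 0.995) -> np.ndarray:
    """Boolean flag per dimension: True if the dissimilarity in that
    dimension exceeds the `tail` quantile of its fitted Weibull model."""
    return np.array([weibull_min.cdf(x[d], *models[d]) > tail
                     for d in range(len(models))])

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Synthetic non-novel dissimilarities in 3 dimensions (e.g., static,
    # dynamic, prediction-error), used here only for demonstration.
    baseline = rng.weibull(2.0, size=(500, 3))
    models = fit_tail_models(baseline)
    observation = np.array([0.8, 0.9, 3.5])   # third dimension is extreme
    flags = novelty_flags(models, observation)
    if flags.any():
        print("Novelty suspected in dimensions:", np.flatnonzero(flags))
```

Combining the per-dimension flags to characterize which sub-type of novelty occurred, as the abstract describes, would sit on top of this detection step.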