2019
DOI: 10.1109/tro.2019.2904461
Probabilistic Real-Time User Posture Tracking for Personalized Robot-Assisted Dressing

Abstract: Robotic solutions to dressing assistance have the potential to provide tremendous support for elderly and disabled people. However, unexpected user movements may lead to dressing failures or even pose a risk to the user. Tracking such user movements with vision sensors is challenging due to severe visual occlusions created by the robot and clothes. We propose a probabilistic tracking method using Bayesian networks in latent spaces, which fuses robot end-effector positions and force information to enable camera…
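The abstract describes the approach only at a high level. As a rough illustration of the underlying idea, below is a minimal particle-filter-style sketch in Python that tracks a low-dimensional latent posture by fusing robot end-effector positions and force readings as noisy observations. This is not the paper's Bayesian-network formulation: the latent dimensionality, the decode_posture mapping, and all noise parameters are assumptions made purely for illustration.

import numpy as np

rng = np.random.default_rng(0)

N_PARTICLES = 500
LATENT_DIM = 3  # assumed dimensionality of the latent posture space

def decode_posture(z):
    # Hypothetical decoder from a latent posture z to an expected
    # end-effector position (3-vector, metres) and an expected
    # interaction-force magnitude (scalar, newtons).
    expected_pos = np.tanh(z)            # placeholder mapping
    expected_force = np.linalg.norm(z)   # placeholder mapping
    return expected_pos, expected_force

def predict(particles, motion_noise=0.05):
    # Random-walk prediction step: the user may move between frames.
    return particles + motion_noise * rng.standard_normal(particles.shape)

def update(particles, weights, ee_pos, force, pos_sigma=0.03, force_sigma=0.5):
    # Reweight each particle by the Gaussian likelihood of the fused
    # end-effector position and force observations.
    for i, z in enumerate(particles):
        exp_pos, exp_force = decode_posture(z)
        pos_ll = np.exp(-np.sum((ee_pos - exp_pos) ** 2) / (2 * pos_sigma ** 2))
        force_ll = np.exp(-(force - exp_force) ** 2 / (2 * force_sigma ** 2))
        weights[i] *= pos_ll * force_ll
    weights += 1e-300                    # avoid degenerate all-zero weights
    return weights / weights.sum()

def resample(particles, weights):
    idx = rng.choice(len(particles), size=len(particles), p=weights)
    return particles[idx], np.full(len(particles), 1.0 / len(particles))

# One tracking cycle with made-up sensor readings.
particles = rng.standard_normal((N_PARTICLES, LATENT_DIM))
weights = np.full(N_PARTICLES, 1.0 / N_PARTICLES)

ee_pos_reading = np.array([0.4, 0.1, 0.6])   # measured end-effector position (m)
force_reading = 1.2                          # measured force magnitude (N)

particles = predict(particles)
weights = update(particles, weights, ee_pos_reading, force_reading)
posture_estimate = weights @ particles       # weighted-mean latent posture
particles, weights = resample(particles, weights)
print("estimated latent posture:", posture_estimate)

The weighted mean over particles serves as the posture estimate; in practice the decoder and noise models would be learned from data, in the spirit of the paper's latent-space formulation.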

Cited by 52 publications (41 citation statements). References 63 publications.
“…Force information has been considered in [17] to adjust the robot motions so as to minimize the interaction force between the robot and the user. User movements have been tracked during the dressing process to plan the dressing trajectory in real time [8,18]. While these studies have led to promising dressing results, they usually simplify their experimental setup by positioning the robot end-effector, with the garment manually attached, close to the user's arm.…”
Section: Related Work, A. Robot-Assisted Dressing (mentioning, confidence: 99%)
“…Recent studies on robot-assisted dressing usually simplify the initial robot configuration before the dressing sequence by manually attaching the garment to the robot end-effector [3][4][5][6][7][8]. Although these works successfully enable robots to assist users in putting on various types of garments, their experiments usually begin with the user's arm already in or close to the sleeve opening.…”
Section: Introduction (mentioning, confidence: 99%)
“…User modeling appears to be a key aspect in addressing their needs and understanding their limitations through personalized interaction, as shown in several publications (García-Soler et al., 2018). For instance, user modeling studies involving kinematic evaluation tests can determine the user's safe workspace within the robot's movement envelope (Zhang et al., 2019). Similarly, Erickson et al. (2017) estimated the range of physical forces exerted by a modeled arm during a simulated assisted dressing task.…”
Section: Literature Review (mentioning, confidence: 99%)
“…The robot was able to update the dressing trajectory based on a user-specific movement model whenever the user suddenly moved their arm for a secondary task. Later, Zhang [35] combined hierarchical multi-task control with a probabilistic filtering method to estimate user postures when the user performed unexpected movements such as pulling or pushing. Gao et al. [7] used vision data to model the movement space of the human upper-body joints for each user, enabling more natural user postures during dressing assistance.…”
Section: B. Assistive Dressing Robots (mentioning, confidence: 99%)