2015
DOI: 10.1007/978-3-319-25554-5_36
Personalized Assistance for Dressing Users

Abstract: In this paper, we present an approach for a robot to provide personalized assistance for dressing a user. In particular, given a dressing task, our approach finds a solution involving manipulator motions and also user repositioning requests. Specifically, the solution allows the robot and user to take turns moving in the same space and is cognizant of the user's limitations. To accomplish this, a vision module monitors the human's motion, determines if he is following the repositioning requests, and …
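The turn-taking scheme the abstract describes — the robot requests a repositioning, the vision module checks whether the user complied, and only then does the manipulator move — can be sketched as a simple control loop. This is a minimal illustration, not the authors' implementation; every name here (`DressingStep`, `observe_pose`, `assist_dressing`) is a hypothetical stand-in, and the vision module is stubbed out.

```python
# Hedged sketch of the turn-taking dressing loop from the abstract.
# All class and function names are hypothetical illustrations.

from dataclasses import dataclass

@dataclass
class DressingStep:
    """One step of a dressing plan: a required user pose plus a robot motion."""
    required_pose: str   # pose the user must hold (e.g. "arm_raised")
    robot_motion: str    # manipulator motion to run once the pose is held

def observe_pose(step_index, compliant=True):
    """Stand-in for the vision module; returns the user's current pose."""
    poses = ["arm_raised", "arm_forward"]
    return poses[step_index] if compliant else "unknown"

def assist_dressing(plan, max_requests=3):
    """Alternate repositioning requests and robot motions until the plan ends."""
    log = []
    for i, step in enumerate(plan):
        for _attempt in range(max_requests):
            if observe_pose(i) == step.required_pose:
                # User holds the required pose: robot takes its turn.
                log.append(f"execute:{step.robot_motion}")
                break
            # User not yet in position: ask them to reposition and re-check.
            log.append(f"request:{step.required_pose}")
        else:
            # Respect the user's limitations: give up rather than force the motion.
            log.append("abort: user could not reach pose")
            return log
    return log
```

The key design point, per the abstract, is that the robot and user never move simultaneously in the shared space: each iteration either issues a request or executes a motion, never both.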

Cited by 50 publications (35 citation statements) | References 15 publications
“…Assistive robots can use the vision information of a human body when dressing a user [4][5][6]. However, occlusions could occur when the robot's arms, the clothes and the human body are in close contact, which leads to human pose recognition failures.…”
Section: Introduction
Confidence: 99%
“…Without mobility, robots are restricted to a narrow set of tasks and are unable to leave the immediate vicinity of the human to provide assistance elsewhere. Recent studies have introduced general-purpose mobile manipulators for various assistive robotic tasks, including shaving [9,39], dressing [41,42,43], fetch-and-carry [44,45,46,47], and guiding tasks [48]. Our meal-assistance system has a mobile base that has the potential to enhance the quality of feeding assistance.…”
Section: Assistive Manipulators
Confidence: 99%
“…For example, Koganti et al [4] used RGB-D and motion capture data to estimate the topological relationship between a person's body and a garment. Klee et al [5] visually detected a person's pose which was used by a Baxter robot to assist in putting on a hat. Pignat et al [6] tracked a person's hand movement in real time using an AR tag.…”
Section: A Robot-Assisted Dressing and Force Estimation
Confidence: 99%
“…As a result, the robot would be unable to recognize or replan actions if the garment were to entirely miss a person's body. Future work could address this by incorporating other modalities, such as vision-based techniques, to estimate a person's pose before or during dressing [5], [24].…”
Section: A Full Arm Dressing
Confidence: 99%