2019
DOI: 10.1080/01691864.2019.1636715

A framework for robotic clothing assistance by imitation learning

Cited by 30 publications (14 citation statements) · References 9 publications
“…Similarly, modeling of human motion uncertainty has only been performed for collaborative tasks. The works of Yamazaki et al. (2014), Gao et al. (2015), Yu et al. (2017), Joshi et al. (2019) and Koganti et al. (2019) model this uncertainty in close-proximity collaborative tasks, in specific scenarios in which either a global trajectory is learned or motor skills are encoded. However, the human movements modeled in these studies are general skills for performing a task, without any consideration of disrupted human movement.…”
Section: Related Work
confidence: 99%
“…This methodology takes advantage of visualization to spot disruptions within the human movement as a change in collaborative behavior. Latent variable models (Orrite-Urunuela et al., 2004) are used to address these challenges (Zhang et al., 2017; Joshi et al., 2019), either to model limitations in human movement or to personalize human movements. This paper will use a related approach to model the human movement and highlight any disruptions associated with a change in collaborative behavior.…”
Section: Related Work
confidence: 99%
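The latent variable models mentioned in the statement above reduce high-dimensional human pose data to a low-dimensional latent space. As a rough illustration only (the cited works use Gaussian-process latent variable models, not the simplified model shown here), the following NumPy sketch fits probabilistic PCA to an (N, D) array of joint coordinates and recovers 2-D latent positions; the toy data and dimensions are placeholders.

```python
# Minimal latent-variable-model sketch (probabilistic PCA), assuming pose data
# as an (N, D) array of joint coordinates. Shown only as a simpler stand-in for
# the GP-LVM-style models used in the cited works.
import numpy as np

def ppca_fit(Y, q=2):
    """Fit probabilistic PCA: Y is (N, D) observations, q latent dimensions."""
    N, D = Y.shape
    mu = Y.mean(axis=0)
    cov = (Y - mu).T @ (Y - mu) / N
    vals, vecs = np.linalg.eigh(cov)            # ascending eigenvalues
    vals, vecs = vals[::-1], vecs[:, ::-1]      # reorder to descending
    sigma2 = vals[q:].mean()                    # noise variance from discarded dims
    W = vecs[:, :q] @ np.sqrt(np.diag(vals[:q] - sigma2))
    return mu, W, sigma2

def ppca_latent(Y, mu, W, sigma2):
    """Posterior mean of the latent position for each observation."""
    q = W.shape[1]
    M = W.T @ W + sigma2 * np.eye(q)
    return (Y - mu) @ W @ np.linalg.inv(M)

# Toy usage: 100 "poses" of 30 joint coordinates compressed to 2 latent dims.
Y = np.random.randn(100, 30)
mu, W, sigma2 = ppca_fit(Y, q=2)
X = ppca_latent(Y, mu, W, sigma2)
print(X.shape)  # (100, 2)
```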
“…Besides normal gait, DMPs were also applied for stair-ascend (Xu et al. 2020) and sit-to-stand (Kamali et al. 2016) assistive movements of lower-body exoskeletons. In (Joshi et al. 2019), a robotic arm was used to assist humans with putting clothes on their body, where the movements were generated by DMPs.…”
Section: Human Assistance, Augmentation, and Rehabilitation
confidence: 99%
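To make the DMP reference concrete, below is a minimal one-dimensional dynamic movement primitive sketch in NumPy: it learns forcing-term weights from a single demonstrated trajectory and replays the motion toward a new goal. This is a generic textbook-style formulation, not the implementation used in the cited exoskeleton or clothing-assistance papers; the gains, basis count, and toy demonstration are assumptions.

```python
# Generic discrete DMP sketch: encode one demonstration, replay toward a new goal.
import numpy as np

class DMP1D:
    def __init__(self, n_basis=20, alpha=25.0, beta=6.25, alpha_x=3.0):
        self.n_basis, self.alpha, self.beta, self.alpha_x = n_basis, alpha, beta, alpha_x
        # Basis centres spaced along the canonical-system phase x in (0, 1].
        self.c = np.exp(-alpha_x * np.linspace(0, 1, n_basis))
        self.h = 1.0 / np.diff(self.c, append=self.c[-1] / 2) ** 2
        self.w = np.zeros(n_basis)

    def _psi(self, x):
        return np.exp(-self.h * (x - self.c) ** 2)

    def fit(self, y_demo, dt):
        """Learn forcing-term weights from one demonstrated trajectory."""
        T = len(y_demo)
        self.y0, self.g, self.tau = y_demo[0], y_demo[-1], T * dt
        yd = np.gradient(y_demo, dt)
        ydd = np.gradient(yd, dt)
        x = np.exp(-self.alpha_x * np.arange(T) * dt / self.tau)
        # Forcing term that would reproduce the demonstration dynamics.
        f_target = self.tau ** 2 * ydd - self.alpha * (self.beta * (self.g - y_demo) - self.tau * yd)
        scale = x * (self.g - self.y0 + 1e-8)
        psi_all = self._psi(x[:, None])                 # (T, n_basis) basis activations
        for i in range(self.n_basis):                   # weighted least squares per basis
            psi = psi_all[:, i]
            self.w[i] = np.sum(psi * scale * f_target) / (np.sum(psi * scale ** 2) + 1e-8)

    def rollout(self, dt, goal=None):
        """Reproduce the movement, optionally toward a new goal position."""
        g = self.g if goal is None else goal
        y, yd, x, traj = self.y0, 0.0, 1.0, []
        for _ in range(int(self.tau / dt)):
            psi = self._psi(x)
            f = (psi @ self.w) / (psi.sum() + 1e-8) * x * (g - self.y0)
            ydd = (self.alpha * (self.beta * (g - y) - self.tau * yd) + f) / self.tau ** 2
            yd += ydd * dt
            y += yd * dt
            x += -self.alpha_x * x / self.tau * dt
            traj.append(y)
        return np.array(traj)

# Toy usage: imitate a smooth reach and replay it toward a larger goal.
t = np.linspace(0, 1, 200)
demo = np.sin(0.5 * np.pi * t)
dmp = DMP1D()
dmp.fit(demo, dt=t[1] - t[0])
repro = dmp.rollout(dt=t[1] - t[0], goal=1.5)
```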
“…One common pose estimation method during physical interaction is using visual features from an RGB or depth camera. For example, several works have used depth sensing for estimating human pose [68], [69], [70], [71] and tracking cloth features [72], [73], [74] while a robot helps a person or mannequin put on a clothing garment. Jiménez et al. [75] provide an overview of various perception techniques for tracking cloth during assistive robotic tasks.…”
Section: Human Pose Estimation for Physical Assistance
confidence: 99%
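One small, self-contained building block behind the depth-based pose estimation cited above is back-projecting 2-D joint detections into 3-D camera coordinates using a depth image and pinhole intrinsics. The sketch below shows only that step; the keypoints, image size, and intrinsics are placeholder values, not taken from any of the cited systems.

```python
# Back-project 2-D keypoints to 3-D camera coordinates with a pinhole model.
import numpy as np

def backproject_keypoints(keypoints_uv, depth_m, fx, fy, cx, cy):
    """Map (u, v) pixel keypoints to (X, Y, Z) in metres in the camera frame."""
    points_3d = []
    for u, v in keypoints_uv:
        z = depth_m[int(round(v)), int(round(u))]   # depth image indexed by row (v), column (u)
        points_3d.append(((u - cx) * z / fx, (v - cy) * z / fy, z))
    return np.array(points_3d)

# Toy usage: synthetic flat depth map at 1.2 m and two "joint" detections.
depth = np.full((480, 640), 1.2)
joints_uv = [(320.0, 240.0), (400.0, 260.0)]
xyz = backproject_keypoints(joints_uv, depth, fx=525.0, fy=525.0, cx=319.5, cy=239.5)
print(xyz)
```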