2014 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2014)
DOI: 10.1109/iros.2014.6942733
Determining proper grasp configurations for handovers through observation of object movement patterns and inter-object interactions during usage

Cited by 19 publications (18 citation statements)
References 16 publications
“…) grasping | planar faces of object | M SELF S
9 | (Carvalho & Nolfi, 2016) | traversability | depth, haptic | M SELF S
10 | (Castellini et al., 2011) | grasping | SIFT BoW, contact joints | M S B
11 | (Çelikkanat et al., 2015) | pushing, grasping, throwing, shaking | depth, haptic, proprioceptive and audio | M SEMI RR
12 | (Chan et al., 2014) | grasping | pose, action-object relation | M U RR
13 | (Chang, 2015) | cutting, painting | edges, TSSC | N S RR
14 | (Chen et al., 2015) | traversability | RGB images, motor controls | M S S
15 | (Chu et al., 2016a) … (Sinapov & Stoytchev, 2007) | pulling, dragging | changes in raw pixels | M SELF S
113 | (Sinapov & Stoytchev, 2008) | pulling, dragging | raw pixels, trajectories | M SELF S
114 | (Song et al., 2016) …”
mentioning
confidence: 99%
“…A key challenge is choosing the parameters of the robot's actions to optimize for a fluent handover. These include the object pose and the robot's grasp on the object, taking into account user comfort [17], preferences based on subjective feedback [18], affordances and the intended use of the object after the handover [19], [20], [21], [22], [23], the motion constraints of the human [13], the social role of the human [24], and the configuration of the object when it is grasped before the handover [25]. Other work emphasizes parameters of the trajectory toward the handover pose, exploring the approach angle [11], the starting pose of the trajectory relative to the handover pose [15], motion smoothness [26], object release time [27], estimated human wrist pose [28], [29], the relative timing of handover phases [30], and the ergonomic preferences of humans [31].…”
Section: Related Work
mentioning
confidence: 99%
“…Typical examples here can be found in Human-Robot Interaction research; for instance, robots that learn to recognize gestures and anticipate human actions (Saponaro et al., 2013; Jiang and Saxena, 2014). In Chan et al. (2014), a robot learns proper grasp configurations for object handover by observing humans using tools such as knives and screwdrivers; in Shu et al. (2017), a robot learns social affordances (e.g., human-like behaviors) in human-robot interaction scenarios such as waving and shaking hands.…”
Section: Affordance Learning and Perception
mentioning
confidence: 99%