2020
DOI: 10.48550/arxiv.2003.06000
Preprint
Human Grasp Classification for Reactive Human-to-Robot Handovers

Abstract: Transfer of objects between humans and robots is a critical capability for collaborative robots. Although there has been a recent surge of interest in human-robot handovers, most prior research focuses on robot-to-human handovers. Further, work on the equally critical human-to-robot handovers often assumes that humans can place the object in the robot's gripper. In this paper, we propose an approach for human-to-robot handovers in which the robot meets the human halfway, by classifying the human's grasp of the object…

Cited by 5 publications (12 citation statements)
References 57 publications
“…In contrast, our work provides a full pipeline for human-to-robot handovers. Our work shows some similarities to [13]. However, the adopted perception and grasp selection approaches are completely different.…”
Section: Related Work
confidence: 79%
“…However, the adopted perception and grasp selection approaches are completely different. Yang et al. [13] classify human grasp poses and adapt the robot's trajectory accordingly, i.e., their neural network can detect seven different grasp poses and grab a held cube at its centre. Alternatively, our approach implements an object-independent grasp-planning algorithm that selects the best picking location based on the object's shape and therefore allows the transfer of unknown objects.…”
Section: Related Work
confidence: 99%
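The citing work above contrasts its shape-based grasp planner with the grasp-classification pipeline of Yang et al. [13], where a discrete classifier output selects how the robot approaches the held object. A minimal sketch of that idea follows; the class names, offsets, and function names here are illustrative assumptions, not the paper's actual implementation:

```python
# Hypothetical sketch: a grasp-pose classifier's discrete output maps to a
# pre-defined approach offset, so the robot reaches for the unoccluded side
# of the object. All names and numbers below are illustrative.

GRASP_CLASSES = ["on_open_palm", "pinch_bottom", "pinch_top",
                 "pinch_side", "lifting", "waiting", "no_object"]

# Illustrative approach offsets (metres) relative to the detected hand.
APPROACH_OFFSETS = {
    "on_open_palm": (0.0, 0.0, 0.10),   # object rests on palm -> grasp from above
    "pinch_bottom": (0.0, 0.0, 0.10),   # human holds bottom -> grasp the top
    "pinch_top":    (0.0, 0.0, -0.10),  # human holds top -> grasp the bottom
    "pinch_side":   (0.10, 0.0, 0.0),   # human holds one side -> grasp the other
    "lifting":      (0.0, 0.0, 0.10),
}

def plan_approach(grasp_class, hand_position):
    """Return a target grasp point for the classified human grasp,
    or None when the robot should wait (no object / human not ready)."""
    if grasp_class not in APPROACH_OFFSETS:
        return None  # "waiting" or "no_object": hold position
    dx, dy, dz = APPROACH_OFFSETS[grasp_class]
    x, y, z = hand_position
    return (x + dx, y + dy, z + dz)
```

This mirrors the trade-off the citing authors point out: a fixed set of grasp classes yields simple, reactive behaviour, while their object-independent planner instead derives the picking location from the object's shape.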
“…For the third-person perspective, a visual system that analyzes the human-human handover process can be used to demonstrate robot-related tasks, such as human-robot handover [32,26], robot-human handover [4,21,27], and dexterous manipulation [22,23]. Besides, it can also serve as an analysis tool for cognitive studies on human behaviour understanding [20,8].…”
Section: Perception In Handover Task
confidence: 99%
“…Aware of the preference for markerless setups in some research topics [15,3], we also provide a photorealistic H2O-Syn dataset, in which the poses of hand and object are transferred from H2O, and the textures of hand and object are scanned from their real counterparts. As human beings are generally the experts from whom an intelligent robot should learn skills, we believe this human-human interaction dataset can also serve as demonstrations of various manipulation tasks for robots, such as human-robot handover [32,26], robot-human handover [4,21,27], and dexterous manipulation [22,23].…”
Section: Introduction
confidence: 99%
“…Recent advances in deep learning-based machine perception have substantially improved the accuracy of pose estimation from real-world sensory data [2][3][4][5][6]. The estimated 6-DoF object pose, represented by a translation and rotation in SE(3), serves as a compact and informative state representation for a variety of downstream tasks, such as robot grasping and manipulation [7], human-robot interaction [8], online camera calibration [9], and tele-presence robot control [10].…”
Section: Introduction
confidence: 99%
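The statement above describes a 6-DoF pose as a translation plus rotation in SE(3). As a minimal, dependency-free sketch (a z-axis rotation only, for brevity; real pipelines use a full 3-D rotation), such a pose can be written as a 4x4 homogeneous transform and applied to points:

```python
# Minimal SE(3) sketch: build a homogeneous transform from a yaw angle and a
# translation, then apply it to a 3-D point. Pure Python, illustrative only.
import math

def se3_from_yaw(yaw, t):
    """4x4 homogeneous transform: rotation about z by `yaw`, translation `t`."""
    c, s = math.cos(yaw), math.sin(yaw)
    return [[c,  -s,  0.0, t[0]],
            [s,   c,  0.0, t[1]],
            [0.0, 0.0, 1.0, t[2]],
            [0.0, 0.0, 0.0, 1.0]]

def transform_point(T, p):
    """Apply the SE(3) transform T to a point p = (x, y, z)."""
    x, y, z = p
    return tuple(T[i][0] * x + T[i][1] * y + T[i][2] * z + T[i][3]
                 for i in range(3))
```

Downstream tasks such as grasping then consume this transform directly, e.g. to express a gripper target in the camera or robot-base frame.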