2020
DOI: 10.1016/j.robot.2020.103474
Learning robots to grasp by demonstration

Abstract: In recent years, we have witnessed the proliferation of so-called collaborative robots, or cobots, which are designed to work safely alongside human operators. These cobots typically use the "program from demonstration" paradigm to record and replay trajectories, rather than the traditional source-code-based programming approach. While this requires less knowledge from the operator, the basic functionality of a cobot is limited to simply replaying the sequence of actions as they were recorded. In this paper, we pre…
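The "record and replay trajectories" paradigm mentioned in the abstract can be illustrated with a minimal sketch. All names below (`TrajectoryRecorder`, `record`, `replay`) are illustrative assumptions, not the paper's API: waypoints sampled while the operator guides the arm are stored and later forwarded, unchanged, to the controller.

```python
class TrajectoryRecorder:
    """Minimal sketch of record-and-replay programming from demonstration:
    joint poses sampled during kinesthetic teaching are stored and later
    replayed verbatim, with no generalization to new situations."""

    def __init__(self):
        self.waypoints = []  # list of (timestamp, joint_angles)

    def record(self, t, joint_angles):
        # Called at each sample while the operator guides the arm.
        self.waypoints.append((t, tuple(joint_angles)))

    def replay(self, send_command):
        # Replay the stored poses in order; `send_command` stands in for
        # the call that forwards a pose to the robot controller.
        for _t, q in self.waypoints:
            send_command(q)


# Usage: record two poses during "teaching", then replay them into a log.
rec = TrajectoryRecorder()
rec.record(0.0, [0.0, 0.5, -0.3])
rec.record(0.1, [0.1, 0.4, -0.2])
log = []
rec.replay(log.append)
```

This limitation (pure replay, no adaptation) is exactly what the paper sets out to address with learned grasping.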

Cited by 26 publications (16 citation statements); References 34 publications
“…In [33], the researchers used sequential grasping data to achieve grasp recognition, which shows the potential of historical proprioceptive data for realizing autonomous grasping. Most of the prior works using human demonstrations focused on grasp planning, which generates the pre-grasp pose and contact points for the target object [10], [32], [34], while little attention has been paid to finger control and grasp execution.…”
Section: Learning From Demonstration For Robotic Grasping
confidence: 99%
“…Several studies have been proposed to exploit data from captured expert manipulations, such as [24], which learns interactions from videos of experts, or [25], [26], which use custom handheld devices to collect grasping demonstrations. Nevertheless, many recent studies [27]–[29] use the Cornell dataset [11] or similarly acquired datasets [30], [31], which provide thousands of grasp locations labeled by humans. These grasp demonstrations have been used to infer an evaluation function that ranks grasp candidates according to the expert specifications [20], [32], [33].…”
Section: Related Work
confidence: 99%
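The statement above describes inferring an evaluation function that ranks grasp candidates. A hedged sketch of that ranking step, assuming a rectangle-style grasp parameterization (center, angle, opening width): the scoring heuristic here is purely illustrative, whereas the cited works learn the score from expert-labeled grasps.

```python
from dataclasses import dataclass


@dataclass
class GraspCandidate:
    x: float      # grasp center (illustrative coordinates)
    y: float
    angle: float  # gripper rotation
    width: float  # gripper opening


def evaluate(candidate: GraspCandidate) -> float:
    # Stand-in for a learned scorer: prefer an opening near 0.05 m.
    # A real system would regress this score from labeled demonstrations.
    return -abs(candidate.width - 0.05)


def best_grasp(candidates):
    # Rank all candidates by score and return the top one.
    return max(candidates, key=evaluate)


candidates = [
    GraspCandidate(0.1, 0.2, 0.0, 0.03),
    GraspCandidate(0.4, 0.1, 1.2, 0.05),  # ideal width under this heuristic
    GraspCandidate(0.3, 0.3, 0.7, 0.09),
]
best = best_grasp(candidates)
```

The design point is that grasp *generation* and grasp *evaluation* are decoupled: any candidate generator can be paired with a learned scorer.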
“…The proposed solution is shown to be capable of generating a viable grasp on previously unseen objects in 1.3 seconds. De Coninck et al. (2020) [7] show the application of deep learning from demonstration in robotic manipulators that cooperate with humans. The authors demonstrate the algorithm on a Franka Panda collaborative robot, achieving a 90% average grasp success rate.…”
Section: Introduction
confidence: 99%