1996
DOI: 10.1007/bf00117443

Robot programming by Demonstration (RPD): Supporting the induction by human interaction

Abstract: Programming by Demonstration (PbD) is a programming method that allows software developers to add new functionality to a system by simply showing it in the form of a few examples. In the robotics domain it has the potential to reduce the time required for programming and to make programming more "natural". Just imagine the task of assembling a torch with a manipulator. Wouldn't it be nice to just assemble the torch with one's own hands, watched by video and laser cameras, and maybe wear…


Cited by 55 publications (14 citation statements)
References 16 publications
“…Robot-based demonstration uses robots as the demonstrator. This method mainly includes teleoperation [6] and kinesthetic teaching which is also called scaffolding [7]. On the contrary, human-based demonstration employs humans as the demonstrator.…”
Section: B. Related Work
confidence: 99%
“…grasp, ungrasp and move, which are implemented using constrained motion planning. In this context, symbolic pre- and post-conditions can be learned automatically [16]. The identification of object features which are relevant for a given task is a prerequisite for learning and generalization of manipulation knowledge.…”
Section: Overview
confidence: 99%
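The statement above mentions symbolic operators (grasp, ungrasp, move) whose pre- and post-conditions are learned automatically from demonstrations. A minimal, hypothetical sketch of that idea — not the implementation of the cited systems — is a STRIPS-like operator whose conditions are induced as the predicates shared across demonstrated before/after state pairs; all predicate and operator names below are invented for illustration:

```python
# Illustrative sketch (assumed, not the cited papers' code): a symbolic
# operator with pre-/post-conditions induced from demonstrated state pairs.

class Operator:
    def __init__(self, name, preconditions, postconditions):
        self.name = name
        self.pre = set(preconditions)    # predicates required before execution
        self.post = set(postconditions)  # predicates asserted after execution

    def applicable(self, state):
        # operator may fire only when all preconditions hold
        return self.pre <= state

    def apply(self, state):
        if not self.applicable(state):
            raise ValueError(f"{self.name}: preconditions not met")
        return state | self.post


def induce_conditions(before_after_pairs):
    """Naive induction: preconditions are the predicates common to every
    demonstrated before-state; postconditions are those common to every
    after-state that did not already hold beforehand."""
    pres = set.intersection(*(set(b) for b, _ in before_after_pairs))
    posts = set.intersection(*(set(a) for _, a in before_after_pairs))
    return pres, posts - pres


# Two demonstrations of a hypothetical "grasp" action on the torch example.
demos = [
    ({"gripper_empty", "at_torch"}, {"at_torch", "holding_torch"}),
    ({"gripper_empty", "at_torch", "table_clear"},
     {"at_torch", "table_clear", "holding_torch"}),
]
pre, post = induce_conditions(demos)
grasp = Operator("grasp", pre, post)
```

The intersection over demonstrations is what lets incidental facts (here, `table_clear`) drop out of the learned preconditions.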
“…In most of the proposed approaches, the imitation process proceeds through three stages: (1) perception and analysis of the human demonstration, (2) representation of the demonstration, and (3) reproduction of the demonstrated task on the robot. Known approaches in the literature can be divided between two trends regarding the way demonstrations are represented, and the way such representations are generated: trajectory-level representations in the form of non-linear mappings between sensory and motor information [8][9][10][11][12][13][14], and symbolic-level representations that decompose demonstrations into sequences of more abstract perception-action units [15][16][17][18][19][20]. While trajectory-level representations allow different types of motions to be encoded, they do not allow high-level tasks to be generated.…”
Section: Introduction and Related Work
confidence: 99%
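For the trajectory-level trend described above, a deliberately minimal sketch (my own illustration, far simpler than the non-linear mappings in the cited work) is to time-align several demonstrations of the same motion by resampling and encode them as their pointwise mean, which the robot can then replay:

```python
# Hypothetical sketch of the three PbD stages for one joint coordinate:
# (1) perceive = record demos, (2) represent = mean trajectory, (3) reproduce.

def resample(traj, n):
    """Linearly resample a 1-D trajectory to n points (n >= 2)."""
    m = len(traj)
    out = []
    for i in range(n):
        t = i * (m - 1) / (n - 1)     # fractional index into traj
        lo = int(t)
        hi = min(lo + 1, m - 1)
        frac = t - lo
        out.append(traj[lo] * (1 - frac) + traj[hi] * frac)
    return out

def learn_mean_trajectory(demos, n=5):
    """Stage (2): encode demonstrations as their pointwise mean after
    aligning them to a common length n."""
    aligned = [resample(d, n) for d in demos]
    return [sum(p) / len(p) for p in zip(*aligned)]

# Stage (1): two recorded demonstrations of one joint angle over time.
demos = [[0.0, 1.0, 2.0], [0.0, 0.8, 2.2, 3.0]]
mean_traj = learn_mean_trajectory(demos)   # stage (3) would replay this
```

A mean trajectory illustrates why the trajectory level encodes *how* to move but, as the statement notes, cannot by itself generate high-level task structure.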
“…[19] Researchers have also addressed the practical methods by which demonstration data can be collected. The choice of sensors depends on the process requirements, such as accuracy and reliability.…”
Section: Offline Programming
confidence: 99%
“…LIST OF FIGURES
Figure 1.1: Drivers for the advancement of robotics
Figure 2.1: Traditional robot programming methods [7]: (a) lead-through programming, (b) walk-through programming, and (c) offline programming
Figure 2.2: VR system by Natonek et al. [9]
Figure 2.3: VR system by Aleotti et al. [10]
Figure 2.4: VR for robot programming [11]: (a) actual robotic work cell, and (b) VR environment on a desktop PC
Figure 2.5: Principal architecture of a robot PbD system [19]
Figure 2.6: Summary of robot PbD research
Figure 2.7: PbD system using a teaching gripper [21]
Figure 2.8: Real-Virtuality (RV) continuum (adapted from [32])
Figure 2.9: A comparison of Human-Computer Interaction (HCI) styles [34]: (a) human-computer interaction isolated from human-real-world interaction, (b) user immersed in the computer world and isolated from the real world, (c) user interacts with the real world with computers placed in the real world, and (d) user uses a computer to aid in his/her interaction with the real world (AR)
Figure 2.10: Technical issues to be considered when developing AR systems
Figure 2.11: (a) Optical HMD, and (b) video HMD [33]
Figure 2.12: Personal Interaction Panel (PIP) input device using a tracked pointer [37]
Figure 2.13: Applications of AR: (a) maintenance [48], (b) mock tumour biopsy [33], (c) wire bundle assembly [51], (d)-(e) factory layout planning [52], and (f) human workspace in manual assembly station [52]
Figure 2.18: Class I tasks: (a) pick-and-place for disabled [56], (b) pick-and-place for industrial applications [57], (c) spot welding…”
confidence: 99%