2018
DOI: 10.1007/978-3-319-97586-3_16

An Intuitive Robot Learning from Human Demonstration

Abstract: This paper presents a new way to teach a robot certain motions remotely from a human demonstrator. The human-robot interface is built using a Kinect sensor connected directly to a remote computer running the Processing software. The Cartesian coordinates are extracted, converted into joint angles, and sent to the workstation to control the Sawyer robot. Kinesthetic teaching was used to correct the reproduced demonstrations, while only valid resolved joint angles are recorded to ensure consiste…
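The pipeline in the abstract (Cartesian coordinates from the Kinect, resolved into joint angles, with only valid solutions recorded) can be illustrated with a minimal sketch. This is an analytic inverse-kinematics example for a planar 2-link arm, not the Sawyer's actual 7-DOF solver; the link lengths and function names are illustrative assumptions.

```python
import math

def resolve_joint_angles(x, y, l1=0.4, l2=0.3):
    """Analytic IK for a planar 2-link arm: Cartesian (x, y) -> (theta1, theta2).

    Returns None when the target is unreachable, mirroring the paper's rule
    that only valid resolved joint angles are recorded. Link lengths l1, l2
    are illustrative, not Sawyer's.
    """
    r2 = x * x + y * y
    c2 = (r2 - l1 * l1 - l2 * l2) / (2 * l1 * l2)
    if abs(c2) > 1.0:  # target outside the reachable annulus: reject
        return None
    theta2 = math.atan2(math.sqrt(1.0 - c2 * c2), c2)  # elbow-down solution
    theta1 = math.atan2(y, x) - math.atan2(l2 * math.sin(theta2),
                                           l1 + l2 * math.cos(theta2))
    return theta1, theta2

def forward(theta1, theta2, l1=0.4, l2=0.3):
    """Forward kinematics, used here to verify the resolved angles."""
    x = l1 * math.cos(theta1) + l2 * math.cos(theta1 + theta2)
    y = l1 * math.sin(theta1) + l2 * math.sin(theta1 + theta2)
    return x, y
```

In a full teleoperation loop, each valid solution would be published to the robot (e.g. over ROS) while invalid targets are simply skipped.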

Cited by 4 publications (3 citation statements); references 10 publications.
“…9, which comprises 7 degrees of freedom (7-DOF) in a single-arm configuration. The Sawyer robot is embedded with several sensors and actuators that are essential for human-robot interaction, including a motor encoder at each joint to measure joint angles and a torque sensor at each joint to measure joint torques [29]. The Sawyer robot runs on the Robot Operating System (ROS) platform.…”
Section: Teaching By Demonstration
confidence: 99%
“…This enables the operator to wear the bracelet and intuitively move the cobot to perform a task as if performing it themselves. Ogenyi et al. created an algorithm through which a cobot watches and learns from an operator performing a task [23]. Then, the operator corrects and improves the cobot's learnt demonstration using kinaesthetic teaching.…”
Section: Related Work
confidence: 99%
“…Ogenyi et al. combined observational learning and kinaesthetic teaching [19]. In the research, a robot learns by ‘watching’ the movement of the arm of a human operator.…”
Section: LfD and GMM/GMR
confidence: 99%
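The GMM/GMR combination mentioned in the section label above encodes demonstrations as a Gaussian mixture over (time, position) and reproduces a trajectory by conditioning on time. A minimal Gaussian Mixture Regression sketch, assuming hand-set mixture parameters rather than ones fitted from real demonstrations:

```python
import math

def gmr_predict(t, weights, means, covs):
    """Gaussian Mixture Regression: E[x | t] under a GMM over (t, x).

    means[k] = (mu_t, mu_x); covs[k] = ((s_tt, s_tx), (s_xt, s_xx)).
    All parameters here are illustrative, not learned from demonstrations.
    """
    resp = []
    for w, (mu_t, _), ((s_tt, _), _) in zip(weights, means, covs):
        # responsibility of component k at time t: prior weight times
        # the 1-D Gaussian density of t under that component
        d = math.exp(-0.5 * (t - mu_t) ** 2 / s_tt) / math.sqrt(2 * math.pi * s_tt)
        resp.append(w * d)
    total = sum(resp)
    xhat = 0.0
    for r, (mu_t, mu_x), ((s_tt, _), (s_xt, _)) in zip(resp, means, covs):
        # conditional mean of component k, weighted by its responsibility
        xhat += (r / total) * (mu_x + s_xt / s_tt * (t - mu_t))
    return xhat
```

Querying `gmr_predict` at each timestep yields a smooth reproduction that blends the demonstrated motions encoded by the mixture components.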