2019 14th IEEE Conference on Industrial Electronics and Applications (ICIEA)
DOI: 10.1109/iciea.2019.8833835
Development of an Optical Tracking Based Teleoperation System with Virtual Reality

Cited by 8 publications (3 citation statements)
References 8 publications
“…A similar approach has been presented in [30,33], in which human motion tracking is implemented to run in Unity3D and the Robot Operating System (ROS). In [36], a hidden Markov model to compensate the latency of human motion has been presented.…”
Section: Related Work
confidence: 99%
“…A second press returns the end-effector to the toolbox and the arm to the HOME position. In order to communicate information between the UI and the arm, a method was developed to translate the imitation tool's potentiometer data and key functions, using a Rosbridge Websocket to send sensor_msgs/Joy message data to the arm [8]. Note that ROS runs on a Raspberry Pi 4 on both sides.…”
Section: User Interface
confidence: 99%
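The Rosbridge Websocket transport mentioned in the statement above can be sketched as follows. This is an illustrative construction of the JSON "publish" operation that the rosbridge v2 protocol expects for a sensor_msgs/Joy topic, not the cited authors' code; the topic name `/joy` and the axis/button values are assumptions.

```python
import json

def make_joy_publish(topic, axes, buttons):
    """Build a rosbridge 'publish' operation carrying a sensor_msgs/Joy message.

    The rosbridge v2 protocol accepts JSON objects of the form
    {"op": "publish", "topic": ..., "msg": ...} over its websocket.
    """
    operation = {
        "op": "publish",
        "topic": topic,
        "msg": {
            "axes": axes,        # e.g. scaled potentiometer readings
            "buttons": buttons,  # e.g. key states (0 = released, 1 = pressed)
        },
    }
    return json.dumps(operation)

# Example: two potentiometer axes and one pressed key
payload = make_joy_publish("/joy", [0.25, -0.5], [1])
```

In a real setup this string would be sent by a websocket client to the rosbridge server (by default on port 9090), which republishes it as a ROS message on the given topic.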
“…While traditional approaches are practical for industrial tasks, their limited responsiveness in the third dimension may make them less intuitive for new users and harder to pilot in situations where users do not have a direct line of sight to the robot, as in remote control [21], [22]. VR controllers allow users to direct the robot end-effector with hand gestures, a control method that is intuitive and natural to the user, enabling seamless human-robot interaction [23]. This strategy creates the impression that the robot is directly responsive to human hand movements.…”
Section: Introduction
confidence: 99%