Proceedings. 11th IEEE International Workshop on Robot and Human Interactive Communication
DOI: 10.1109/roman.2002.1045668
The MORPHA style guide for icon-based programming

Cited by 39 publications (16 citation statements)
References 1 publication
“…In particular, it has found good applications in robotics to improve HRI, such as teleoperation [5][6][7] and industrial operations [8,9], where AR can assist operators in planning and simulating a task by interacting with the spatial environment prior to actual execution. AR-based interfaces have been reported to help maintain situational awareness [7] and to facilitate different levels of HRI: understanding the robot's perception of the world during debugging and development of robot programs [10], extending the operator's perception of the real environment during robotic task planning and manipulation [11,12], and integrating various interaction modalities in the localization, navigation and planning of mobile robots [13,14]. In most cases, AR-based human-robot interfaces let operators visualize virtual information and the real-world environment simultaneously, where the virtual elements serve as visual cues and enhancements for a better understanding of the environment during robot task planning and execution [6,12,13].…”
Section: Introduction
confidence: 99%
“…human gestures) [8,43] have been employed to teach assembly-line robots. However, these methods required the user either to demonstrate an optimal trajectory or to interactively show the complete sequence of actions, which the robot then stored for future use.…”
Section: Related Work
confidence: 99%
“…Other well-known languages are LabVIEW, UML, MATLAB/Simulink and RCX. Using a touch screen as the input device, icon-based programming languages such as in [24] can also lower the barrier to robot programming. There are also experimental systems that use a human programmer's gestures to point out intended robot locations [25].…”
Section: Related Work
confidence: 99%
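The citation statements above describe icon-based programming as a way to lower the barrier to robot programming: the user composes a task by arranging icons, each of which stands for a parameterized robot primitive. The following is a minimal illustrative sketch of that idea, not the MORPHA system itself; the icon names, primitives, and command strings are all assumptions made for the example.

```python
# Hypothetical sketch of icon-based robot programming: an icon sequence is
# compiled into low-level robot commands. Names and primitives are invented
# for illustration and do not reflect the MORPHA style guide's actual icon set.
from dataclasses import dataclass, field


@dataclass
class Icon:
    """One icon in the user's program: a primitive name plus its parameters."""
    name: str
    params: dict = field(default_factory=dict)


# Mapping from icon name to a function producing a robot command string.
PRIMITIVES = {
    "move_to": lambda p: f"MOVE {p['x']} {p['y']} {p['z']}",
    "grasp":   lambda p: f"GRASP width={p.get('width', 0.05)}",
    "release": lambda p: "RELEASE",
}


def compile_icons(program):
    """Translate an icon sequence into a list of low-level commands."""
    return [PRIMITIVES[icon.name](icon.params) for icon in program]


# Example program: pick an object up at one pose and release it.
program = [
    Icon("move_to", {"x": 0.3, "y": 0.1, "z": 0.2}),
    Icon("grasp", {"width": 0.04}),
    Icon("release"),
]
for command in compile_icons(program):
    print(command)
```

The point of the sketch is the separation it shows: the user only arranges and parameterizes icons, while the mapping to executable robot commands is handled by the compiler, which is what makes such interfaces accessible to non-programmers.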