2014 IEEE-RAS International Conference on Humanoid Robots
DOI: 10.1109/humanoids.2014.7041357
Humanizing NAO robot teleoperation using ROS

Cited by 31 publications (14 citation statements). References 12 publications.
“…During calibration, a complex reference coordination between the Kinect and robot coordinate frames requires a lot of code and skills that therapists or educators do not have. Often the operator stays at a predefined area in front of the Kinect view [7], or complex transformation matrices are used for calibration between the Kinect and NAO coordinate systems, since different areas of the input space require different compensations and scale independence.…”
Section: Existing Solutions (mentioning)
confidence: 99%
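The calibration issue raised in the statement above, mapping Kinect skeleton coordinates into the NAO frame, can be made concrete with a small sketch. The axis conventions, offsets, and scale value below are illustrative assumptions, not the calibration used in the cited works; the sketch only shows the general form of a scaled homogeneous transform between the two coordinate systems.

# Minimal sketch (not the implementation from the cited works): expressing a
# Kinect joint position in the NAO base frame via one scaled homogeneous
# transform. The rotation, translation, and scale values are illustrative
# placeholders, not calibrated constants.
import numpy as np

# Hypothetical calibration result: Kinect optical frame -> robot base frame.
R = np.array([[0.0, 0.0, 1.0],    # robot +X taken from Kinect +Z (forward)
              [-1.0, 0.0, 0.0],   # robot +Y taken from Kinect -X (left)
              [0.0, 1.0, 0.0]])   # robot +Z taken from Kinect +Y (up)
t = np.array([0.0, 0.0, -0.3])    # Kinect origin offset in metres (assumed)
scale = 0.25                      # rough human-to-NAO limb ratio (assumed)

T = np.eye(4)
T[:3, :3] = scale * R
T[:3, 3] = t

def kinect_to_robot(p_kinect):
    # Map a 3-D joint position from the Kinect frame to the robot frame.
    p_h = np.append(p_kinect, 1.0)   # homogeneous coordinates
    return (T @ p_h)[:3]

# Example: a hand tracked 0.6 m in front of and 0.2 m above the sensor.
print(kinect_to_robot(np.array([0.0, 0.2, 0.6])))

A single global transform of this form is exactly what the quoted statement argues is insufficient when different regions of the input space need different compensations; it is included only to make the structure of the calibration explicit.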
“…The robot was required to give some explanations about itself, and to produce a verse given the rhymes. The robot was teleoperated by human gestures captured by a Kinect sensor (see [27]); NAO gesticulated while chatting, and moved around the stage according to the teleoperator commands (video available at [25]). Fig.…”
Section: Body Language Development (mentioning)
confidence: 99%
“…A real-time human imitation system based on non-invasive image processing techniques has been proposed in [3], but the authors use input coming from RGB-D images. A Microsoft Kinect has been used as an optical motion capture sensor for arm control in [22,27]. Head pose angles have been estimated using the Kinect for a teleoperation scenario for the Furhat robot head in [28], while a learning scenario for people with autism spectrum disorder has been proposed in [29].…”
Section: Introduction (mentioning)
confidence: 99%