2014 IEEE International Conference on Robotics and Automation (ICRA)
DOI: 10.1109/icra.2014.6906976
Ambient motion estimation in dynamic scenes using wearable visual-inertial sensors

Abstract: This paper proposes a method to estimate the motion of ambient objects, including their translational and rotational velocities, from a moving observer equipped with hybrid visual-inertial sensors. Ambient motion is recovered from visual optical flow, which encodes both ego and ambient dynamics. Each moving object is treated as a rigid body that has been segmented from the background using computer vision algorithms. In motion recovery, the fundamental challenge is to resolve the coupling between scene depths and tr…
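The depth/translation coupling named in the abstract can be seen in the standard pinhole instantaneous-flow model: the rotation-induced flow component is depth-independent (and thus removable using IMU gyro readings), while the translation-induced component is scaled by inverse depth. The sketch below illustrates this decomposition under the usual normalized-coordinate convention; it is an assumption-laden illustration, not the paper's actual formulation, and the function names are hypothetical.

```python
import numpy as np

def rotational_flow(x, y, omega):
    """Rotation-induced optical flow at normalized image point (x, y) for a
    pinhole camera rotating with angular velocity omega = (wx, wy, wz).
    This component does not depend on scene depth."""
    wx, wy, wz = omega
    u = x * y * wx - (1.0 + x * x) * wy + y * wz
    v = (1.0 + y * y) * wx - x * y * wy - x * wz
    return np.array([u, v])

def translational_flow(x, y, t, depth):
    """Translation-induced flow, scaled by inverse depth 1/Z. This 1/Z factor
    is the source of the depth/translation coupling: doubling depth and
    doubling translation produce the same flow."""
    tx, ty, tz = t
    u = (-tx + x * tz) / depth
    v = (-ty + y * tz) / depth
    return np.array([u, v])

# Ego-rotation (measured by the gyro) can be subtracted from the observed
# flow; the residual mixes ego/ambient translation with unknown depth.
x, y = 0.1, 0.2
omega = (0.0, 0.0, 1.0)
measured = rotational_flow(x, y, omega) + \
           translational_flow(x, y, (0.0, 0.0, 0.5), depth=4.0)
residual = measured - rotational_flow(x, y, omega)
```

With the rotational part removed, `residual` depends only on the translation-to-depth ratio, which is why additional constraints (rigid-body segmentation, inertial cues) are needed to recover metric velocities.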

Cited by 4 publications (5 citation statements); references 14 publications (19 reference statements).
“…This will be beneficial for improving performance in recognizing gestures that have similar trajectories but different hand shapes. We need to study motion estimation in dynamic 3D scenes [51,52] to enable the robot to recognize gestures while in motion. We also plan to design a real-time HRI system for practical application.…”
Section: Discussion
confidence: 99%
“…In this paper, we predict the corresponding features based on attitude estimates from the onboard IMU. The movement of features of interest is predicted as in Eq. (19), in terms of the pixel position of a feature and the feature movement caused by camera rotation [2], given by Eq. (20).…”
Section: A. Active Feature Search
confidence: 99%
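The excerpt above predicts feature motion from IMU attitude alone. For a pure camera rotation, pixel motion is depth-independent and is given by the infinite homography K R K⁻¹. The following minimal sketch (with an assumed intrinsic matrix K; names are hypothetical, not from the cited paper) shows this prediction step:

```python
import numpy as np

def predict_feature(p, K, R):
    """Predict where pixel p moves under a pure camera rotation R using the
    infinite homography K @ R @ inv(K). Depth plays no role for rotation,
    so an IMU attitude estimate alone suffices for this prediction."""
    ph = np.array([p[0], p[1], 1.0])      # homogeneous pixel coordinates
    q = K @ R @ np.linalg.inv(K) @ ph
    return q[:2] / q[2]                   # back to inhomogeneous pixels

# Assumed pinhole intrinsics for illustration only.
K = np.array([[500.0,   0.0, 320.0],
              [  0.0, 500.0, 240.0],
              [  0.0,   0.0,   1.0]])
```

A predicted position like this narrows the search window for feature matching in the next frame, which is the point of active feature search.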
“…Similarly, a motion hazard alarm is given when an object is running toward the blind user. Relative motion is analyzed by fusing video input and IMU measurements [2].…”
Section: A. System Setup
confidence: 99%
“…Similarly, a motion hazard alarm is given when an object is moving toward the wearer. Relative motion is analyzed by fusing video inputs and IMU measurements [12]. The other supporting interactive functions are implemented as on-call services.…”
Section: Wearable Blind Navigator
confidence: 99%