2007
DOI: 10.1109/robot.2007.364129
Robust and Real-time Rotation Estimation of Compound Omnidirectional Sensor

Cited by 6 publications (5 citation statements) | References 13 publications

“…The motion estimation unit computes candidate rotation matrices by matching feature points between adjacent frames using RANSAC (RANdom SAmple Consensus), within an assigned time budget as illustrated in Fig. 5. After the computation, the rotation matrix with the highest goodness of fit is taken as the true rotation matrix representing the inter-frame motion [3]. The inter-frame roll, pitch, and yaw distances represented by this rotation matrix are converted into inter-frame distances of roll, x, and y.…”
Section: B Motion Estimation Unit
confidence: 99%
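The RANSAC scheme quoted above (sample correspondences, fit a rotation, keep the hypothesis with the best goodness of fit) can be sketched as follows. This is a minimal illustration, not the cited paper's implementation: the Kabsch/SVD fit, the 3-point samples, and the inlier threshold are assumptions of the sketch.

```python
import numpy as np

def fit_rotation(src, dst):
    """Least-squares rotation (Kabsch/SVD) mapping rows of src onto dst.
    src, dst: (N, 3) arrays of unit bearing vectors."""
    H = src.T @ dst
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against reflections
    return Vt.T @ np.diag([1.0, 1.0, d]) @ U.T

def ransac_rotation(src, dst, iters=200, thresh=0.05, rng=None):
    """RANSAC over minimal 3-point samples: fit a rotation per sample,
    score it by inlier count, and return the best-scoring hypothesis."""
    rng = np.random.default_rng(rng)
    n = len(src)
    best_R, best_inliers = np.eye(3), 0
    for _ in range(iters):
        idx = rng.choice(n, size=3, replace=False)
        R = fit_rotation(src[idx], dst[idx])
        # residual: distance between the predicted and observed bearings
        err = np.linalg.norm(dst - src @ R.T, axis=1)
        inliers = int((err < thresh).sum())
        if inliers > best_inliers:
            best_R, best_inliers = R, inliers
    return best_R, best_inliers
```

In a real-time setting, the loop bound `iters` plays the role of the "assigned time" mentioned in the quote: the unit returns the best hypothesis found when the budget runs out.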
“…The process of egomotion estimation naturally consists of estimating the optical flow (Gluckman & Nayar, 1998; Vassallo et al., 2002; Shakernia et al., 2003; Lim & Barnes, 2008; Gandhi & Trivedi, 2005) or feature correspondences (Svoboda et al., 1998; Lee et al., 2000; Thanh et al., 2008), and then extracting the 3D camera motion from the 2D information computed in the images. In omnidirectional vision, several methods proposed in the last few years are concerned with egomotion estimation from the motion field.…”
Section: Introduction
confidence: 99%
“…Lee et al. (2000) adapt the motion estimation method to large motions by using a novel Recursive Rotation Factorization (RRF) that removes the image motions due to rotation. More recently, Thanh et al. (2008) detect image features in omnidirectional images using a conventional feature detector and then classify them into near and far features. Rotation is recovered using the far features, and translation is then estimated from the near features using the estimated rotation.…”
Section: Introduction
confidence: 99%
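The two-stage near/far decomposition described above (rotation from far features, then translation from near features) can be sketched roughly as follows. The concrete solvers are assumptions of the sketch, not the authors' formulation: a Kabsch/SVD fit exploits the fact that translation barely affects the bearings of far features, and a linear epipolar solve recovers the translation direction from near features once rotation is removed.

```python
import numpy as np

def estimate_rotation_translation(far_src, far_dst, near_src, near_dst):
    """Two-stage egomotion sketch. All inputs are (N, 3) unit bearing
    vectors observed in the first (src) and second (dst) frames."""
    # Stage 1: rotation from far features only. For distant points the
    # translation-induced bearing change is negligible, so dst ~ R src.
    H = far_src.T @ far_dst
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    # Stage 2: translation direction from near features. The epipolar
    # constraint q . (t x R p) = 0 is linear in t; each correspondence
    # contributes the row (R p) x q, and t is the null vector of A.
    rotated = near_src @ R.T
    A = np.cross(rotated, near_dst)
    _, _, Vt2 = np.linalg.svd(A)
    t_dir = Vt2[-1]  # unit vector; sign and scale remain ambiguous
    return R, t_dir
```

As with any monocular epipolar method, only the direction of the translation is recoverable; its scale (and sign, in this sketch) is ambiguous.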
“…Such jerky motion degrades the quality of RPs. Video stabilization [6, 8] by matching or tracking 2D points, lines, or regions and computing optical flow has been explored. However, such methods have not produced perfect results for a vehicle moving through a city.…”
Section: Introduction
confidence: 99%