Proceedings of the British Machine Vision Conference 2002
DOI: 10.5244/c.16.77
Tightly Integrated Sensor Fusion for Robust Visual Tracking

Abstract: This paper presents novel methods for increasing the robustness of visual tracking systems by incorporating information from inertial sensors. We show that more can be achieved than simply combining the sensor data within a statistical filter. In particular we show how, in addition to using inertial data to provide predictions for the visual sensor, this data can also be used to provide an estimate of motion blur for each feature and this can be used to dynamically tune the parameters of each feature detector …
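The blur-aware detector tuning described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: the function names, the small-angle blur model (blur length ≈ focal length × angular rate × exposure time), and the threshold falloff rule are assumptions made here for clarity.

```python
def blur_magnitude_px(omega_rad_s, exposure_s, focal_px):
    """Approximate motion-blur length in pixels for a camera rotating at
    omega_rad_s (rad/s) about an axis perpendicular to the optical axis.
    Small-angle model: image points sweep roughly
    focal_px * omega * exposure pixels during the exposure."""
    return focal_px * abs(omega_rad_s) * exposure_s

def tuned_edge_threshold(base_threshold, blur_px, falloff=0.5):
    """Hypothetical per-feature tuning rule: relax the gradient threshold
    as predicted blur grows, since blur spreads and weakens edge
    gradients. The 1 / (1 + falloff * blur) form is an assumption."""
    return base_threshold / (1.0 + falloff * blur_px)

# A 1 rad/s rotation seen through a 500 px focal length over a 20 ms
# exposure smears features by about 10 px.
blur = blur_magnitude_px(1.0, 0.02, 500.0)    # -> 10.0
threshold = tuned_edge_threshold(60.0, blur)  # -> 10.0
```

The key point is only that the inertial rate measurement, available before the frame is processed, lets each feature's detector be tuned per-frame rather than using one fixed setting.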

Cited by 35 publications (48 citation statements). References 16 publications.
“…For a start, they are one-dimensional, and so, locally, is motion blur: one would therefore expect that even in a heavily blurred image, some edge features (those parallel to the local direction of blur) may remain intact. Further, even edges which are affected by blur can be tracked: Klein and Drummond [8] demonstrated that blurred edges can be used for tracking in real time if an estimate for the magnitude of motion blur is known a priori. This motivates us to include intensity edges in the SLAM map.…”
Section: Real-time Monocular SLAM Implementation Was Demonstrated By
confidence: 99%
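The excerpt's observation that edges parallel to the blur direction stay sharp can be made concrete: only the component of the blur vector along the edge normal degrades the edge. The following sketch (function name and projection model are my own, assumed for illustration) computes that effective blur:

```python
import math

def effective_blur_px(blur_vec, edge_normal):
    """Project the image-space blur vector onto the edge normal.
    An edge whose normal is perpendicular to the blur direction
    (i.e. the edge runs parallel to the blur) sees zero effective
    blur and remains trackable even in a heavily blurred frame."""
    bx, by = blur_vec
    nx, ny = edge_normal
    return abs(bx * nx + by * ny) / math.hypot(nx, ny)

# Horizontal blur of 12 px:
horizontal_edge = effective_blur_px((12.0, 0.0), (0.0, 1.0))  # -> 0.0
vertical_edge = effective_blur_px((12.0, 0.0), (1.0, 0.0))    # -> 12.0
```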
“…Rapid unmodelled accelerations cause problems for data association and can further lead to the use of an incorrect blur estimate for edge tracking, for example when the camera rotates suddenly from rest. In [8] Klein and Drummond employed inertial sensors to provide a rotation prediction for each frame to work around this problem. Here we propose an alternative: in Section 6 we describe a procedure to estimate inter-frame rotation by full-frame direct minimisation.…”
Section: Real-time Monocular SLAM Implementation Was Demonstrated By
confidence: 99%
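The full-frame direct minimisation mentioned in the excerpt can be illustrated with a one-dimensional analogue: for a small rotation about an axis perpendicular to the optical axis, the induced image motion is approximately a uniform translation, so aligning whole frames by exhaustive sum-of-squared-differences search recovers it. This toy sketch is my own simplification, not the cited procedure:

```python
import numpy as np

def estimate_shift(prev, curr, max_shift=8):
    """Exhaustively search the integer shift s minimising the mean
    squared difference between curr and prev shifted by s, comparing
    only the overlapping samples. A 1-D stand-in for whole-image
    rotational alignment (shift ~ focal_px * rotation angle)."""
    n = len(prev)
    best_s, best_cost = 0, float("inf")
    for s in range(-max_shift, max_shift + 1):
        i0, i1 = max(0, s), min(n, n + s)
        diff = curr[i0:i1] - prev[i0 - s:i1 - s]
        cost = float(np.mean(diff ** 2))
        if cost < best_cost:
            best_s, best_cost = s, cost
    return best_s

# Synthetic frame pair: curr is prev translated right by 3 samples.
rng = np.random.default_rng(0)
prev = rng.standard_normal(200)
curr = np.concatenate([np.zeros(3), prev[:-3]])
shift = estimate_shift(prev, curr)  # -> 3
```

A real implementation would search over rotations of the full image (and typically use a coarse-to-fine pyramid), but the cost-minimisation structure is the same.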
“…In addition, visual perception is highly challenged in many cases: occlusions, cluttered backgrounds, and image blur caused by fast motion of either objects or the camera. To overcome these limitations of visual perception, it is often combined with motion estimation [13] or tactile sensing [1,10]. Skotheim et al [22] manipulation.…”
Section: For Precise Details
confidence: 99%