2007
DOI: 10.3182/20070919-3-hr-3904.00065
Real-Time Vision Based AUV Navigation System Using a Complementary Sensor Suite

Abstract: This paper proposes a real-time navigation system for an AUV that takes advantage of the complementary performance of a sensor suite including a DVL, a compass, a depth sensor and altimeter sensors with a feature based motion estimator using vision. To allow for real-time performance of the vision based motion estimator a simple but fast correlation algorithm is used for feature matching. The compass and the depth sensors are used to bound the drift of the heading and depth estimations respectively. The altime…
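The abstract describes a "simple but fast correlation algorithm" for feature matching but does not specify it further. A minimal sketch of one common choice, normalized cross-correlation (NCC) over a bounded search window, illustrates the idea; the patch size, search radius, and function names here are illustrative assumptions, not the paper's implementation:

```python
import numpy as np

def ncc(patch, window):
    """Normalized cross-correlation of a patch against an equal-sized window."""
    p = patch - patch.mean()
    w = window - window.mean()
    denom = np.sqrt((p * p).sum() * (w * w).sum())
    return (p * w).sum() / denom if denom > 0 else 0.0

def match_feature(frame_a, frame_b, y, x, half=4, search=8):
    """Find the displacement (dy, dx) of the patch centred at (y, x) in
    frame_a that best matches frame_b within a +/- search pixel window."""
    patch = frame_a[y - half:y + half + 1, x - half:x + half + 1]
    best_score, best_d = -1.0, (0, 0)
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            yy, xx = y + dy, x + dx
            window = frame_b[yy - half:yy + half + 1, xx - half:xx + half + 1]
            if window.shape != patch.shape:
                continue  # skip candidates that run off the image border
            score = ncc(patch, window)
            if score > best_score:
                best_score, best_d = score, (dy, dx)
    return best_d, best_score
```

Tracking such per-feature displacements between consecutive frames is what turns image correspondence into a motion estimate; the brute-force search above is what keeps the method "simple but fast" for small windows.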

Cited by 6 publications (2 citation statements)
References 7 publications (8 reference statements)
“…The work of the authors in vision based UUV navigation, with the addition of a Scale Invariant Feature Transform (SIFT) based motion estimation algorithm to improve image correspondence and the use of the latest Graphical Processing Unit (GPU) technology in order to perform in real-time, is reported in Horgan et al, 2007.…”
Section: Introduction
confidence: 99%
“…The use of inertial measurements also reduces the amount of visual information that must be extracted by the vision system, resulting in a simpler and more robust solution (Huster et al 2002). While vision based motion estimation techniques rely on the fusion of altitude measurements from sensors to estimate metric displacement (Cufi et al 2002), Eustice also takes advantage of other sensor information (attitude) in order to overcome many of the challenging issues involved in visual SLAM based navigation in an unstructured environment. The authors of this chapter, Horgan et al, propose a real-time navigation system for a UUV that takes advantage of the complementary performance of a sensor suite including a DVL, a compass, a depth sensor and altimeter sensors with a feature based motion estimator using vision (Horgan et al 2007). The compass and the depth sensors are used to bound the drift of the heading and depth estimations respectively.…”
Section: Navigation Using Sensor Fusion
confidence: 99%
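The excerpt above notes that the compass is used to bound the drift of the heading estimate. One standard way to realize this is a complementary filter: integrate the relative (dead-reckoned) yaw rate, then pull the estimate toward the absolute compass reading. The gain value and helper names below are illustrative assumptions, not the cited system's design:

```python
import math

def wrap_pi(angle):
    """Wrap an angle to the interval (-pi, pi]."""
    return math.atan2(math.sin(angle), math.cos(angle))

def fuse_heading(heading_est, yaw_rate, compass_heading, dt, k=0.05):
    """One complementary-filter step: dead-reckon the heading from the
    yaw rate, then correct a fraction k of the compass innovation, so
    integration drift stays bounded instead of growing without limit."""
    predicted = heading_est + yaw_rate * dt           # dead-reckoned update
    error = wrap_pi(compass_heading - predicted)      # absolute-reference innovation
    return wrap_pi(predicted + k * error)
```

With a constant yaw-rate bias b, the pure integral drifts linearly, while the filtered error settles near b * dt * (1 - k) / k; the same structure applied to the depth sensor bounds the depth estimate.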