2017
DOI: 10.18502/keg.v2i2.603
Depth Estimation of an Underwater Object Using a Single Camera

Abstract: Underwater robotics is currently a growing field. Autonomously finding and collecting objects is a complicated problem on land and in the air, and it is only compounded in the underwater setting. Different techniques have been developed over the years to attempt to solve this problem, many of which involve the use of expensive sensors. This paper explores a method to find the depth of an object within the underwater setting, using a single camera source and a known object. Once this kno…

Cited by 3 publications (2 citation statements)
References 7 publications (8 reference statements)
“…For this study, the robot uses the camera sensor, as it has been shown in the literature that cameras can be used to get the distance and position of objects relative to themselves [38,74].…”
Section: Robotic Fish
confidence: 99%
“…The first is by using two cameras next to one another and calculating the distance of an object within the two camera feeds. A single-camera approach offers an alternative solution that, despite its complexity, enables distance estimation by leveraging pixel density and object size information [74]. For the simulation, the distance to an object was determined using the ray cast function within Unity to reduce the computation and complexity.…”
Section: Swarm Algorithm
confidence: 99%
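The single-camera approach the citing paper describes (estimating distance from the apparent pixel size of an object of known real-world size) can be sketched with the standard pinhole camera model. This is an illustrative sketch only, not the exact method of the cited paper; the function name and parameters are assumptions for the example.

```python
# Single-camera distance estimation from a known object size,
# using the pinhole camera model: distance = real_width * f_px / pixel_width.
# This is a simplified sketch; real underwater use would also need to
# account for refraction and lens distortion.

def estimate_distance(real_width_m, focal_length_px, perceived_width_px):
    """Estimate distance (metres) to an object of known real-world width.

    real_width_m       -- actual object width in metres (known a priori)
    focal_length_px    -- camera focal length expressed in pixels
    perceived_width_px -- object width as measured in the image, in pixels
    """
    if perceived_width_px <= 0:
        raise ValueError("perceived width must be positive")
    return real_width_m * focal_length_px / perceived_width_px

# Example: a 0.30 m target imaged 150 px wide by a camera with an
# 800 px focal length is estimated to be 1.6 m away.
print(estimate_distance(0.30, 800, 150))
```

As the object moves farther away it subtends fewer pixels, so the estimated distance grows inversely with the measured pixel width, which is why the approach needs the object's true size (or a calibrated reference) in advance.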