2013 IEEE/RSJ International Conference on Intelligent Robots and Systems
DOI: 10.1109/iros.2013.6696524

Mutual localization: Two camera relative 6-DOF pose estimation from reciprocal fiducial observation

Abstract: Concurrently estimating the 6-DOF pose of multiple cameras or robots (cooperative localization) is a core problem in contemporary robotics. Current works focus on a set of mutually observable world landmarks and often require inbuilt egomotion estimates; situations in which both assumptions are violated often arise, for example, robots with erroneous, low-quality odometry and IMU exploring an unknown environment. In contrast to these existing works in cooperative localization, we propose a cooperative lo…
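To make the underlying geometry concrete, the following is a minimal sketch of how a relative 6-DOF pose can be composed once one camera has recovered the pose of a fiducial rigidly mounted on the other robot and the fiducial's mounting offset in that robot's camera frame is known from calibration. This is not the paper's reciprocal-observation algorithm; the transform names (T_A_fidB, T_B_fidB) and the NumPy implementation are illustrative assumptions only.

# Illustrative sketch (assumed setup, not the paper's method):
# T_A_fidB : pose of a fiducial mounted on robot B, expressed in camera A's frame
#            (e.g. from a standard planar-marker pose estimator)
# T_B_fidB : calibrated pose of that same fiducial in camera B's own frame
import numpy as np

def invert_se3(T):
    # Invert a 4x4 homogeneous transform using the rigid-body structure.
    R, t = T[:3, :3], T[:3, 3]
    Ti = np.eye(4)
    Ti[:3, :3] = R.T
    Ti[:3, 3] = -R.T @ t
    return Ti

def relative_pose(T_A_fidB, T_B_fidB):
    # Pose of camera B expressed in camera A's frame: T_A_B = T_A_fidB * inv(T_B_fidB)
    return T_A_fidB @ invert_se3(T_B_fidB)

# Toy example: fiducial seen 2 m in front of camera A, mounted 10 cm above camera B
# (identity rotations for simplicity).
T_A_fidB = np.eye(4); T_A_fidB[:3, 3] = [0.0, 0.0, 2.0]
T_B_fidB = np.eye(4); T_B_fidB[:3, 3] = [0.0, -0.10, 0.0]
print(relative_pose(T_A_fidB, T_B_fidB))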

Cited by 17 publications (8 citation statements). References 31 publications (51 reference statements).
“…Numerous experimental solutions based on vision and mutual observation of UAVs and UGVs equipped with known geometrical markers were tested [16], [17]. Our previous solution uses circular visual markers [2], [3] for mutual localization in small swarms of UAVs [5], [6] and in heterogeneous formations [4].…”
Section: A. State of the Art and Contributions
confidence: 99%
“…Initially the problem was analytically solved in 2D [32], and then the analytical solution was extended to 3D [13]. At the same time, Dhima et al. [33] produced a numerical solution, clearly a more computationally expensive and less accurate formulation. The analytical solution was further used to assist the flying formation of quadrotors [8].…”
Section: Related Work
confidence: 99%
“…In an indoor-only setup, color-based markers can be used, see [9], [10], which are easy to segment under controlled lighting conditions, but not in the extremely unpredictable lighting conditions and the multicolored outdoor environment. For outdoors, black-and-white markers are preferred, leading to solutions which combine passive markers and object detection, see [11], [12], used for swarms [2], [13] and heterogeneous groups of robots [14], [15], [16]. The drawbacks of these approaches are the need for large markers, computational complexity, and sensitivity to lighting conditions.…”
Section: Introduction
confidence: 99%