2022
DOI: 10.1109/tro.2022.3182503
Omni-Swarm: A Decentralized Omnidirectional Visual–Inertial–UWB State Estimation System for Aerial Swarms

Abstract: If it is the author's pre-published version, changes introduced as a result of publishing processes such as copy-editing and formatting may not be reflected in this document. For a definitive version of this work, please refer to the published version.

Cited by 51 publications (20 citation statements)
References 72 publications
“…We assume that the robots are equipped with a front-looking depth camera with limited sensing range and that they perform decentralized state estimation, using e.g. [23], to localize themselves and their peers in a common reference frame. Given the estimated relative poses, each UAV exchanges map information with nearby team members within the communication range for more informed decision-making.…”
Section: Methods
confidence: 99%
“…Both methods [13], [14] depend on preset anchors which greatly limits their applications for multivehicle cases. Xu et al [15] proposed a decentralized state estimation system, fusing stereo wide-field-of-view cameras and UWB sensors for a multi-vehicle case. Similarly, Nguyen et al [16] proposed a visual-inertial-UWB multi-vehicle localization system that loosely fuses the UWB and visual-inertial odometry data while tightly fusing all onboard sensors.…”
Section: A. Related Work
confidence: 99%
“…[22] uses optical flow sensors measuring velocity alongside a UWB and an IMU. Compared to underwater robots using Doppler anemometers [20], [21], [23], the performance of positioning with optical flow sensors may deteriorate with varying illumination or insufficient environmental textures [24]. Therefore, an effective positioning approach only using measurements from a UWB and an IMU is still essential to guarantee the robustness of the whole system [6], [15].…”
Section: A. Single-Range and Inertia Based Odometry
confidence: 99%
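The "single-range + inertia" positioning idea quoted above can be illustrated with a minimal planar EKF: the IMU acceleration drives the prediction step, and a single UWB range to one fixed anchor provides the nonlinear correction. This is a hedged sketch under assumed conditions (one known-position anchor, a constant-acceleration model, scalar noise parameters `q` and `r`), not the estimator from any of the cited papers.

```python
import numpy as np

def ekf_predict(x, P, accel, dt, q=0.1):
    """Propagate state x = [px, py, vx, vy] with IMU acceleration."""
    F = np.array([[1, 0, dt, 0],
                  [0, 1, 0, dt],
                  [0, 0, 1, 0],
                  [0, 0, 0, 1]], float)
    B = np.array([[0.5 * dt**2, 0],
                  [0, 0.5 * dt**2],
                  [dt, 0],
                  [0, dt]], float)
    x = F @ x + B @ accel
    P = F @ P @ F.T + q * np.eye(4)   # inflate covariance with process noise
    return x, P

def ekf_update_range(x, P, z, anchor, r=0.05):
    """Correct with one UWB range z to a fixed, known anchor position."""
    d = x[:2] - anchor
    pred = np.linalg.norm(d)          # predicted range h(x)
    H = np.zeros((1, 4))
    H[0, :2] = d / max(pred, 1e-9)    # Jacobian of the range w.r.t. position
    S = H @ P @ H.T + r               # innovation covariance (1x1)
    K = P @ H.T / S                   # Kalman gain (4x1)
    x = x + (K * (z - pred)).ravel()
    P = (np.eye(4) - K @ H) @ P
    return x, P
```

A single range to one anchor leaves the bearing unobservable at any instant; as the quoted works note, observability is recovered over a trajectory with sufficient excitation, which is why the IMU-driven motion model is essential.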
“…[30]- [33]. More comprehensively, an omnidirectional visual-inertial-UWB framework is further proposed for aerial swarms [24]. Compared to these implementations with optical sensors, the SRIO has a minimal hardware configuration and is still worth further investigation.…”
Section: B. Observability Based Estimation
confidence: 99%