2022
DOI: 10.1109/lra.2022.3171096
Agile Formation Control of Drone Flocking Enhanced With Active Vision-Based Relative Localization


Search citation statements

Order By: Relevance

Paper Sections

Select...
2
2
1

Citation Types

0
12
0

Year Published

2022
2022
2024
2024

Publication Types

Select...
6
2

Relationship

0
8

Authors

Journals

Cited by 21 publications (14 citation statements)
References 30 publications
“…As a result, robots may have collisions with each other and have chaos in the swarm [16]. Besides, position measurement drifts may also cause robots to reach wrong destinations [20], [21].…”
Section: Start Goal (mentioning)
confidence: 99%
“…In this paper, the robust curve virtual tube passing through problem is summarized and solved. To achieve collision avoidance among robots, a traditional method is to share self-observation positions with the known initial positions among robots, which suffers heavily from the position measurement drift and communication uncertainties [21]. The controller proposed in this paper can work autonomously without wireless communication and other robots' IDs, whose premise is that all robots have omnidirectional relative localization equipment to achieve precise relative navigation, namely robots can get their neighboring robots' relative position and relative velocity precisely.…”
Section: Start Goal (mentioning)
confidence: 99%
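For intuition only, the sketch below shows one way a collision avoidance command can be built purely from onboard relative localization, i.e. from neighbors' relative positions and relative velocities, with no shared global positions and no wireless communication. This is not the controller proposed in the cited work; the function name, gains, and safety radius are illustrative assumptions.

# Minimal sketch (not the cited controller): a repulsive velocity term computed
# only from relative measurements that an onboard relative-localization sensor
# would provide. All gains and the safety radius are assumed values.
import numpy as np

def avoidance_velocity(rel_positions, rel_velocities,
                       safe_radius=1.0, gain_p=1.5, gain_d=0.5):
    """Return a repulsive velocity command from relative measurements.

    rel_positions  : (N, 2) array, neighbor position minus own position [m]
    rel_velocities : (N, 2) array, neighbor velocity minus own velocity [m/s]
    """
    cmd = np.zeros(2)
    for p_rel, v_rel in zip(rel_positions, rel_velocities):
        dist = np.linalg.norm(p_rel)
        if dist < 1e-6 or dist >= safe_radius:
            continue  # neighbor far enough away (or degenerate measurement)
        direction = -p_rel / dist                 # unit vector pointing away from the neighbor
        closing_speed = float(v_rel @ direction)  # > 0 when the gap is shrinking
        # Repulsion grows as the gap shrinks; the damping term reacts to closing speed.
        cmd += (gain_p * (safe_radius - dist) + gain_d * max(closing_speed, 0.0)) * direction
    return cmd

# Example: one neighbor 0.6 m ahead and approaching at 0.3 m/s.
print(avoidance_velocity(np.array([[0.6, 0.0]]), np.array([[-0.3, 0.0]])))

Because every quantity in the command is a relative measurement, the same computation runs on each robot independently, which is the property the citation statement attributes to communication-free operation.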
“…When extra sensors, e.g., camera, visual-inertial odometry (VIO), etc., are included alongside the UWB [2], [31], as observability is commonly not considered a main issue when sensor failures are ignored, the sliding window size and the computational efficiency have not been well considered in existing work [32], [39], [40].…”
Section: Sliding Window Filtering (mentioning)
confidence: 99%
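As a rough illustration of why the window size bounds per-update cost in sliding window filtering, the snippet below keeps only the newest UWB/VIO measurements in a fixed-length buffer. It is a sketch under assumed names and a made-up window length, not the estimator used in the cited works.

# Illustrative sketch only: a fixed-size sliding window of timestamped UWB
# ranges and VIO poses. The window length and field names are assumptions.
from collections import deque
from dataclasses import dataclass

@dataclass
class Measurement:
    stamp: float   # time [s]
    source: str    # "uwb" or "vio"
    value: tuple   # range [m] for UWB, pose components for VIO

class SlidingWindow:
    def __init__(self, max_size: int = 20):
        # deque with maxlen drops the oldest entry automatically, so any
        # optimization built over the window stays bounded in size.
        self.buffer = deque(maxlen=max_size)

    def add(self, m: Measurement) -> None:
        self.buffer.append(m)

    def snapshot(self):
        # Measurements currently inside the window, oldest first.
        return list(self.buffer)

window = SlidingWindow(max_size=5)
for k in range(8):
    window.add(Measurement(stamp=0.1 * k,
                           source="uwb" if k % 2 else "vio",
                           value=(float(k),)))
print(len(window.snapshot()))  # 5: only the newest measurements are kept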
“…In addition to the traditional algorithms mentioned above, the rapid development of deep learning has enabled it to be applied in many fields [14][15][16][17][18], including the field of UAVs. Vision-based UAV formation maintenance has improved the performance of these algorithms, but still faces the problem of huge communication requirements [19,20] in scenarios with electromagnetic silence.…”
Section: Introduction (mentioning)
confidence: 99%