2020
DOI: 10.48550/arxiv.2002.08196
Preprint

Federated Learning in the Sky: Joint Power Allocation and Scheduling with UAV Swarms

Abstract: Unmanned aerial vehicle (UAV) swarms must exploit machine learning (ML) in order to execute various tasks ranging from coordinated trajectory planning to cooperative target recognition. However, due to the lack of continuous connections between the UAV swarm and ground base stations (BSs), using centralized ML will be challenging, particularly when dealing with a large volume of data. In this paper, a novel framework is proposed to implement distributed federated learning (FL) algorithms within a UAV swarm tha…
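The abstract describes FL run inside the swarm itself: follower UAVs train on local data and a leading UAV aggregates their models. As a minimal, illustrative sketch of that leader/follower pattern (FedAvg-style weighted averaging over a linear model; all names and the toy data are assumptions, not taken from the paper):

```python
import numpy as np

def local_update(weights, X, y, lr=0.1, epochs=5):
    """One follower UAV: a few gradient-descent steps on its local data
    (linear model, squared loss) -- a stand-in for any local learner."""
    w = weights.copy()
    for _ in range(epochs):
        grad = 2 * X.T @ (X @ w - y) / len(y)
        w -= lr * grad
    return w

def federated_round(global_w, followers):
    """Leading UAV: collect follower models and average them,
    weighting each by its local sample count (FedAvg-style)."""
    updates, sizes = [], []
    for X, y in followers:
        updates.append(local_update(global_w, X, y))
        sizes.append(len(y))
    return np.average(updates, axis=0, weights=np.array(sizes, dtype=float))

# Toy setup: four follower UAVs, each with its own local dataset drawn
# from the same underlying linear model.
rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])
followers = []
for _ in range(4):
    X = rng.normal(size=(50, 2))
    y = X @ true_w + 0.01 * rng.normal(size=50)
    followers.append((X, y))

w = np.zeros(2)
for _ in range(20):  # communication rounds between leader and followers
    w = federated_round(w, followers)
print(np.round(w, 2))  # the aggregated model approaches true_w
```

The raw data never leaves each follower; only model parameters travel to the leader, which is the property that makes FL attractive when the swarm has no stable link to a ground BS.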

Cited by 12 publications (22 citation statements)
References 19 publications
“…Zeng et al [128] used a distributed swarm approach for the power control and scheduling of UAV flocks. In particular, they used distributed federated learning (FL) algorithms within a UAV swarm that consisted of a leading UAV and several following UAVs.…”
Section: Distributed and MA-based Methods
confidence: 99%
“…Distributed ML over wireless networks: Recent literature concerning ML by wireless networks has shifted towards federated learning [23], [24], and is mostly focused on studying the convergence and behavior of federated learning over wireless networks [9], [10], [12], [25], [26], [27], [28], [29], [30], [31], [32], [33]. Conventional federated learning assumes training a single ML model for all the engaged devices.…”
Section: Related Work
confidence: 99%
“…Each x ∈ D i (t) is a data sample containing model features and (possibly) a target variable. Different from most of the existing literature in distributed learning where the data distributions of the devices/workers are assumed to be static [9], [10], [12], [25], [26], [27], [28], [29], [30], [31], [33], we consider online model training and deployment (captured via the notion of concept/model drift in Sec. 5), where the goal is to obtain a personalized ML model for each cluster c to use for real-time inference.…”
Section: Device Clusters and Data Distributions
confidence: 99%
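The statement above contrasts static data distributions with online training under concept/model drift, where the model must keep adapting as the data-generating process changes. A minimal sketch of that idea (a single online SGD learner on a synthetic stream whose true model shifts mid-stream; the setup is illustrative, not the cited paper's method):

```python
import numpy as np

rng = np.random.default_rng(1)

def sgd_step(w, x, y, lr=0.05):
    """One online SGD step on squared loss for a linear model."""
    return w - lr * 2 * (x @ w - y) * x

w = np.zeros(2)
for t in range(2000):
    # Concept drift: the underlying model changes halfway through the stream.
    true_w = np.array([1.0, 1.0]) if t < 1000 else np.array([-1.0, 2.0])
    x = rng.normal(size=2)
    w = sgd_step(w, x, x @ true_w)

print(np.round(w, 1))  # the online learner has tracked the post-drift model
```

A batch model trained once on the first half of the stream would keep predicting with the pre-drift weights; continual updates are what let the deployed model stay useful for real-time inference.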