IEEE INFOCOM 2019 - IEEE Conference on Computer Communications 2019
DOI: 10.1109/infocom.2019.8737464
Federated Learning over Wireless Networks: Optimization Model Design and Analysis

Cited by 757 publications (463 citation statements)
References 10 publications
“…In addition to this, the framework may suffer from slower convergence due to lower participation. Thus, the MEC server will avoid deliberately dropping clients in order to achieve faster consensus with (22). Furthermore, using the relationship defined in (19) between x(ε) and the relative local accuracy θ for the subproblem, we can analyze the impact of the responses θ on the MEC server's utility in an FL setting under constraint (11).…”
Section: MEC Server Utility Model
confidence: 99%
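Since equations (11), (19), and (22) are not reproduced in this excerpt, the following is a minimal sketch of the kind of relationship the statement invokes, assuming the convergence bound typically used in this paper's line of analysis (the exact form of (19) in the citing work may differ):

\[
x(\epsilon) \;\propto\; \frac{\log(1/\epsilon)}{1-\theta}, \qquad \theta \in [0,1),
\]

where ε is the target global accuracy and θ the relative local accuracy. A looser local solve (larger θ) inflates the number of global rounds x(ε), which is why the server's utility trades client participation against convergence speed.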
“…The utility maximization problem in (21) is a convex optimization problem whose optimal solution can be obtained via Lagrangian duality. The Lagrangian of (21) is formed with λ ≥ 0 as the Lagrangian multiplier for constraint (22). Taking the first-order derivatives of (A.1) with respect to x(ε) and λ, the KKT conditions are expressed as follows:…”
Section: Appendix A: KKT Solution
confidence: 99%
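Because (21), (22), and (A.1) themselves are not shown, here is a minimal sketch of the pattern in LaTeX, assuming a generic concave utility U(x) maximized subject to a single inequality constraint g(x) ≤ 0:

\[
\mathcal{L}(x,\lambda) \;=\; U(x) \;-\; \lambda\, g(x), \qquad \lambda \ge 0,
\]

with KKT conditions

\[
\frac{\partial \mathcal{L}}{\partial x}\Big|_{x^*} = U'(x^*) - \lambda^* g'(x^*) = 0, \qquad
\lambda^* g(x^*) = 0, \qquad g(x^*) \le 0, \qquad \lambda^* \ge 0.
\]

Solving the stationarity condition jointly with complementary slackness yields the closed-form optimum the excerpt refers to.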
“…Several studies have considered resource optimization in federated learning [14], [17]–[19]. For example, in [17], resource optimization and an incentive mechanism for federated learning at the network edge were presented.…”
Section: Related Work
confidence: 99%
“…Conversely, federated learning (especially suitable for mobile networks) aims to distribute data and computation tasks among federated devices coordinated by a central server. The server is in charge of combining the local models into a common neural network, which is based on, and updated according to, the local datasets [45]. Finally, the goal of (deep) transfer learning is to apply knowledge from one task to another in a related context, reducing the amount of data required for training and validating new models [46].…”
Section: Machine Learning Principles and Techniques
confidence: 99%
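To make the server-side aggregation described above concrete, here is a minimal FedAvg-style sketch in Python/NumPy on a synthetic linear-regression federation. The function names, learning rate, and round counts are illustrative assumptions, not code from the cited works:

import numpy as np

rng = np.random.default_rng(0)

def local_sgd(w, X, y, lr=0.1, epochs=5):
    # Client-side step: a few epochs of full-batch gradient descent
    # on the local least-squares loss ||Xw - y||^2 / (2n).
    for _ in range(epochs):
        w = w - lr * (X.T @ (X @ w - y)) / len(y)
    return w

def fedavg_round(w_global, clients):
    # Server-side step: each client refines the broadcast model locally;
    # the server averages the results, weighted by local dataset size.
    total = sum(len(y) for _, y in clients)
    updates = [local_sgd(w_global.copy(), X, y) for X, y in clients]
    return sum((len(y) / total) * w for w, (_, y) in zip(updates, clients))

# Synthetic federation: three clients drawing data from a shared true model.
w_true = np.array([2.0, -1.0])
clients = []
for n in (50, 80, 120):
    X = rng.normal(size=(n, 2))
    clients.append((X, X @ w_true + 0.1 * rng.normal(size=n)))

w = np.zeros(2)
for _ in range(20):  # global communication rounds
    w = fedavg_round(w, clients)
print(w)  # approaches w_true as rounds accumulate

The dataset-size weighting is the standard FedAvg rule; incentive-aware designs like those in the excerpts above would additionally adjust the clients' participation or local accuracy θ rather than aggregate by dataset size alone.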