GLOBECOM 2022 - 2022 IEEE Global Communications Conference
DOI: 10.1109/globecom48099.2022.10000612
Communication-Efficient and Privacy-Preserving Feature-based Federated Transfer Learning

Abstract: Federated learning (FL) is an emerging paradigm for training deep neural networks (DNNs) in a distributed manner. Current FL approaches suffer from high communication overhead and information leakage. In this work, we present a federated learning algorithm based on evolution strategies (FedES), a zeroth-order training method. Instead of transmitting model parameters, FedES only communicates loss values, and thus has very low communication overhead. Moreover, a third party is unable to estimate gradients wit…
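The abstract describes the core mechanism at a high level: clients evaluate their local loss on parameter perturbations that can be regenerated from a shared random seed, and upload only the resulting scalar loss values, from which the server forms a zeroth-order (evolution-strategies) gradient estimate. Below is a minimal sketch of that idea, assuming the standard shared-seed ES estimator; the function names, hyperparameters, and aggregation step are illustrative assumptions and not necessarily the exact FedES protocol from the paper.

```python
import numpy as np

def local_losses(params, perturbations, sigma, loss_fn):
    """Client side: evaluate the local loss at each perturbed parameter vector.

    Only these scalar loss values are sent back to the server; the
    parameters and gradients themselves never leave the client.
    """
    return np.array([loss_fn(params + sigma * eps) for eps in perturbations])

def es_round(params, clients, n_perturb=50, sigma=0.1, lr=0.01, seed=0):
    """Server side: one zeroth-order (evolution-strategies) update round.

    A shared seed lets every client regenerate the same perturbations
    locally, so the only uplink traffic is n_perturb scalars per client.
    """
    rng = np.random.default_rng(seed)
    perturbations = rng.standard_normal((n_perturb, params.size))

    # Collect loss values from each client (in a deployed system this is the
    # only data transmitted; here clients are represented by loss functions).
    all_losses = np.stack([
        local_losses(params, perturbations, sigma, loss_fn)
        for loss_fn in clients
    ])                                       # shape: (n_clients, n_perturb)

    # Average losses across clients, then form the ES gradient estimate.
    mean_losses = all_losses.mean(axis=0)
    advantages = (mean_losses - mean_losses.mean()) / (mean_losses.std() + 1e-8)
    grad_estimate = (perturbations.T @ advantages) / (n_perturb * sigma)

    # Gradient descent step on the averaged (federated) objective.
    return params - lr * grad_estimate

# Toy usage: two "clients" whose local objectives are quadratics with
# different minima; the averaged objective is minimized at the origin.
if __name__ == "__main__":
    clients = [lambda w: np.sum((w - 1.0) ** 2),
               lambda w: np.sum((w + 1.0) ** 2)]
    w = np.full(4, 3.0)
    for t in range(200):
        w = es_round(w, clients, seed=t)
    print(np.round(w, 2))  # entries should end up close to 0
```

In this sketch, the per-round uplink cost is a handful of scalars per client rather than a full parameter vector, which is the communication saving the abstract points to; the privacy claim rests on the fact that individual loss values alone do not let a third party reconstruct gradients.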

Cited by 0 publications · References 89 publications (109 reference statements)