Driven by emerging technologies such as edge computing and the Internet of Things (IoT), recent years have witnessed rapid growth in distributed data processing. Federated Learning (FL), a novel decentralized learning paradigm that can unite massive numbers of devices to train a global model without compromising privacy, is drawing much attention from both academia and industry. However, the performance degradation of FL in heterogeneous and asynchronous environments hinders its wide application, such as in autonomous driving and assistive healthcare. Motivated by this, we propose a novel mechanism called Fed2A: a Federated learning mechanism in Asynchronous and Adaptive modes. Fed2A supports FL by (1) allowing clients and the collaborator to work separately and asynchronously, (2) uploading shallow and deep layers of deep neural networks (DNNs) adaptively, and (3) aggregating local parameters by jointly weighing the freshness of information and the representational consistency of model layers. Moreover, the effectiveness and efficiency of Fed2A are analyzed on three standard datasets, i.e., FMNIST, CIFAR-10, and GermanTS. Compared with the best performance among three baselines, i.e., FedAvg, FedProx, and FedAsync, Fed2A reduces communication cost by over 77% and improves model accuracy and learning speed by over 19% and 76%, respectively.
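To make the asynchronous aggregation idea concrete, the following is a minimal sketch of freshness-weighted aggregation: a stale client update is blended into the global model with a weight that decays with its staleness (the gap between the current round and the round the client started from). The decay schedule `alpha ** staleness` and the function names are illustrative assumptions, not the paper's actual formulas, and the layer-wise representational-consistency weighting that Fed2A additionally applies is omitted here.

```python
def staleness_weight(current_round: int, client_round: int, alpha: float = 0.5) -> float:
    """Weight for a client update that is (current_round - client_round) rounds stale.

    A fresh update (staleness 0) gets weight 1; each extra round of staleness
    multiplies the weight by `alpha`. The exponential schedule is a hypothetical
    choice for illustration, not Fed2A's exact rule.
    """
    staleness = current_round - client_round
    return alpha ** staleness


def aggregate(global_params, client_params, current_round, client_round, alpha=0.5):
    """Blend a (possibly stale) client update into the global parameters.

    Each parameter becomes a convex combination of the old global value and
    the client's value, weighted by the freshness of the update.
    """
    w = staleness_weight(current_round, client_round, alpha)
    return [(1.0 - w) * g + w * c for g, c in zip(global_params, client_params)]
```

For example, a fresh update (staleness 0) fully replaces the global parameters, while an update one round stale with `alpha = 0.5` moves each parameter only halfway toward the client's value.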