2021
DOI: 10.48550/arxiv.2111.04287
Preprint

BlueFog: Make Decentralized Algorithms Practical for Optimization and Deep Learning

Abstract: A decentralized algorithm is a form of computation that achieves a global goal through local dynamics relying on low-cost communication between directly connected agents. On large-scale optimization tasks involving distributed datasets, decentralized algorithms have shown strong, sometimes superior, performance over distributed algorithms with a central node. Recently, developing decentralized algorithms for deep learning has attracted great attention. They are considered low-communication-overhead altern…

Cited by 5 publications (3 citation statements)
References 65 publications
“…Moreover, each node samples b = 16 data points per iteration. Since this experiment is on a real distributed GPU system, we utilize BlueFog [Ying et al., 2021b] for the implementation of decentralized methods, including topology organization, weight-matrix generation, and efficient decentralized communication. For the implementation of C-SGD, we utilize PyTorch's native Distributed Data Parallel (DDP) module.…”
Section: Further Details of ResNet-20 Experiments
confidence: 99%
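The "weight-matrix generation" mentioned in this excerpt refers to the doubly stochastic mixing matrix that decentralized methods use to average parameters with neighbors. As a minimal, library-free illustration (this is not BlueFog's actual API; the ring topology and uniform 1/3 weights are assumptions for the sketch), the mixing matrix and one neighbor-averaging step can be written in NumPy:

```python
import numpy as np

def ring_mixing_matrix(n):
    """Doubly stochastic mixing matrix for a ring of n nodes:
    each node averages itself with its two ring neighbors."""
    W = np.zeros((n, n))
    for i in range(n):
        W[i, i] = 1 / 3
        W[i, (i - 1) % n] = 1 / 3
        W[i, (i + 1) % n] = 1 / 3
    return W

n = 8
W = ring_mixing_matrix(n)
x = np.arange(n, dtype=float)  # one scalar parameter per node

# Rows and columns sum to one (doubly stochastic), so repeated
# neighbor averaging drives every node toward the global mean.
assert np.allclose(W.sum(axis=0), 1.0)
assert np.allclose(W.sum(axis=1), 1.0)

# One decentralized step combines only directly connected values;
# iterating the step reaches consensus at mean(0..7) = 3.5.
for _ in range(200):
    x = W @ x
print(np.round(x, 6))
```

Each row of `W` touches only a node's own entry and its two neighbors, which is why a real implementation needs only point-to-point messages between adjacent workers rather than a central server.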
“…Nowadays, there are many FL frameworks. The most prominent, TensorFlow Federated (TFF) [1], [2] and BlueFog [3], work well in the cloud-edge continuum. However, they cannot be deployed on edge devices alone, they are not supported on Windows, and they have numerous dependencies that make their installation far from trivial.…”
Section: Introduction
confidence: 99%
“…Decentralized optimization is an emerging learning paradigm in which each node communicates only with its immediate neighbors in each iteration. By avoiding a central server and maintaining more balanced communication between each pair of connected nodes, decentralized approaches can significantly speed up the training of large-scale machine learning models [4,10,43]. Although decentralized optimization has been extensively studied in the literature, its performance limits under time-varying communication patterns have not been fully explored.…”
Section: Introduction
confidence: 99%
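The paradigm this excerpt describes can be simulated in a few lines. The sketch below is illustrative only (not the cited paper's algorithm; the ring topology, quadratic local objectives, and step size are assumptions): each node takes a local gradient step and then averages only with its immediate ring neighbors, with no central server, yet all nodes end up near the global minimizer.

```python
import numpy as np

# Hypothetical toy setup: node i on a ring holds the local objective
# f_i(x) = 0.5 * (x - b[i])**2, so the minimizer of the global average
# objective (1/n) * sum_i f_i is mean(b) = 2.5.
n, lr, steps = 6, 0.05, 500
b = np.array([0.0, 1.0, 2.0, 3.0, 4.0, 5.0])
x = np.zeros(n)  # x[i] is node i's current iterate

for _ in range(steps):
    grad = x - b           # each node computes only its local gradient
    x = x - lr * grad      # local descent step
    # Gossip step: node i averages only with ring neighbors i-1 and i+1.
    x = (np.roll(x, 1) + x + np.roll(x, -1)) / 3.0

# With a constant step size, every node settles in a small O(lr)
# neighborhood of mean(b), having exchanged values only with neighbors.
print(np.round(x, 3))
```

The residual spread across nodes shrinks with the step size; decaying `lr` over time (or gradient-tracking corrections, as studied in the decentralized-optimization literature) removes it entirely.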