Proceedings of the 29th International Symposium on High-Performance Parallel and Distributed Computing 2020
DOI: 10.1145/3369583.3392686
TiFL: A Tier-based Federated Learning System

Cited by 202 publications (104 citation statements)
References 10 publications
“…There are several popular platforms for FL, which have been summarized in Table 3 in terms of their focus and supporting software. There are plenty of platforms and architectures for FL, with more examples including [53], [54], [55], [56]. These platforms and architectures help further refine FL.…”
Section: Figure 10, Framework of FADL
confidence: 99%
“…The row-wise partitioning of weight matrices induces neuron partitioning in each layer, so that all computations related to a neuron are performed by a single processor. As shown in (8) and (9), to perform $W_m^k x^{k-1}$, processor $P_m$ needs to receive all $x^{k-1}$-vector rows corresponding to column indices in $\mathrm{cols}(W_m^k)$. It is important to note that vectors $x_{mn}^{k-1}$ and $\hat{x}_{nm}^{k-1}$ are placeholders that keep coordinates of nonzero entries.…”
Section: Parallel Sparse Feedforward
confidence: 99%
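The excerpt above describes row-wise weight partitioning: each processor owns a block of output neurons and only needs the input-vector entries whose indices appear as nonzero columns in its row block. A minimal sketch of that idea, with hypothetical function names and a dense NumPy formulation of my own (the cited paper uses sparse distributed storage, not shown here):

```python
import numpy as np

def partition_rows(W, n_procs):
    """Split W's rows into n_procs contiguous blocks (one block per processor)."""
    return np.array_split(W, n_procs, axis=0)

def required_cols(W_m):
    """Column indices of x that processor m must receive: the nonzero columns
    of its row block, i.e. cols(W_m^k) from the excerpt."""
    return np.nonzero(np.any(W_m != 0, axis=0))[0]

def local_feedforward(W_m, x, cols):
    """Compute this processor's slice of W @ x using only the needed x rows."""
    return W_m[:, cols] @ x[cols]

# Toy example: a 4x4 sparse weight matrix split across 2 "processors".
W = np.array([[1., 0., 0., 2.],
              [0., 3., 0., 0.],
              [0., 0., 4., 0.],
              [5., 0., 0., 6.]])
x = np.array([1., 2., 3., 4.])

blocks = partition_rows(W, 2)
y = np.concatenate([local_feedforward(Wm, x, required_cols(Wm))
                    for Wm in blocks])
assert np.allclose(y, W @ x)  # matches the unpartitioned product
```

Because each block's computation touches only `x[cols]`, the communication per processor is bounded by the number of distinct nonzero columns in its row block, which is what makes the partitioning attractive for sparse layers.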
“…Despite fruitful research results on FL regarding advanced learning algorithms [7,8], data heterogeneity [9,10], personalization [11][12][13][14], fairness [15,16], system design [17][18][19], and privacy-preserving frameworks [20][21][22], etc., only a few works focus on hyper-parameter tuning for FL [23,4,[24][25][26]3]. In [23], Dai et al.…”
Section: Related Work
confidence: 99%