2020 IEEE 40th International Conference on Distributed Computing Systems (ICDCS)
DOI: 10.1109/icdcs47774.2020.00184
Deploy-able Privacy Preserving Collaborative ML

Cited by 4 publications (6 citation statements)
References 7 publications
“…[15] Federated learning using peer-to-peer network for decentralized orchestration of model weights; [16] Peer-to-peer federated learning on graphs; [17] Decentralized federated learning for electronic health records; [18] BrainTorrent: A peer-to-peer environment for decentralized federated learning; [13] LotteryFL: Personalized and communication-efficient federated learning with lottery ticket hypothesis on non-IID datasets; [19] Deploy-able privacy preserving collaborative ML…”
Section: Synchronous (mentioning)
confidence: 99%
“…A novel federated learning algorithm is presented in [29], where devices mostly collaborate with other devices in a pairwise manner, achieving 10X better communication efficiency. Strong privacy guarantees with a marginal compromise in performance are shown in [19], which aims at preserving the differential privacy of each participating client. They additionally experiment with model quantization and find that deploying the quantized model on edge devices does not degrade its capability.…”
Section: Systemization of Knowledge (mentioning)
confidence: 99%
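The statement above points at the combination of per-client differential privacy and model quantization. As a rough, hedged sketch of that general idea (not the exact mechanism of [19]), the snippet below clips a client's update, adds Gaussian noise, and quantizes the noised update to 8 bits before sharing; the function name, clipping norm, and noise multiplier are hypothetical placeholders.

```python
import numpy as np

def privatize_and_quantize(update, clip_norm=1.0, noise_multiplier=1.1, bits=8):
    """Hypothetical client-side step: clip, add Gaussian noise, quantize."""
    # Clip the update to bound each client's contribution (sensitivity).
    norm = np.linalg.norm(update)
    clipped = update * min(1.0, clip_norm / (norm + 1e-12))
    # Gaussian mechanism: noise scaled to the clipping norm.
    noised = clipped + np.random.normal(0.0, noise_multiplier * clip_norm, size=clipped.shape)
    # Symmetric uniform quantization to the requested bit width.
    levels = 2 ** (bits - 1) - 1
    scale = max(np.max(np.abs(noised)) / levels, 1e-12)
    quantized = np.round(noised / scale).astype(np.int8)
    return quantized, scale  # the receiver dequantizes as quantized * scale

client_update = np.random.randn(10).astype(np.float32)
q, scale = privatize_and_quantize(client_update)
print(q, scale)
```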
“…[15] Federated learning using peer-to-peer network for decentralized orchestration of model weights; [16] Peer-to-peer federated learning on graphs; [17] Decentralized federated learning for electronic health records; [18] BrainTorrent: A peer-to-peer environment for decentralized federated learning; [13] LotteryFL: Personalized and communication-efficient federated learning with lottery ticket hypothesis on non-IID datasets; [19] Deploy-able privacy preserving collaborative ML. Asynchronous: [20] Personalized and private peer-to-peer machine learning; [21] Personalized cross-silo federated learning on non-IID data; [22] Edge-consensus learning: Deep learning on P2P networks with non-homogeneous data; [23] Towards on-device federated learning: A direct acyclic graph-based blockchain approach. Gossip algorithms, as the name suggests, are a means of P2P communication in distributed systems that spread information to neighbors. These asynchronous algorithms have been successfully employed in decentralized optimization.…”
Section: Synchronous (mentioning)
confidence: 99%
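Since gossip algorithms are named in the statement above as the asynchronous peer-to-peer primitive, a minimal sketch of pairwise gossip averaging follows; the ring topology, scalar node values, and the gossip_round helper are illustrative stand-ins for exchanging full model weights between local training steps.

```python
import random

def gossip_round(values, neighbors):
    """One gossip step: each peer averages its value with one random neighbor,
    so information spreads through the network without a central server."""
    updated = dict(values)
    for node, nbrs in neighbors.items():
        peer = random.choice(nbrs)
        avg = (updated[node] + updated[peer]) / 2.0
        updated[node] = avg
        updated[peer] = avg
    return updated

# Hypothetical 4-node ring with different local values.
topology = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [0, 2]}
state = {0: 1.0, 1: 3.0, 2: 5.0, 3: 7.0}
for _ in range(20):
    state = gossip_round(state, topology)
print(state)  # values converge toward the global mean (4.0)
```

Each pairwise average preserves the sum of the values, so repeated gossip drives every peer toward the global mean, which is the consensus property that decentralized optimization builds on.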
“…[28] study a novel federated learning algorithm in which devices mostly collaborate with other devices in a pairwise manner, achieving 10X better communication efficiency. [19] aims at preserving the differential privacy of each participating client and shows strong privacy guarantees with a marginal compromise in performance. They additionally experiment with model quantization and find that deploying the quantized model on edge devices does not degrade its capability.…”
Section: Systemization of Knowledge (mentioning)
confidence: 99%