2019
DOI: 10.48550/arxiv.1910.04956
Preprint

Central Server Free Federated Learning over Single-sided Trust Social Networks

Chaoyang He,
Conghui Tan,
Hanlin Tang
et al.

Abstract: Federated learning has become increasingly important for modern machine learning, especially for data privacy-sensitive scenarios. Existing federated learning mostly adopts the central server-based architecture or centralized architecture. However, in many social network scenarios, centralized federated learning is not applicable (e.g., a central agent or server connecting all users may not exist, or the communication cost to the central server is not affordable). In this paper, we consider a generic setting: …

Cited by 20 publications (28 citation statements)
References 16 publications
“…The participating devices or clients update their models on their local datasets and aggregate them along with the model updates from their neighbours [62]. Building on that, the authors in [63] build a framework for an FL environment targeting a generic social network scenario. Their Online Push-Sum (OPS) method handles the complex topology while achieving an optimal convergence rate.…”
Section: Decentralized
confidence: 99%
“…In most decentralized FL works [15], [16], [17], [5], [31], [34], we noticed that the model parameters are transferred among data nodes directly, which incurs substantial communication overhead and can cause serious communication costs when blockchain is employed. For instance, a gas fee is required on the popular Ethereum blockchain, where the cost can be significantly high for large models.…”
Section: IPFS-Based Data Sharing Scheme
confidence: 99%
“…Additionally, the aforementioned centralized FL frameworks can raise security concerns and suffer from the risk of a single point of failure. Decentralized FL frameworks [15], [16], [17] have been proposed in the literature. These frameworks remove the centralized node, synchronize FL updates among the data nodes, and then perform aggregation.…”
Section: Introduction
confidence: 99%
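The serverless aggregation step described above can be sketched as simple neighbour averaging: each node replaces its parameters with the uniform mean of its own and its neighbours' parameters, with no central server involved. This is a hedged illustration of the general pattern, not any cited framework's exact scheme; the function name `neighbor_average` and the line-graph topology are hypothetical.

```python
import numpy as np

def neighbor_average(models, neighbors):
    """One decentralized aggregation round: each node averages its
    parameter vector with those of its (undirected) neighbors."""
    new = []
    for i, theta in enumerate(models):
        group = [theta] + [models[j] for j in neighbors[i]]
        new.append(sum(group) / len(group))  # uniform average over the group
    return new

# Three nodes on a line graph 0 - 1 - 2, each holding a 1-d parameter.
models = [np.array([0.0]), np.array([3.0]), np.array([6.0])]
models = neighbor_average(models, {0: [1], 1: [0, 2], 2: [1]})
```

Repeating this round interleaved with local training drives all nodes toward a consensus model without any node ever contacting a central aggregator.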
“…Conventional federated learning frameworks consider a centralized communication architecture in which all communication between the mobile devices goes through a central server [1], [2], [3]. More recently, decentralized federated learning architectures without a central server have been considered for peer-to-peer learning on graph topologies [12] and in the context of social networks [13]. Model poisoning attacks on federated learning architectures have been analyzed in [32], [33].…”
Section: Related Work
confidence: 99%
“…The second one is a decentralized setup where mobile devices communicate directly with each other via an underlay communication network (e.g., a peer-to-peer network) [12], [13] without requiring a central server for secure model aggregation. Moreover, Turbo-Aggregate allows additional parallelization opportunities for communication, such as broadcasting and multi-casting.…”
Section: Introduction
confidence: 99%