2022
DOI: 10.3390/mps5040060
Performance and Information Leakage in Splitfed Learning and Multi-Head Split Learning in Healthcare Data and Beyond

Abstract: Machine learning (ML) in healthcare data analytics is attracting much attention because of the unprecedented power of ML to extract knowledge that improves the decision-making process. At the same time, laws and ethics codes drafted by countries to govern healthcare data are becoming stringent. Although healthcare practitioners are struggling with an enforced governance framework, we see the emergence of distributed learning-based frameworks disrupting traditional-ML-model development. Splitfed learning (SFL) …

Cited by 3 publications (5 citation statements)
References 28 publications
“…Therefore, each client-side model sends its activations to a single common server-side sub-network, thereby reducing the required aggregation step and the need to keep multiple copies of the server-side networks as compared to the first variant as shown in Figure 10 (b). Moreover, as the server keeps only one copy of the server-side sub-network, it makes the server-side do forward and backward pass sequentially with each of the client's data (activations of the cut layer) [121,122].…”
Section: Split Learning
Citation type: mentioning (confidence: 99%)
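The statement above describes vanilla split learning: each client computes activations only up to the cut layer and sends them to a single shared server-side sub-network, which must then run its forward and backward passes sequentially, one client at a time. A minimal numpy sketch of that protocol follows; the layer sizes, learning rate, and toy data are illustrative assumptions, not taken from the paper.

```python
# Sketch of split learning with ONE shared server-side sub-network.
# Clients compute cut-layer activations; the server does forward and
# backward passes sequentially over each client's activations.
import numpy as np

rng = np.random.default_rng(0)

class ClientHead:
    """Client-side sub-network: raw input -> cut-layer activations."""
    def __init__(self, d_in, d_cut):
        self.W = rng.normal(0, 0.1, (d_in, d_cut))
    def forward(self, x):
        self.x = x
        return x @ self.W                      # activations of the cut layer
    def backward(self, grad_cut, lr=0.1):
        self.W -= lr * self.x.T @ grad_cut     # update client-side weights

class ServerTail:
    """Single shared server-side sub-network: cut layer -> prediction."""
    def __init__(self, d_cut, d_out):
        self.W = rng.normal(0, 0.1, (d_cut, d_out))
    def step(self, a, y, lr=0.1):
        """One forward + backward pass for one client's activations."""
        pred = a @ self.W
        grad_pred = 2 * (pred - y) / len(y)    # d(MSE)/d(pred)
        grad_cut = grad_pred @ self.W.T        # gradient returned to client
        self.W -= lr * a.T @ grad_pred         # update the one server copy
        return grad_cut, float(((pred - y) ** 2).mean())

clients = [ClientHead(4, 3) for _ in range(2)]
server = ServerTail(3, 1)
data = [(rng.normal(size=(8, 4)), rng.normal(size=(8, 1))) for _ in clients]

# The server handles clients strictly one at a time (sequential, not parallel).
for client, (x, y) in zip(clients, data):
    a = client.forward(x)                      # only activations leave the client
    grad_cut, loss = server.step(a, y)
    client.backward(grad_cut)
```

Because there is only one copy of the server-side weights, no per-client aggregation of the server sub-network is needed, at the cost of serializing training across clients, exactly the trade-off the citing text points out.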
“…Splitfed overcomes the drawback of federated learning of training a large ML model in resource-constrained ESs [123]. At the same time, it eliminates the weakness of split learning to deal with one client at a time while training [122].…”
Section: Key Takeaways
Citation type: mentioning (confidence: 99%)
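The takeaway above is that splitfed learning combines the two ideas: clients run split-learning steps in parallel, each against its own copy of the server-side sub-network, and the client-side (and server-side) weights are then federated-averaged. A toy numpy sketch of one such round follows; the shapes, learning rate, and data are illustrative assumptions.

```python
# Toy sketch of one splitfed round: parallel per-client split-learning
# steps, followed by federated averaging of the weights. No client ever
# trains the full model, and the server is not limited to one client at
# a time.
import numpy as np

rng = np.random.default_rng(1)
d_in, d_cut, d_out, lr = 4, 3, 1, 0.1

# Per-client state: client-side weights plus a server-side copy.
heads = [rng.normal(0, 0.1, (d_in, d_cut)) for _ in range(3)]
tails = [rng.normal(0, 0.1, (d_cut, d_out)) for _ in range(3)]
data = [(rng.normal(size=(8, d_in)), rng.normal(size=(8, d_out)))
        for _ in range(3)]

for i, (x, y) in enumerate(data):             # conceptually in parallel
    a = x @ heads[i]                          # client forward to cut layer
    pred = a @ tails[i]                       # server-side forward
    g_pred = 2 * (pred - y) / len(y)          # d(MSE)/d(pred)
    g_cut = g_pred @ tails[i].T               # gradient sent back to client
    tails[i] -= lr * a.T @ g_pred             # per-client server copy updated
    heads[i] -= lr * x.T @ g_cut              # client backward

# Federated averaging synchronizes all copies at the end of the round.
avg_head = sum(heads) / len(heads)
avg_tail = sum(tails) / len(tails)
heads = [avg_head.copy() for _ in heads]
tails = [avg_tail.copy() for _ in tails]
```

Keeping one server-side copy per client is what lets the clients proceed in parallel; the averaging step then plays the role of federated learning's aggregation, restoring a single logical model.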