2023
DOI: 10.48550/arxiv.2303.08581
Preprint

Model Extraction Attacks on Split Federated Learning

Abstract: Federated Learning (FL) is a popular collaborative learning scheme involving multiple clients and a server. FL focuses on protecting clients' data but turns out to be highly vulnerable to Intellectual Property (IP) threats. Since FL periodically collects and distributes the model parameters, a free-rider can download the latest model and thus steal model IP. Split Federated Learning (SFL), a recent variant of FL that supports training with resource-constrained clients, splits the model into two, giving one par…
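The abstract's description of the split, with one part of the model held by the client and the remainder by the server, can be sketched roughly as follows. The architecture, cut layer, and training loop below are assumptions for illustration only, not the paper's actual setup.

```python
# Minimal sketch of the client/server model split used in Split Federated Learning (SFL).
# Assumes a small feed-forward CNN and a hypothetical cut after the first conv block;
# the paper's architecture and cut point may differ.
import torch
import torch.nn as nn

class ClientModel(nn.Module):
    """Client-side portion: layers up to the cut (split) layer."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(2),
        )

    def forward(self, x):
        return self.features(x)  # "smashed data" sent to the server

class ServerModel(nn.Module):
    """Server-side portion: remaining layers and the classifier head."""
    def __init__(self, num_classes=10):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(32, 64, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
            nn.Flatten(),
            nn.Linear(64, num_classes),
        )

    def forward(self, smashed):
        return self.features(smashed)

# One SFL forward/backward step: the client sends activations ("smashed data"),
# the server computes the loss and returns the gradient of the smashed data,
# and the client finishes backpropagation locally.
client, server = ClientModel(), ServerModel()
opt_c = torch.optim.SGD(client.parameters(), lr=0.01)
opt_s = torch.optim.SGD(server.parameters(), lr=0.01)

x, y = torch.randn(8, 3, 32, 32), torch.randint(0, 10, (8,))

smashed = client(x)
smashed_detached = smashed.detach().requires_grad_(True)  # crosses the client/server boundary

logits = server(smashed_detached)
loss = nn.functional.cross_entropy(logits, y)

opt_s.zero_grad()
loss.backward()                           # server-side backward pass
opt_s.step()

opt_c.zero_grad()
smashed.backward(smashed_detached.grad)   # gradient returned to the client
opt_c.step()
```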

Cited by 2 publications (12 citation statements)
References 23 publications
“…In the fine-tuning case, when the model is mostly static, attackers can obtain consistent gradient information through gradient queries. It is these consistent gradients that contribute to the success of MEAs such as Craft-ME and GAN-ME in (Li et al. 2023b).…”
Section: Inconsistent Gradient Problem
confidence: 99%
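The mechanism this citation statement refers to, a static server-side model answering gradient queries, is roughly sketched below. This is not the Craft-ME or GAN-ME procedure from the cited paper; it is a simplified, assumed setup (random queries, small linear models, a gradient-matching objective) meant only to show why consistent gradients give the attacker a usable training signal for a surrogate.

```python
# Illustrative gradient-query surrogate attack against a static server-side model.
# All models, query strategies, and losses here are assumptions for illustration.
import torch
import torch.nn as nn

torch.manual_seed(0)

# Victim server-side model: a black box that only returns gradients w.r.t. the
# attacker's queries, as the split-learning server does during training.
victim = nn.Sequential(nn.Linear(32, 64), nn.ReLU(), nn.Linear(64, 10))

# Attacker's surrogate of the server-side model.
surrogate = nn.Sequential(nn.Linear(32, 64), nn.ReLU(), nn.Linear(64, 10))
opt = torch.optim.Adam(surrogate.parameters(), lr=1e-3)

def input_gradient(model, smashed, labels, create_graph=False):
    """d(loss)/d(smashed), i.e. the gradient the server returns in split learning."""
    smashed = smashed.detach().requires_grad_(True)
    loss = nn.functional.cross_entropy(model(smashed), labels)
    (grad,) = torch.autograd.grad(loss, smashed, create_graph=create_graph)
    return grad

for step in range(200):
    # Attacker-chosen queries with attacker-chosen labels (random here; Craft-ME
    # and GAN-ME construct their queries far more carefully).
    z = torch.randn(16, 32)
    y = torch.randint(0, 10, (16,))

    g_victim = input_gradient(victim, z, y).detach()                 # gradient "query"
    g_surrogate = input_gradient(surrogate, z, y, create_graph=True)

    # Because the victim is static, g_victim is a consistent training signal
    # that the surrogate can be fit against.
    match_loss = nn.functional.mse_loss(g_surrogate, g_victim)
    opt.zero_grad()
    match_loss.backward()
    opt.step()
```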
“…These inconsistent gradients present a challenge for gradient-based MEAs to effectively utilize the gradient information. Prior works (Li et al. 2023b) demonstrate that for the training-from-scratch case, gradient-based MEAs result in a large accuracy drop of over 40% in the surrogate model.…”
Section: Inconsistent Gradient Problem
confidence: 99%
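A toy illustration of the inconsistency this statement describes: when the server-side model is still being trained from scratch, its parameters move between the attacker's queries, so the same query returns different gradients at different rounds. The models, data, and single training step below are assumptions for illustration, not the paper's setup.

```python
# Tiny illustration of the "inconsistent gradient" problem: a fixed attacker query
# yields different gradients before and after one round of server-side training.
import torch
import torch.nn as nn

torch.manual_seed(0)
server = nn.Sequential(nn.Linear(32, 64), nn.ReLU(), nn.Linear(64, 10))
opt = torch.optim.SGD(server.parameters(), lr=0.1)

def grad_query(model, smashed, labels):
    """Gradient of the loss w.r.t. the smashed data, as returned to a client."""
    smashed = smashed.detach().requires_grad_(True)
    loss = nn.functional.cross_entropy(model(smashed), labels)
    return torch.autograd.grad(loss, smashed)[0]

probe_x = torch.randn(4, 32)           # fixed attacker query
probe_y = torch.randint(0, 10, (4,))

g_before = grad_query(server, probe_x, probe_y)

# One round of normal server-side training on other data shifts the parameters.
x, y = torch.randn(64, 32), torch.randint(0, 10, (64,))
loss = nn.functional.cross_entropy(server(x), y)
opt.zero_grad()
loss.backward()
opt.step()

g_after = grad_query(server, probe_x, probe_y)

# The same query now yields a noticeably different gradient: a moving target
# for any gradient-based MEA run during training-from-scratch.
cos = nn.functional.cosine_similarity(g_before.flatten(), g_after.flatten(), dim=0)
print(f"cosine similarity between rounds: {cos.item():.3f}")
```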