2022
DOI: 10.1016/j.future.2021.10.017
Non-interactive verifiable privacy-preserving federated learning

Cited by 22 publications (8 citation statements). References 23 publications.
“…Bonawitz et al. [38] built a privacy-preserving FL framework in which SMC securely aggregates the clients' parameters and is robust to client dropout. Xu et al. [39] proposed a non-interactive, verifiable privacy-preserving FL model with a novel private gradient aggregation scheme based on random matrix coding and secure two-party computation. These SMC models incur large communication costs because of the multiple interactions involved in the learning process.…”
Section: Secure Multi-party Computing
confidence: 99%
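The "multiple interactions" criticized above stem from the pairwise additive masking used in secure aggregation schemes like Bonawitz et al.'s: each pair of clients agrees on a shared mask, one adds it and the other subtracts it, so the masks cancel in the server's sum. A minimal sketch (scalar gradients, a toy `seed_fn` standing in for the real pairwise key agreement, which this sketch does not implement):

```python
import random

def pairwise_masks(client_ids, modulus, seed_fn):
    """Derive one shared mask per client pair; client i adds it, client j subtracts it."""
    masks = {}
    for i in client_ids:
        for j in client_ids:
            if i < j:
                rng = random.Random(seed_fn(i, j))  # stand-in for pairwise key agreement
                masks[(i, j)] = rng.randrange(modulus)
    return masks

def masked_update(client_id, gradient, client_ids, masks, modulus):
    """Upload gradient + sum of masks with partner j, signed by who is 'smaller'."""
    masked = gradient
    for j in client_ids:
        if j == client_id:
            continue
        i, k = min(client_id, j), max(client_id, j)
        m = masks[(i, k)]
        masked += m if client_id == i else -m
    return masked % modulus

# Toy run: three clients, scalar gradients; the pairwise masks cancel in the sum.
p = 2**31 - 1
ids = [0, 1, 2]
grads = {0: 5, 1: 7, 2: 11}
masks = pairwise_masks(ids, p, seed_fn=lambda i, j: 1000 * i + j)
total = sum(masked_update(c, grads[c], ids, masks, p) for c in ids) % p
assert total == sum(grads.values()) % p  # server recovers 23 without seeing any gradient
```

Each upload looks uniformly random on its own, but every mask appears once with `+` and once with `-`, so the aggregate is exact. The per-pair coordination is precisely the interaction cost the quoted statement objects to.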
“…Secure multi-party computation (incurs high communication cost in multi-party interaction):
[36] Uses SMC technology to achieve a robust reversible image watermarking scheme.
[37] Protects the private information of specific locations in a privacy-aware indoor localization application.
[38] Securely aggregates the clients' parameters in FL.
[39] Proposes a novel private gradient aggregation scheme using random matrix coding and secure two-party computation.
Differential privacy:
[40] Achieves a balance between privacy loss and model performance while hiding client contributions and private data during training.…”
Section: Reference Contribution
confidence: 99%
“…In FastSecAgg [147], by sacrificing some security, the authors substitute standard Shamir secret sharing with a more efficient FFT-based multi-secret sharing scheme. Alternatively, models can be shared between two servers, as in [148], [149], [150], [151], or among several servers, as in [152], [153]. Other works introduce a two-phase secret-sharing-based aggregation [154], [155].…”
Section: MPC-based Aggregation
confidence: 99%
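For context on what FastSecAgg replaces: standard Shamir secret sharing hides a secret as the constant term of a random degree-(t-1) polynomial over a prime field, so any t shares reconstruct it by Lagrange interpolation and fewer reveal nothing. A minimal sketch (the field prime and share counts here are illustrative, not from any of the cited schemes):

```python
import random

P = 2**31 - 1  # illustrative prime field

def share(secret, n, t, rng=random.Random(0)):
    """Split `secret` into n shares; any t of them reconstruct it (degree t-1 polynomial)."""
    coeffs = [secret] + [rng.randrange(P) for _ in range(t - 1)]
    return [(x, sum(c * pow(x, k, P) for k, c in enumerate(coeffs)) % P)
            for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 over GF(P)."""
    total = 0
    for x_i, y_i in shares:
        num, den = 1, 1
        for x_j, _ in shares:
            if x_j != x_i:
                num = num * (-x_j) % P
                den = den * (x_i - x_j) % P
        # pow(den, P - 2, P) is the modular inverse of den (Fermat's little theorem)
        total = (total + y_i * num * pow(den, P - 2, P)) % P
    return total

shares = share(secret=42, n=5, t=3)
assert reconstruct(shares[:3]) == 42   # any 3 of the 5 shares suffice
assert reconstruct(shares[1:4]) == 42
```

Reconstruction costs O(t²) field operations per secret; the FFT-based multi-secret variant in [147] trades some of Shamir's security margin for packing many secrets into one polynomial and faster amortized reconstruction.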
“…The sums of honest clients' inputs are the same in both worlds. The encrypted data [x̂_i]_pk is indistinguishable due to the semantic security of the DP encryption scheme [7]. The signature on indistinguishable encrypted data is also indistinguishable.…”
Section: Full Version of NIPVS-FL Initialization Phase (KGC)
confidence: 99%
“…This is a strong assumption that needs to be improved. Xu et al. [7] avoid the popular basic aggregation scheme, which utilizes the pairwise additive masking method. They design a dual-server architecture and use random matrix coding to encrypt the gradient matrix.…”
Section: Introduction
confidence: 99%
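The dual-server idea can be illustrated with a simplified two-server additive split of the gradient matrix: each client sends one random-looking share to each server, neither server alone learns anything, and combining the two server-side aggregates yields the true sum. This is a sketch of the general principle only; Xu et al.'s actual random-matrix-coding scheme and its verifiability mechanism are not reproduced here:

```python
import numpy as np

rng = np.random.default_rng(7)

def split_gradient(grad):
    """Additively split a gradient matrix into two shares, one per server.
    Each share is individually uniform-looking; only their sum is meaningful."""
    mask = rng.standard_normal(grad.shape)
    return grad - mask, mask

# Three clients, each holding a 2x2 gradient matrix.
grads = [rng.standard_normal((2, 2)) for _ in range(3)]
shares = [split_gradient(g) for g in grads]

# Each server sums only the shares it received ...
agg_server1 = sum(s1 for s1, _ in shares)
agg_server2 = sum(s2 for _, s2 in shares)

# ... and combining the two aggregates recovers the true sum of gradients.
assert np.allclose(agg_server1 + agg_server2, sum(grads))
```

The appeal over pairwise masking is that clients never coordinate with each other (non-interactive from the clients' view); the cost is the assumption that the two servers do not collude.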