Proceedings of the 18th International Conference on Mobile Systems, Applications, and Services 2020
DOI: 10.1145/3386901.3388946
DarkneTZ

Cited by 91 publications (34 citation statements)
References 35 publications
“…TEE provides a promising hardware-based alternative for achieving secure computation with CPU-level performance, and thus bridges the gap between academic research and industrial adoption of secure computation. Trusted hardware provides a powerful abstraction for building secure systems and can be applied to many potential applications, such as machine learning algorithms [81,97,98,102,106,142], cloud computing [15,33,94,122,126,141], blockchain [18,22,87,159], network traffic analysis [47], scientific computation [130], data analytics [41,75,118,164,165], and privacy-preserving COVID-19 contact tracing [73].…”
Section: Our Contributions
confidence: 99%
“…However, due to the partial in-enclave training, SecureFL results in 1.6%-23.6% overhead compared with pure CPU-based training. Aiming to limit privacy leakages in FL, Mo et al. presented a privacy-preserving federated learning (PPFL) framework [98] for mobile systems using TEE. PPFL adopted TEE to train each layer of the model until its convergence by leveraging greedy layer-wise training.…”
Section: Research Status
confidence: 99%
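The greedy layer-wise scheme mentioned in this excerpt can be sketched roughly as follows. This is a hypothetical, PyTorch-style illustration, not PPFL's actual code: the block sizes, the auxiliary classifier, and all names are assumptions. Each block is trained with a small auxiliary head while earlier blocks stay frozen; in PPFL the block currently being trained, together with its gradients, would be confined to the TEE (here that boundary is only marked by comments).

# Hypothetical sketch of greedy layer-wise training in the spirit of PPFL.
import torch
import torch.nn as nn
import torch.nn.functional as F

blocks = nn.ModuleList([                      # feature extractor split into blocks
    nn.Sequential(nn.Flatten(), nn.Linear(28 * 28, 128), nn.ReLU()),
    nn.Sequential(nn.Linear(128, 64), nn.ReLU()),
])
block_dims = [128, 64]                        # output width of each block

def train_greedy_layer_wise(blocks, loader, num_classes=10, epochs_per_block=1):
    for k, block in enumerate(blocks):
        head = nn.Linear(block_dims[k], num_classes)       # auxiliary classifier for block k
        opt = torch.optim.SGD(list(block.parameters()) + list(head.parameters()), lr=0.01)
        for _ in range(epochs_per_block):
            for x, y in loader:
                with torch.no_grad():                       # already-trained blocks stay frozen (normal world)
                    h = x
                    for prev in blocks[:k]:
                        h = prev(h)
                # --- in PPFL, the block below and its gradients would stay inside the TEE ---
                logits = head(block(h))
                loss = F.cross_entropy(logits, y)
                opt.zero_grad()
                loss.backward()
                opt.step()
    return blocks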
“…On the other hand, defenses against inversion attacks have seen a surge since the introduction of FL. DarkneTZ [31], PPFL [30] and GradSec [29] mitigate these attacks with the use of a TEE. By protecting sensitive parameters, activations and gradients inside the enclave memory, the risk of gradient leakage is alleviated and the white-box setting is effectively limited, thus weakening the threat.…”
Section: Related Work
confidence: 99%
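As a rough illustration of the partitioning idea behind these TEE-based defenses: early layers run in untrusted memory, while the final, most privacy-sensitive layers and their activations and gradients would execute inside the enclave. All names below are hypothetical and the enclave call is a stub, not the actual DarkneTZ, PPFL, or GradSec API.

# Hypothetical, PyTorch-style sketch of splitting a model at a TEE boundary.
import torch.nn as nn

class PartitionedModel(nn.Module):
    def __init__(self, untrusted_layers, trusted_layers):
        super().__init__()
        self.untrusted = nn.Sequential(*untrusted_layers)   # runs outside the enclave
        self.trusted = nn.Sequential(*trusted_layers)        # would run inside the enclave

    def forward(self, x):
        h = self.untrusted(x)            # normal-world execution
        return self.run_in_enclave(h)    # placeholder for the switch into the secure world

    def run_in_enclave(self, h):
        # A real system would invoke the TEE (e.g. TrustZone or SGX) here; this is a stub.
        return self.trusted(h)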
“…In addition to data privacy, model privacy has also attracted attention (Mo et al., 2020), meaning that foundation models can be black-box models (Guidotti et al., 2018; Ljung, 2001). Little work has addressed fine-tuning or optimization in this setting; most related work has focused on attacks (Yang et al., 2023b;c).…”
Section: Related Work
confidence: 99%