2019 IEEE International Workshop on Information Forensics and Security (WIFS)
DOI: 10.1109/wifs47025.2019.9035097
Privacy-Aware Location Sharing with Deep Reinforcement Learning

Abstract: Internet of Things (IoT) devices are becoming increasingly popular thanks to the many new services and applications they offer. However, in addition to their many benefits, they raise privacy concerns, since they share fine-grained time-series user data with untrusted third parties. In this work, we study the privacy-utility trade-off (PUT) in time-series data sharing. Existing approaches to the PUT mainly focus on a single data point; however, temporal correlations in time-series data introduce new challenges. Methods…
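The abstract frames location sharing as a sequential decision problem balancing utility against privacy. A toy sketch of how such a trade-off can appear as a per-step reward in a reinforcement-learning formulation (the function name, the additive form, and the weight `lam` are illustrative assumptions, not the paper's exact formulation):

```python
def put_reward(true_loc, released_loc, leakage, lam=1.0):
    """Per-step reward: negative distortion minus weighted privacy leakage.

    true_loc, released_loc: (x, y) grid coordinates.
    leakage: the adversary's estimated information gain at this step (e.g. in bits).
    lam: weight trading utility for privacy.
    """
    # Utility loss measured as Manhattan distortion between true and released location.
    distortion = abs(true_loc[0] - released_loc[0]) + abs(true_loc[1] - released_loc[1])
    return -distortion - lam * leakage

# Releasing the true location maximizes utility but may leak more information;
# a distorted release trades some utility for lower leakage.
r_honest = put_reward((3, 4), (3, 4), leakage=1.0)   # -1.0
r_private = put_reward((3, 4), (4, 4), leakage=0.2)  # -1.2
```

An RL agent trained on such a reward must account for temporal correlations: a release that is harmless in isolation can be revealing given the history, which is what distinguishes the time-series PUT from the single-data-point setting.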

Cited by 8 publications (6 citation statements)
References 37 publications (69 reference statements)
“…Among those that consider temporal correlations, most existing works focus on the privacy of the time-series measurements rather than hiding latent sensitive attributes [9], [10], [26]–[30]. In the location sharing scenario, sensitive information is the time-series data itself and the utility loss can be measured by data distortion, whereas in many applications, the user might be interested in hiding an underlying sensitive hypothesis.…”
Section: A. Related Work
confidence: 99%
“…Differential privacy (DP), k-anonymity, information-theoretic metrics, and the SP's error probability are commonly used as privacy measures [8], [9], [11], [15], [20]–[31], [38]. By definition, DP prevents the SP from inferring the current data of the user, even if the SP has knowledge of all the remaining data points.…”
Section: A. Related Work
confidence: 99%
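The DP guarantee cited in this statement is usually achieved for numeric data via the standard Laplace mechanism: adding noise with scale `sensitivity / epsilon` bounds how much the output distribution can shift when one record changes. A minimal sketch of that general construction (a textbook mechanism, not this paper's specific scheme):

```python
import random

def laplace_noise(scale):
    """Sample Laplace(0, scale) as the difference of two independent
    exponential variates with mean `scale` (a standard identity)."""
    return random.expovariate(1.0 / scale) - random.expovariate(1.0 / scale)

def dp_release(value, sensitivity, epsilon):
    """Release a numeric value with epsilon-DP via additive Laplace noise.

    sensitivity: max change in `value` when one underlying record changes.
    epsilon: privacy budget; smaller epsilon -> more noise -> stronger privacy.
    """
    return value + laplace_noise(sensitivity / epsilon)

# Example: release a count query with sensitivity 1 at epsilon = 0.5.
noisy_count = dp_release(42.0, sensitivity=1.0, epsilon=0.5)
```

This illustrates the contrast drawn in the statement: DP protects each individual data point regardless of side information, whereas the time-series setting studied in the paper must additionally reason about what the sequence of releases reveals jointly.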
“…Privacy is an important concern for the adoption of many IoT services, and there is a growing demand from consumers to keep their personal information private. Privacy has been widely studied in the literature [1]–[10], and a vast number of privacy measures have been introduced, including differential privacy [1], mutual information (MI) [2]–[8], total variation distance [11], maximal leakage [12], [13], and guessing leakage [14], to name a few.…”
Section: Introduction
confidence: 99%
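Among the measures this statement lists, mutual information (MI) quantifies privacy as the number of bits the released data Y reveals about a sensitive attribute S: I(S; Y) = 0 means perfect privacy. A self-contained sketch over a toy joint distribution (the distributions are illustrative, not from the paper):

```python
import math

def mutual_information(joint):
    """I(S; Y) in bits from a joint distribution given as {(s, y): p}."""
    ps, py = {}, {}
    for (s, y), p in joint.items():
        ps[s] = ps.get(s, 0.0) + p   # marginal of the sensitive attribute
        py[y] = py.get(y, 0.0) + p   # marginal of the released data
    return sum(p * math.log2(p / (ps[s] * py[y]))
               for (s, y), p in joint.items() if p > 0)

# Release fully determined by the secret -> MI equals H(S) = 1 bit (no privacy).
dependent = {("home", "cell_A"): 0.5, ("work", "cell_B"): 0.5}
# Release independent of the secret -> MI is 0 bits (perfect privacy).
independent = {(s, y): 0.25 for s in ("home", "work") for y in ("cell_A", "cell_B")}
```

Minimizing such an MI term subject to a distortion budget is one common way the privacy-utility trade-off in the surveyed works is formalized.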