2016 IEEE/ACM 24th International Symposium on Quality of Service (IWQoS)
DOI: 10.1109/iwqos.2016.7590444
Your trajectory privacy can be breached even if you walk in groups

Cited by 14 publications (5 citation statements) · References 19 publications
“…In the example of Muslim taxi drivers mentioned above, the attacker inferred an attribute, the victims' religion, even though no individual trajectory was identified. Sui et al. [102] observe that 40% of the records in their data that cannot be immediately identified and appear anonymous are nevertheless homogeneous and directly disclose the shared attribute.…”
Section: Attribute Disclosure
confidence: 99%
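The homogeneity observation above can be illustrated with a short, self-contained Python sketch (not taken from any of the cited papers): records are grouped by their quasi-identifiers, and any group whose sensitive attribute takes a single value discloses that attribute even though no individual record is re-identified. The example records, attribute values, and group labels are hypothetical.

# Minimal sketch (assumed data, not from the paper): flag quasi-identifier
# groups whose sensitive attribute is homogeneous, i.e. the attribute is
# disclosed even though no individual record is re-identified.
from collections import defaultdict

# Hypothetical records: (quasi-identifier tuple, sensitive attribute)
records = [
    (("district-3", "night-shift"), "mosque"),
    (("district-3", "night-shift"), "mosque"),
    (("district-3", "night-shift"), "mosque"),
    (("district-7", "day-shift"), "mosque"),
    (("district-7", "day-shift"), "gym"),
    (("district-7", "day-shift"), "office"),
]

groups = defaultdict(list)
for quasi_id, sensitive in records:
    groups[quasi_id].append(sensitive)

# A group with only one distinct sensitive value discloses that value.
homogeneous = [qid for qid, values in groups.items() if len(set(values)) == 1]
fraction = len(homogeneous) / len(groups)
print(f"{fraction:.0%} of seemingly anonymous groups disclose their shared attribute")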
“…A more sophisticated attack, called a semantic attack, was proposed by Sui et al. [12] in 2016. In this threat model, the attacker can query the semantic information of the trajectory data and infer the victim's behavior by combining it with the POI distribution on the map.…”
Section: Privacy Risks on Trajectory Data
confidence: 99%
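As a rough illustration of the idea behind such a semantic attack (a sketch under assumed data, not the attack from [12]): each trajectory point is matched to its nearest point of interest, and the distribution of visited POI categories hints at the victim's behavior. The POI list, coordinates, and the 0.5 threshold are all hypothetical.

# Minimal sketch of a semantic inference over a trajectory: map each point
# to its nearest POI category and report the dominant category if it covers
# at least half of the points. Data and threshold are hypothetical.
from collections import Counter
from math import hypot

pois = [  # (x, y, category)
    (0.0, 0.0, "hospital"),
    (2.0, 1.0, "restaurant"),
    (5.0, 5.0, "mosque"),
]

trajectory = [(0.1, 0.2), (4.9, 5.1), (5.0, 4.8), (5.1, 5.0)]  # victim's points

def nearest_category(x, y):
    """Return the category of the POI closest to (x, y)."""
    return min(pois, key=lambda p: hypot(p[0] - x, p[1] - y))[2]

visits = Counter(nearest_category(x, y) for x, y in trajectory)
category, count = visits.most_common(1)[0]
if count / len(trajectory) >= 0.5:
    print(f"Inferred dominant behavior: frequent visits to the {category}")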
“…The generalization method is less effective at providing suitable protection for user privacy in the face of single-trajectory data with sensitive attributes. Sui et al. [13] analyzed the risk of campus trajectory data in terms of single-trajectory diversity and provided quantitative proof of the low-diversity risk in the data. Their experiments demonstrate that, when user privacy diversity is low, personalized privacy-protection methods are needed to compensate for the privacy leakage caused by the lack of diversity.…”
Section: Related Work
confidence: 99%
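A minimal sketch of how low trajectory diversity might be quantified, assuming Shannon entropy over the sensitive locations within each anonymity group; the grouping, location labels, and threshold are hypothetical and not the authors' metric.

# Minimal sketch (assumptions, not the authors' code): compute the entropy of
# sensitive locations within each group and flag groups whose diversity falls
# below a hypothetical cut-off, where personalized protection would be needed.
from collections import Counter
from math import log2

groups = {  # group id -> sensitive locations visited by its members (hypothetical)
    "dorm-A": ["clinic", "clinic", "clinic", "library"],
    "dorm-B": ["gym", "library", "cafeteria", "clinic"],
}

def entropy(values):
    """Shannon entropy (in bits) of the value distribution."""
    counts = Counter(values)
    total = len(values)
    return -sum((c / total) * log2(c / total) for c in counts.values())

THRESHOLD = 1.0  # bits; hypothetical cut-off for "low diversity"
for gid, locations in groups.items():
    h = entropy(locations)
    flag = "low diversity -> needs personalized protection" if h < THRESHOLD else "ok"
    print(f"{gid}: entropy={h:.2f} bits ({flag})")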