2017 IEEE International Workshop on Measurement and Networking (M&N)
DOI: 10.1109/iwmn.2017.8078365

Towards accurate detection of obfuscated web tracking

Abstract: Web tracking is currently recognized as one of the most important privacy threats on the Internet. Over the last few years, many methodologies have been developed to uncover web trackers. Most of them are based on static code analysis and the use of predefined blacklists. However, our main hypothesis is that web tracking has started to use obfuscated programming, a transformation of code that renders previous detection methodologies ineffective and easy to evade. In this paper, we propose a new methodology based o…
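The abstract is truncated above, but its central hypothesis, that detectors built on static code analysis are easy to evade once trackers obfuscate their JavaScript, can be illustrated with a minimal sketch. Both the keyword scanner and the script snippets below are hypothetical and are not the methodology proposed in the paper: a detector that looks for fingerprinting API names verbatim flags the plain canvas call, but misses the same call once the name is assembled from string fragments at runtime.

```python
import re

# Hypothetical keyword-based static scanner: flags scripts that mention
# well-known canvas-fingerprinting APIs by name.
FINGERPRINT_PATTERNS = [r"toDataURL", r"getImageData", r"measureText"]

def flags_script(source: str) -> bool:
    """Return True if any known fingerprinting API name appears verbatim."""
    return any(re.search(p, source) for p in FINGERPRINT_PATTERNS)

# Plain tracking snippet: the API name appears literally, so it is flagged.
plain_js = "var px = canvas.toDataURL('image/png');"

# Trivially obfuscated variant: the same call is built at runtime from
# string fragments, so the verbatim pattern never appears in the source.
obfuscated_js = "var px = canvas['toD' + 'ataURL']('image/png');"

print(flags_script(plain_js))       # True  -> detected
print(flags_script(obfuscated_js))  # False -> evades the static scan
```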

Cited by 14 publications (12 citation statements)
References 10 publications
“…Such approaches fail against adversaries who rotate domains quickly [39], proxy resources through trusted domains (e.g. the first party, CDNs) [20], or restructure or obfuscate JavaScript [51], among other common techniques.…”
Section: Introduction
Mentioning (confidence: 99%)
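As a rough sketch of the first two failure modes quoted above, consider a suffix-based domain blocklist; the blocklist entries and URLs are invented for the example. A direct request to a known tracking domain is caught, while the same script proxied through the first-party site or served from a freshly rotated domain passes unnoticed.

```python
from urllib.parse import urlparse

# Hypothetical blocklist of known tracking domains (illustrative only).
BLOCKLIST = {"tracker.example", "ads.example"}

def is_blocked(url: str) -> bool:
    """Block a request if its hostname is a blocklisted domain or a subdomain of one."""
    host = urlparse(url).hostname or ""
    return any(host == d or host.endswith("." + d) for d in BLOCKLIST)

print(is_blocked("https://cdn.tracker.example/fp.js"))       # True:  direct hit on the blocklist
print(is_blocked("https://shop.example/assets/fp.js"))       # False: same resource proxied via the first party
print(is_blocked("https://trk-20240114.example.net/fp.js"))  # False: freshly rotated domain
```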
“…Lastly, in [8], Le et al study the presence of obfuscation within web tracking, and specifically in the canvas fingerprinting method. They found obfuscation to be present, although still not very widely used.…”
Section: A. Detection/Classification
Mentioning (confidence: 99%)
“…IP Address, HTTP headers) or content properties require explicitly to download the content hosted by the URL to perform the inspection, increasing the risk and slowing down the process. Moreover, given that our method does not use code inspection, it is more robust against code minification and obfuscation, which is used by some JavaScript trackers to avoid detection [13].…”
Section: Related Work
Mentioning (confidence: 99%)
“…The classification is done locally and exclusively based on the properties of the URL string, without the necessity of using any external features or communications. This approach has several advantages: (i) it does not require to download the source files, so malicious resources can be blocked in advance, (ii) it is more robust against the use of minification and code obfuscation techniques [13], (iii) it is more difficult to evade than traditional content-blockers, (iv) it can potentially generalize to new tracking domains, which would be missed by traditional content-blockers based on blacklists, and (v) it is lightweight enough to be included in a browser plugin.…”
Section: Introduction
Mentioning (confidence: 99%)
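The citing work quoted above classifies resources using only properties of the URL string, without downloading or inspecting any code. Its actual feature set is not reproduced here, so the extractor below uses a handful of plausible but hypothetical lexical features (URL length, query-parameter count, digit ratio, subdomain depth, tracking-related keywords) purely to illustrate what a download-free, obfuscation-agnostic classifier input could look like.

```python
from urllib.parse import urlparse, parse_qs

# Hypothetical keyword list; a real system would learn such signals from labeled data.
TRACKING_KEYWORDS = ("pixel", "beacon", "track", "analytics", "collect")

def url_features(url: str) -> dict:
    """Extract lightweight lexical features from the URL string only
    (no download, no code inspection)."""
    parsed = urlparse(url)
    host = parsed.hostname or ""
    return {
        "url_length": len(url),
        "num_query_params": len(parse_qs(parsed.query)),
        "digit_ratio": sum(c.isdigit() for c in url) / max(len(url), 1),
        "subdomain_depth": max(host.count(".") - 1, 0),
        "path_depth": parsed.path.count("/"),
        "has_tracking_keyword": any(k in url.lower() for k in TRACKING_KEYWORDS),
    }

# Example: these features could be fed to any off-the-shelf classifier.
print(url_features("https://metrics.cdn.example.com/v2/collect?uid=123&ev=pageview"))
```

Because everything here is computed from the URL text alone, the check can run before any request is issued, which is what allows a resource to be blocked in advance, as in advantage (i) of the quoted list.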