2006
DOI: 10.1145/1140103.1140295
Data streaming algorithms for estimating entropy of network traffic

Abstract: Using entropy of traffic distributions has been shown to aid a wide variety of network monitoring applications such as anomaly detection, clustering to reveal interesting patterns, and traffic classification. However, realizing this potential benefit in practice requires accurate algorithms that can operate on high-speed links, with low CPU and memory requirements. Estimating the entropy in a streaming model to enable such fine-grained traffic analysis has been a challenging problem. We give lower bounds for t…
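For context, the quantity the abstract refers to is the empirical entropy of a traffic distribution (e.g., of source IPs or ports). A minimal sketch of the naïve computation, which stores one counter per distinct item and thereby motivates the streaming estimators the paper studies (the toy packet values are illustrative, not from the paper):

```python
import math
from collections import Counter

def empirical_entropy(stream):
    """Empirical entropy H = -sum (f_i/m) * log2(f_i/m) over item frequencies.

    Naive approach: one counter per distinct item, i.e. O(m log n) bits in
    the worst case, which is what the streaming algorithms try to beat.
    """
    counts = Counter(stream)
    m = sum(counts.values())
    return -sum((f / m) * math.log2(f / m) for f in counts.values())

# Example: entropy of source-IP counts in a toy packet trace
packets = ["10.0.0.1", "10.0.0.1", "10.0.0.2", "10.0.0.3"]
print(empirical_entropy(packets))  # 1.5 bits
```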

Cited by 129 publications (119 citation statements)
References 29 publications
“…Space requirements:
- Naïve: O(m log n)
- Lall et al. [16]: O((1/ε)² log(1/δ) log m log n)
- Bhuvanagiri & Ganguly [3]: O((1/ε³) log(1/δ) log⁵ m)
- Harvey et al. [12]: O((1/ε)² log(1/δ) log m log n log(mn))

While such algorithms are available (see Table I) for estimating entropy, the case for conditional entropy turns out to be different. Indyk and McGregor [13] showed that estimating conditional entropy with ε-multiplicative error requires Ω(m) space, so that no non-trivial savings are possible over naïve computation even with randomized approximation algorithms.…”
Section: Algorithm
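The O((1/ε)² log(1/δ) log m log n) bound attributed to Lall et al. [16] comes from an AMS-style sampling estimator. A simplified single-pass sketch of that idea (the parameter k and the comparison helper are illustrative choices, not the paper's exact construction):

```python
import math
import random
from collections import Counter

def stream_entropy_estimate(stream, k=400, seed=0):
    """Single-pass entropy estimator in the style of Lall et al. [16].

    Runs k independent AMS-style basic estimators. Each reservoir-samples a
    uniformly random stream position and counts occurrences of that item from
    the sampled position onward. (k = 400 is an arbitrary illustrative value;
    the paper takes k = O((1/eps)^2 log(1/delta)) for (eps, delta) guarantees.)
    """
    rng = random.Random(seed)
    samples = [None] * k   # sampled item per estimator
    counts = [0] * k       # occurrences since the sampled position
    m = 0
    for item in stream:
        m += 1
        for j in range(k):
            if rng.random() < 1.0 / m:      # reservoir-sample a position
                samples[j], counts[j] = item, 1
            elif item == samples[j]:
                counts[j] += 1

    def g(c):  # c*log2(c) - (c-1)*log2(c-1), with the c = 1 case being 0
        return c * math.log2(c) - (c - 1) * (math.log2(c - 1) if c > 1 else 0.0)

    # E[g(c)] = (1/m) * sum_i f_i log2(f_i), so H = log2(m) - mean of g(c)
    return math.log2(m) - sum(g(c) for c in counts) / k

def exact_entropy(stream):
    """Exact empirical entropy, for comparison against the estimate."""
    freqs, m = Counter(stream), len(stream)
    return -sum(f / m * math.log2(f / m) for f in freqs.values())
```

On a small synthetic stream such as `["a"] * 50 + ["b"] * 50` (exact entropy 1.0 bit), the estimate concentrates around the true value as k grows.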
“…We now detail the performance in conditional entropy estimation of the two algorithms we consider: the HSS algorithm and Algorithm 1 of Lall et al [16]. Our tests in this section were performed on Trace 1 and follow along the lines of the evaluations in [16].…”
Section: A. Algorithm Performance in Conditional Entropy Estimation
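The naïve conditional-entropy computation that, per the Indyk–McGregor Ω(m) lower bound quoted above, cannot be substantially improved, keeps exact joint frequencies. A small sketch using the identity H(Y|X) = H(X, Y) − H(X) (the flow tuples are toy data, not from either paper):

```python
import math
from collections import Counter

def conditional_entropy(pairs):
    """Naive H(Y|X) = H(X, Y) - H(X) from exact joint counts.

    Stores one counter per distinct (x, y) pair -- the Omega(m) space that
    Indyk and McGregor [13] show cannot be avoided for epsilon-multiplicative
    approximation.
    """
    def H(counts):
        m = sum(counts.values())
        return -sum(f / m * math.log2(f / m) for f in counts.values())
    return H(Counter(pairs)) - H(Counter(x for x, _ in pairs))

# Example: destination-port entropy conditioned on source IP (toy data)
flows = [("10.0.0.1", 80), ("10.0.0.1", 443),
         ("10.0.0.2", 80), ("10.0.0.2", 80)]
print(conditional_entropy(flows))  # 0.5 bits
```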