2012 50th Annual Allerton Conference on Communication, Control, and Computing (Allerton)
DOI: 10.1109/allerton.2012.6483344

Peeling arguments and double hashing

Abstract: The analysis of several algorithms and data structures can be reduced to the analysis of the following greedy "peeling" process: start with a random hypergraph; find a vertex of degree at most k, and remove it and all of its adjacent hyperedges from the graph; repeat until there is no suitable vertex. This specific process finds the k-core of a hypergraph, and variations on this theme have proven useful in analyzing, for example, decoding from low-density parity-check codes, several hash-based data stru…
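The peeling process described in the abstract is simple enough to state in a few lines of code. The sketch below is illustrative only and not taken from the paper; it assumes a hypergraph given as a list of hyperedges, each a set of vertex ids, and repeatedly removes vertices of current degree at most k together with their incident hyperedges.

```python
from collections import defaultdict

def peel(num_vertices, hyperedges, k):
    """Greedy peeling sketch: repeatedly remove a vertex of degree at most k
    and all hyperedges containing it. The surviving vertices all have degree
    greater than k; this is the core the abstract refers to."""
    incident = defaultdict(set)            # vertex -> indices of incident hyperedges
    for e_idx, edge in enumerate(hyperedges):
        for v in edge:
            incident[v].add(e_idx)
    alive_vertices = set(range(num_vertices))
    alive_edges = set(range(len(hyperedges)))
    # stack of vertices whose current degree is at most k
    stack = [v for v in alive_vertices if len(incident[v]) <= k]
    while stack:
        v = stack.pop()
        if v not in alive_vertices:
            continue                       # already peeled
        alive_vertices.discard(v)
        for e_idx in list(incident[v]):
            if e_idx not in alive_edges:
                continue
            alive_edges.discard(e_idx)
            for u in hyperedges[e_idx]:
                if u in alive_vertices:
                    incident[u].discard(e_idx)
                    if len(incident[u]) <= k:
                        stack.append(u)    # u's degree just dropped below the threshold
    return alive_vertices, alive_edges
```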

Cited by 8 publications (11 citation statements) · References 31 publications
“…It has been shown that, even with this more limited randomness, not only does the maximum load remain log log n / log d + O(1) with high probability, but also the asymptotic fraction of bins of each constant load is the same as when the d choices are all perfectly random [21,22]. Similar results have been found for other data structures that use multiple hash functions, including Bloom filters [17] and cuckoo hashing [18,23].…”
Section: Arithmetic Progression Problems (supporting)
confidence: 56%
“…We have shown that the coupling argument of Lueker and Molodowitch can, with some modification of the standard random hashing process, yield results for double hashing with the balanced allocations framework. It is worth considering if this approach could be generalized further to handle other processes, most notably cuckoo hashing and peeling processes, where double hashing similarly seems to have the same performance as random hashing [15]. The challenge here for cuckoo hashing appears to be that the state change on entry of a new key is not limited to a single location; while only one cell in the hash table obtains a key, other cells become potential future recipients of the key if it should move, effectively changing the state of those cells.…”
Section: Discussion (mentioning)
confidence: 99%
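The point about state changes is easiest to see from the insertion routine itself. The sketch below is a generic random-walk cuckoo insertion, not code from [15] or from the coupling argument; `choices(key)` is a hypothetical helper returning a key's candidate cells (for example, produced by double hashing).

```python
import random

def cuckoo_insert(table, key, choices, max_kicks=500):
    """Place `key` in one of its candidate cells, evicting the current occupant
    if necessary and reinserting the evicted key at one of *its* choices.
    An insertion can therefore relocate keys, so its effect is not confined
    to the single cell that finally receives the new key."""
    for _ in range(max_kicks):
        for cell in choices(key):
            if table[cell] is None:
                table[cell] = key
                return True
        # All candidate cells are occupied: evict a random occupant and retry with it.
        cell = random.choice(choices(key))
        table[cell], key = key, table[cell]
    return False  # insertion failed; a full implementation would rehash the table
```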
“…Bachrach and Porat use double hashing in a variant of min-wise independent sketches [2]. Mitzenmacher and Thaler show suggestive preliminary results for double hashing for peeling algorithms and cuckoo hashing [15]. Leconte considers double hashing in the context of the load threshold for cuckoo hashing, and shows that the thresholds are the same if one allows double hashing to fail to place o(n) keys [10].…”
Section: Introduction (mentioning)
confidence: 99%
“…However, if before throwing each ball we check d bins at random and throw the ball into the least loaded bin, then the most loaded bin will only hold about log log n balls [20], which represents an exponential improvement. In the same setup, where bins have no capacity limit, double hashing achieves the same performance [21], [7] as fully random choices. However, in order to guarantee a worst-case constant look-up time, one cannot allow a key to refer to an unbounded number of items, which motivates the hashing setup presented in Section I.…”
Section: Previous Work On Performance Of Hashing Schemes For Load (mentioning)
confidence: 95%
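For reference, the balanced-allocations process this excerpt describes (the "power of d choices") can be sketched as follows. The `choices` hook is a hypothetical parameter for plugging in double-hashed candidates in place of fully random ones; by default d bins are drawn uniformly at random.

```python
import random

def d_choice_allocation(num_balls, num_bins, d, choices=None):
    """Throw num_balls balls into num_bins bins; each ball inspects d candidate
    bins and goes to the least loaded one. With fully random choices the maximum
    load is log log n / log d + O(1) w.h.p.; the cited results indicate that
    double-hashed choices give the same asymptotic load behaviour."""
    loads = [0] * num_bins
    for ball in range(num_balls):
        if choices is None:
            candidates = [random.randrange(num_bins) for _ in range(d)]
        else:
            candidates = choices(ball, d, num_bins)  # e.g. double-hashed choices
        loads[min(candidates, key=lambda b: loads[b])] += 1
    return max(loads)
```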