2016 Proceedings of the Thirteenth Workshop on Analytic Algorithmics and Combinatorics (ANALCO)
DOI: 10.1137/1.9781611974324.1

More Analysis of Double Hashing for Balanced Allocations

Abstract: With double hashing, for a key x, one generates two hash values f(x) and g(x), and then uses the combinations (f(x) + i·g(x)) mod n for i = 0, 1, 2, . . . to generate multiple hash values in the range [0, n − 1] from the initial two. For balanced allocations, keys are hashed into a hash table where each bucket can hold multiple keys, and each key is placed in the least loaded of d choices. It has been shown previously that asymptotically the performance of double hashing and fully random hashing is the …
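The scheme described in the abstract can be sketched in a few lines. This is an illustrative simulation, not code from the paper; all function names are my own. It derives d bucket choices from the two hash values via the arithmetic progression (f(x) + i·g(x)) mod n, then places each key in the least loaded choice (the "balanced allocations" rule). Here n is a power of two and g(x) is forced odd so the d choices are distinct.

```python
# Hedged sketch (not from the paper): double hashing to derive d bucket
# choices from two hash values f(x), g(x), then balanced allocation into
# the least loaded of those choices. Helper names are my own.
import random

def double_hash_choices(f_x, g_x, d, n):
    """Return the d probe locations (f(x) + i*g(x)) mod n for i = 0..d-1."""
    return [(f_x + i * g_x) % n for i in range(d)]

def insert_balanced(loads, f_x, g_x, d):
    """Place one key in the least loaded of its d double-hashed choices."""
    n = len(loads)
    choices = double_hash_choices(f_x, g_x, d, n)
    target = min(choices, key=lambda b: loads[b])
    loads[target] += 1
    return target

# Tiny demo: hash n keys into n buckets with d = 4 choices per key.
# n is a power of two, so an odd g(x) makes the d choices distinct mod n.
random.seed(1)
n, d = 1024, 4
loads = [0] * n
for _ in range(n):
    f_x = random.randrange(n)
    g_x = random.randrange(1, n, 2)  # odd step, invertible mod 2^k
    insert_balanced(loads, f_x, g_x, d)
print("max load:", max(loads))
```

Per the result the abstract summarizes, the maximum load here stays very small (Θ(log log n) in theory), matching what fully random choices would give.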

Cited by 3 publications (3 citation statements) · References 26 publications
“…, a + (d − 1)b modulo n. That is, the d choices are constrained to form an arithmetic progression; as such, only two random numbers modulo n are chosen to determine the choices, instead of d random numbers. It has been shown that, even with this more limited randomness, not only does the maximum load remain log log n/ log d + O(1) with high probability, but that the asymptotic fraction of bins of each constant load is the same as when the d choices are all perfectly random [21,22]. Similar results have been found for other data structures that use multiple hash functions, including Bloom filters [17] and cuckoo hashing [18,23].…”
Section: Arithmetic Progression Problems (supporting, confidence: 55%)
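The claim quoted above — that arithmetic-progression choices match fully random choices both in maximum load and in the fraction of bins of each constant load — is easy to probe empirically. The following is a hedged sketch of such an experiment (my own helper names, not from any cited work); note that with composite n the progression can repeat a bucket, which the least-loaded rule tolerates, whereas an analysis would typically take n prime or b coprime with n.

```python
# Hedged sketch: compare fully random d choices with arithmetic-progression
# (double-hashing-style) choices a, a+b, ..., a+(d-1)b mod n under the
# least-loaded placement rule. Function names are my own.
import random

def place_keys(n, m, d, choices_fn, seed=0):
    """Throw m keys into n buckets, each into the least loaded of d choices."""
    rng = random.Random(seed)
    loads = [0] * n
    for _ in range(m):
        cs = choices_fn(rng, n, d)
        loads[min(cs, key=lambda b: loads[b])] += 1
    return loads

def random_choices(rng, n, d):
    """d independent uniformly random buckets."""
    return [rng.randrange(n) for _ in range(d)]

def progression_choices(rng, n, d):
    """d buckets constrained to an arithmetic progression mod n."""
    a, b = rng.randrange(n), rng.randrange(1, n)
    return [(a + i * b) % n for i in range(d)]

n = 10_000
for name, fn in [("fully random", random_choices),
                 ("arithmetic progression", progression_choices)]:
    loads = place_keys(n, n, 3, fn)
    frac_empty = loads.count(0) / n
    print(f"{name}: max load {max(loads)}, fraction empty {frac_empty:.3f}")
```

In runs like this the two schemes produce essentially the same load profile, consistent with the quoted asymptotic result.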
“…At one extreme is uniform probing, where each probe is to a random location, thus sacrificing locality in the probe sequence [119]. There has been intensive work in analyzing variations on uniform probing, including in the presence of deletions [28,68,70,79,93,94,123,137].…”
Section: Related Work (mentioning, confidence: 99%)
“…Double hashing [21,60,75,85,86,93,143] is a classic alternative to uniform probing, in which a primary hash function determines the first probe and a secondary hash function determines jump size between indices in the probe sequence. Double hashing has been shown to have short probe sequences similar to that of uniform hashing, but like uniform probing, these short probe sequences come with a corresponding loss in locality.…”
Section: Related Work (mentioning, confidence: 99%)
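The open-addressing variant of double hashing described in the quote above — a primary hash fixing the first probe and a secondary hash fixing the jump size — can be sketched as follows. This is a minimal illustration under the standard assumption that the table size n is prime and 1 ≤ h2 < n, which guarantees the probe sequence visits every slot; the names are mine, not from the cited works.

```python
# Minimal sketch of double-hashing open addressing: h1 gives the first
# probe, h2 the jump between successive probes. Assumes n prime and
# 1 <= h2 < n so the sequence covers all n slots.
def probe_sequence(h1, h2, n):
    """Yield the probe sequence h1, h1 + h2, h1 + 2*h2, ... modulo n."""
    for i in range(n):
        yield (h1 + i * h2) % n

def insert(table, key, h1, h2):
    """Insert key at the first empty slot along its probe sequence."""
    n = len(table)
    for slot in probe_sequence(h1, h2, n):
        if table[slot] is None:
            table[slot] = key
            return slot
    raise RuntimeError("table full")

table = [None] * 7                 # n = 7 is prime
insert(table, "a", h1=3, h2=2)     # probes 3 -> placed at slot 3
insert(table, "b", h1=3, h2=2)     # probes 3 (taken), then 5 -> slot 5
print(table)                       # [None, None, None, 'a', None, 'b', None]
```

The locality loss the quote mentions is visible here: successive probes jump by h2 rather than scanning adjacent slots as linear probing would.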