Backyard Cuckoo Hashing: Constant Worst-Case Operations with a Succinct Representation

Author(s):  
Yuriy Arbitman ◽  
Moni Naor ◽  
Gil Segev
2010 ◽  
Vol. 12 no. 3 (Analysis of Algorithms) ◽
Author(s):  
Reinhard Kutzelnigg

Cuckoo hashing is a hash table data structure offering constant access time, even in the worst case. As a drawback, the construction fails with small but practically significant probability. However, Kirsch et al. (2008) showed that a constant-sized additional memory, the so-called stash, is sufficient to reduce the failure rate drastically. So far, however, this has required a modified insertion procedure that demands additional running time to search for an admissible key. As a major contribution of this paper, we show that the same bounds on the failure probability hold even without this search process, and thus the performance increases. Second, we extend the analysis to simplified cuckoo hashing, a variant of the original algorithm offering increased performance. Further, we derive explicit asymptotic approximations for the number of usual and bipartite graphs, respectively, related to the data structures. Using these results, we obtain much more precise asymptotic expansions of the success rate. These calculations are based on a generating function approach and the saddle point method. Finally, we provide numerical results to support the theoretical analysis.
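A minimal sketch of a stash-augmented cuckoo hash set follows (in Python; the class name StashedCuckoo and the table size, eviction bound, and stash capacity are hypothetical illustration choices, not the parameters analyzed in the paper). It shows the simpler insertion procedure discussed above: after a bounded number of evictions, the displaced key is placed directly into the constant-sized stash rather than triggering a rebuild, and lookups scan the stash linearly.

```python
import random

class StashedCuckoo:
    """Illustrative cuckoo hash set with a constant-sized stash (sketch only)."""

    def __init__(self, size=101, max_evictions=32, stash_capacity=4):
        self.size = size
        self.t1 = [None] * size
        self.t2 = [None] * size
        self.max_evictions = max_evictions
        self.stash = []                      # constant-sized overflow area
        self.stash_capacity = stash_capacity
        # Random seeds stand in for two independent hash functions.
        self.seed1 = random.getrandbits(64)
        self.seed2 = random.getrandbits(64)

    def _h1(self, key):
        return hash((self.seed1, key)) % self.size

    def _h2(self, key):
        return hash((self.seed2, key)) % self.size

    def lookup(self, key):
        # Worst-case constant time: two table probes plus a scan of the O(1) stash.
        return (self.t1[self._h1(key)] == key or
                self.t2[self._h2(key)] == key or
                key in self.stash)

    def insert(self, key):
        if self.lookup(key):
            return True
        cur = key
        for i in range(self.max_evictions):
            # Alternate between the two tables, evicting the current occupant if any.
            table = self.t1 if i % 2 == 0 else self.t2
            pos = self._h1(cur) if i % 2 == 0 else self._h2(cur)
            cur, table[pos] = table[pos], cur
            if cur is None:
                return True
        # No admissible key is searched for: the displaced key goes straight
        # into the stash instead of forcing a rehash.
        if len(self.stash) < self.stash_capacity:
            self.stash.append(cur)
            return True
        return False  # stash full: a rebuild of the tables would be needed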


2001 ◽  
Vol 8 (32) ◽  
Author(s):  
Rasmus Pagh ◽  
Flemming Friche Rodler

We present a simple and efficient dictionary with worst-case constant lookup time, equaling the theoretical performance of the classic dynamic perfect hashing scheme of Dietzfelbinger et al. (Dynamic perfect hashing: Upper and lower bounds. SIAM J. Comput., 23(4):738-761, 1994). The space usage is similar to that of binary search trees, i.e., three words per key on average. The practicality of the scheme is backed by extensive experiments and comparisons with known methods, showing it to be quite competitive in the average case as well.
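The worst-case constant lookup of the two-table scheme is easy to see in a sketch (Python; the function cuckoo_lookup, the toy tables, and the hash functions below are illustrative assumptions, not the authors' implementation): each key has exactly one candidate slot in each table, so membership is decided with at most two probes.

```python
# Two-table cuckoo lookup: a query touches at most two cells,
# regardless of how full the tables are.
# t1, t2 are the two tables; h1, h2 are (assumed independent) hash functions.
def cuckoo_lookup(key, t1, t2, h1, h2):
    return t1[h1(key)] == key or t2[h2(key)] == key

# Toy usage (illustrative only):
t1, t2 = [None] * 11, [None] * 11
h1 = lambda k: hash(("a", k)) % 11
h2 = lambda k: hash(("b", k)) % 11
t1[h1("foo")] = "foo"
assert cuckoo_lookup("foo", t1, t2, h1, h2)
assert not cuckoo_lookup("bar", t1, t2, h1, h2)
```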


Author(s):  
J.D. Geller ◽  
C.R. Herrington

The minimum magnification for which an image can be acquired is determined by the design and implementation of the electron optical column and the scanning and display electronics. It is also a function of the working distance and, possibly, the accelerating voltage. For secondary and backscattered electron images there are usually no other limiting factors. However, for x-ray maps there are further considerations. Energy-dispersive x-ray spectrometers (EDS) have a much larger solid angle of detection than WDS. They also do not suffer from Bragg's Law focusing effects, which limit the angular range and focusing distance from the diffracting crystal. In practical terms, EDS maps can be acquired at the lowest magnification of the SEM, assuming the collimator does not cut off the x-ray signal. For WDS, the focusing properties of the crystal limit the angular range of acceptance of the incident x-radiation. The range depends on the 2d spacing of the crystal, with the acceptance angle increasing with 2d spacing. The natural line width of the x-ray also plays a role. For the metal layered crystals used to diffract soft x-rays, such as Be–O, the minimum magnification is approximately 100X. In the worst case, for the LEF crystal, which diffracts Ti–Zn, approximately 1000X is the minimum.


2008 ◽  
Author(s):  
Sonia Savelli ◽  
Susan Joslyn ◽  
Limor Nadav-Greenberg ◽  
Queena Chen

Author(s):  
Akira YAMAWAKI ◽  
Hiroshi KAMABE ◽  
Shan LU