On a variance related to the Ewens sampling formula

2011 ◽  
Vol 16 (4) ◽  
pp. 453-466 ◽  
Author(s):  
Eugenijus Manstavičius ◽  
Žydrūnas Žilinskas

A one-parameter multivariate distribution, called the Ewens sampling formula, was introduced in 1972 to model the mutation phenomenon in genetics. The case discussed in this note goes back to Lynch’s theorem in random binary search tree theory. We examine an additive statistic, a sum of dependent random variables, and derive an upper bound on its variance in terms of the sum of the variances of the summands. The asymptotically best constant in this estimate is established as the dimension increases. The approach is based on approximating the extremal eigenvalues of appropriate integral operators and matrices.
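The Ewens sampling formula can be simulated via the Chinese restaurant process, in which each new element either joins an existing block with probability proportional to that block's size or opens a new block with probability proportional to the parameter θ. The sketch below illustrates that sampling scheme only; it does not reproduce the specific additive statistic or the variance bound studied in the note.

```python
import random

def ewens_sample(n, theta, seed=0):
    """Sample the block sizes of a random partition of n elements via the
    Chinese restaurant process; the resulting block-count distribution
    follows the Ewens sampling formula with parameter theta."""
    rng = random.Random(seed)
    blocks = []  # sizes of the blocks formed so far
    for i in range(n):
        # Element i+1 opens a new block with probability theta / (i + theta),
        # otherwise joins an existing block with probability proportional to its size.
        r = rng.uniform(0.0, i + theta)
        if r < theta:
            blocks.append(1)
        else:
            r -= theta
            for j, size in enumerate(blocks):
                if r < size:
                    blocks[j] += 1
                    break
                r -= size
    return blocks
```

An additive statistic in this setting is then simply a sum of functions of the block sizes, e.g. `sum(f(s) for s in blocks)`.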

1992 ◽  
Vol 21 (409) ◽  
Author(s):  
Joan Boyar ◽  
Kim Skak Larsen

<p>In PODS'91, Nurmi and Soisalon-Soininen presented a new type of binary search tree for databases, which they call a <em>chromatic</em> tree. The aim is to improve runtime performance by allowing a greater degree of concurrency, which, in turn, is obtained by uncoupling updating from rebalancing. This also allows rebalancing to be postponed completely or partially until after peak working hours.</p><p>The advantages of the proposal of Nurmi and Soisalon-Soininen are quite significant, but there are definite problems with it. First, they give no explicit upper bound on the complexity of their algorithm. Second, some of their rebalancing operations can be applied many more times than necessary. Third, some of their operations, when removing one problem, create another.</p><p>We define a new set of rebalancing operations which we prove give rise to at most ⌊log₂(N+1)⌋ - 1 rebalancing operations per insertion and at most ⌊log₂(N+1)⌋ - 2 rebalancing operations per deletion, where N is the maximum size the tree could ever have, given its initial size and the number of insertions performed. Most of these rebalancing operations, in fact, do no restructuring; they simply move weights around. The number of operations which actually change the structure of the tree is at most one per update.</p>
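The bounds stated in the abstract depend only on N, the maximum size the tree could ever reach, i.e. the initial size plus the number of insertions. A minimal sketch of those bounds (the rebalancing operations themselves are not reproduced here):

```python
import math

def max_rebalancing_ops(initial_size, insertions):
    """Per-update upper bounds from the abstract: at most floor(log2(N+1)) - 1
    rebalancing operations per insertion and floor(log2(N+1)) - 2 per deletion,
    where N = initial_size + insertions is the largest possible tree size."""
    N = initial_size + insertions
    per_insertion = math.floor(math.log2(N + 1)) - 1
    per_deletion = math.floor(math.log2(N + 1)) - 2
    return per_insertion, per_deletion
```

For example, a tree that starts empty and receives 15 insertions (N = 15) incurs at most 3 rebalancing operations per insertion and at most 2 per deletion.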


Cryptography ◽  
2021 ◽  
Vol 5 (1) ◽  
pp. 4
Author(s):  
Bayan Alabdullah ◽  
Natalia Beloff ◽  
Martin White

Data security has become crucial to most enterprise and government applications due to the increasing amount of data generated, collected, and analyzed. Many algorithms have been developed to secure data storage and transmission. However, most existing solutions require multi-round functions to prevent differential and linear attacks. This results in longer execution times and greater memory consumption, which are not suitable for large datasets or delay-sensitive systems. To address these issues, this work proposes a novel algorithm that uses, on one hand, the reflection property of a balanced binary search tree data structure to minimize the overhead, and on the other hand, a dynamic offset to achieve a high security level. The performance and security of the proposed algorithm were compared to the Advanced Encryption Standard and Data Encryption Standard symmetric encryption algorithms. The proposed algorithm achieved the lowest running time with comparable memory usage and satisfied the avalanche effect criterion with a score of 50.1%. Furthermore, the randomness of the dynamic offset passed a series of National Institute of Standards and Technology (NIST) statistical tests.
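The avalanche effect criterion mentioned above measures how many output bits flip, on average, when a single input bit is flipped; a score near 50% is ideal. The sketch below shows a generic way to estimate that score. It does not implement the proposed BST-based cipher; SHA-256 is used purely as a stand-in function with known good avalanche behavior.

```python
import hashlib
import random

def sha_stand_in(x):
    """Stand-in 'cipher' for illustration only: hashes a 128-bit input
    to a 256-bit output. The paper's algorithm is not reproduced here."""
    return int.from_bytes(hashlib.sha256(x.to_bytes(16, "big")).digest(), "big")

def avalanche_ratio(cipher, in_bits, out_bits, trials=200, seed=1):
    """Estimate the avalanche effect: the average fraction of the out_bits
    output bits that change when one random input bit is flipped."""
    rng = random.Random(seed)
    flipped = 0
    for _ in range(trials):
        x = rng.getrandbits(in_bits)
        bit = 1 << rng.randrange(in_bits)
        flipped += bin(cipher(x) ^ cipher(x ^ bit)).count("1")
    return flipped / (trials * out_bits)
```

A well-behaved primitive should score close to 0.5 under this estimator, matching the 50.1% figure reported for the proposed algorithm.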


2021 ◽  
Author(s):  
ZEGOUR Djamel Eddine

Abstract Today, Red-Black trees are a popular data structure, typically used to implement dictionaries, associative arrays, and symbol tables within compilers (C++, Java, …) and many other systems. In this paper, we present an improvement to the delete algorithm of this kind of binary search tree. The proposed algorithm is very promising since it colors the tree differently while reducing color changes by about 29%. Moreover, the maintenance operations that re-establish the Red-Black tree balance properties are reduced by about 11%. As a consequence, the proposed algorithm saves about 4% of running time when insert and delete operations are used together, while preserving the search performance of the standard algorithm.
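The maintenance operations referred to above re-establish the two structural invariants of a Red-Black tree: no red node has a red child, and every root-to-leaf path contains the same number of black nodes. The paper's modified delete algorithm is not reproduced here; the sketch below is only a generic checker for those invariants, with nodes represented as `(color, left, right)` tuples and `None` as a (black) leaf.

```python
def check_rb(node):
    """Verify the Red-Black invariants on a tree of (color, left, right)
    tuples, where color is "R" or "B" and None is a black leaf.
    Returns the black height; raises AssertionError on a violation."""
    if node is None:
        return 1  # leaves count as black
    color, left, right = node
    if color == "R":
        # A red node must have black (or leaf) children.
        for child in (left, right):
            assert child is None or child[0] == "B", "red-red violation"
    left_height = check_rb(left)
    right_height = check_rb(right)
    # Both subtrees must contribute the same number of black nodes per path.
    assert left_height == right_height, "unequal black heights"
    return left_height + (1 if color == "B" else 0)
```

A delete algorithm that reduces recolorings must still leave every tree passing such a check; for example, `check_rb(("B", ("R", None, None), ("R", None, None)))` succeeds with black height 2.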


Author(s):  
Chengwen Chris Wang ◽  
Daniel Sleator

2021 ◽  
pp. 143-150
Author(s):  
Tomohiro I ◽  
Robert W. Irving ◽  
Dominik Köppl ◽  
Lorna Love
