Um Limite Superior para a Complexidade do ShellSort (An Upper Bound for the Complexity of ShellSort)

2018
Author(s):
Raquel M. Souza
Fabiano S. Oliveira
Paulo E. D. Pinto

The worst-case time complexity of the ShellSort algorithm is known only for some specific gap sequences (the sequence is a parameter of the algorithm). Relating the algorithm to the concept of the Frobenius number, we present an algorithm for determining the maximum number of comparisons for any given sequence and any array to be sorted. We apply this method, together with empirical determination of complexity, to analyze several sequences whose worst-case complexities are known. We show that the empirical approach recovers the complexities that are known analytically, and we present its results for sequences whose worst-case time complexity is unknown.
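To make the sequence parameter concrete, here is a minimal Python sketch of ShellSort parameterized by its gap sequence and instrumented to count comparisons, in the spirit of the paper's empirical approach (the gap sequence and the reversed test input below are illustrative choices, not the paper's setup):

```python
# ShellSort parameterized by its gap sequence, instrumented to count key
# comparisons. Illustrative sketch: Shell's original sequence and a reversed
# input are used as examples, not as the paper's worst-case construction.
def shellsort(a, gaps):
    """Sort list a in place using the given decreasing gap sequence;
    return the number of key comparisons performed."""
    comparisons = 0
    for gap in gaps:
        for i in range(gap, len(a)):
            key = a[i]
            j = i
            while j >= gap:
                comparisons += 1          # compare key against a[j - gap]
                if a[j - gap] <= key:
                    break
                a[j] = a[j - gap]         # shift the larger element up
                j -= gap
            a[j] = key
    return comparisons

n = 1000
gaps = []                                 # Shell's sequence: n//2, n//4, ..., 1
g = n // 2
while g > 0:
    gaps.append(g)
    g //= 2
data = list(range(n, 0, -1))              # a hard (not provably worst) input
print(shellsort(data, gaps), "comparisons; sorted:", data == sorted(data))
```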

2020
Vol 24 (23)
pp. 17609-17620
Author(s):  
Yurii Nesterov

In this paper, we suggest a new technique for soft clustering of multidimensional data. It is based on a new convex voting model, in which each voter chooses a party with a probability that depends on the divergence between his/her preferences and the position of the party. The parties can react to the results of polls by changing their positions. We prove that under some natural assumptions this system has a unique fixed point, providing a unique solution for soft clustering. The solution of our model can be found either by imitating sequential elections or by directly minimizing a convex potential function. In both cases, the methods converge linearly to the solution. We provide worst-case complexity bounds for our methods. To the best of our knowledge, these are the first polynomial-time complexity results in this field.
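The abstract does not reproduce the model itself, but the fixed-point mechanics can be illustrated with a toy sketch. In the code below the divergence is taken to be squared Euclidean distance, votes follow a softmax rule, and each party reacts to the poll by moving to the weighted mean of its voters; all of these are assumptions made for illustration, and Nesterov's convex model is more general:

```python
# Toy "sequential elections" soft clustering (illustrative assumptions:
# squared-Euclidean divergence, softmax voting, centroid response).
import numpy as np

def soft_cluster(X, k, temp=1.0, iters=200, seed=0):
    """X: (m, d) voter preferences. Returns party positions and vote probs."""
    rng = np.random.default_rng(seed)
    parties = X[rng.choice(len(X), size=k, replace=False)]  # initial positions
    for _ in range(iters):
        # Voter i votes for party j with prob ∝ exp(-||x_i - p_j||^2 / temp).
        d2 = ((X[:, None, :] - parties[None, :, :]) ** 2).sum(axis=2)
        logits = -d2 / temp
        logits -= logits.max(axis=1, keepdims=True)         # numerical stability
        probs = np.exp(logits)
        probs /= probs.sum(axis=1, keepdims=True)
        # Parties react to the poll: move to the weighted mean of their voters.
        weights = probs / probs.sum(axis=0, keepdims=True)
        parties = weights.T @ X
    return parties, probs

rng = np.random.default_rng(1)
X = np.vstack([rng.normal(c, 0.3, size=(50, 2)) for c in ((0, 0), (3, 3))])
centers, votes = soft_cluster(X, k=2)
print(centers)    # converges near the two cluster means
```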


2014
Vol 2014
pp. 1-10
Author(s):  
Niraj Kumar Singh
Soubhik Chakraborty
Dheeresh Kumar Mallick

We present a new and improved worst-case complexity model for quicksort: $y_{\mathrm{worst}}(n, t_d) = b_0 + b_1 n^2 + g(n, t_d) + \varepsilon$, where the left-hand side gives the worst-case time, $n$ is the input size, $t_d$ is the frequency of sample elements, and $g(n, t_d)$ is a function of both the input size $n$ and the parameter $t_d$. The remaining terms, which arise from the linear regression, have their usual meanings. We claim this to be an improvement over the conventional model, namely $y_{\mathrm{worst}}(n) = b_0 + b_1 n + b_2 n^2 + \varepsilon$, which stems from the worst-case $O(n^2)$ complexity of this algorithm.
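As a hedged illustration of fitting such models, the sketch below counts comparisons of a first-pivot quicksort on already-sorted inputs (a classic quadratic worst case) and fits the conventional model $y(n) = b_0 + b_1 n + b_2 n^2$ by least squares; the instrumentation and inputs are illustrative choices, not the authors' experimental setup:

```python
# Count quicksort comparisons on sorted input (worst case for a first-element
# pivot) and fit y(n) = b0 + b1*n + b2*n^2 by least squares. Expect b2 ≈ 0.5,
# since this input forces exactly n(n-1)/2 comparisons.
import numpy as np

def quicksort_comparisons(a):
    """Comparison count of first-pivot quicksort (iterative, explicit stack)."""
    count = 0
    stack = [(0, len(a) - 1)]
    while stack:
        lo, hi = stack.pop()
        if lo >= hi:
            continue
        pivot = a[lo]
        i = lo + 1
        for j in range(lo + 1, hi + 1):   # Lomuto-style partition
            count += 1
            if a[j] < pivot:
                a[i], a[j] = a[j], a[i]
                i += 1
        a[lo], a[i - 1] = a[i - 1], a[lo]
        stack.append((lo, i - 2))
        stack.append((i, hi))
    return count

sizes = np.array([100, 200, 400, 800, 1600], dtype=float)
ys = np.array([quicksort_comparisons(list(range(int(n)))) for n in sizes],
              dtype=float)
X = np.column_stack([np.ones_like(sizes), sizes, sizes ** 2])
b, *_ = np.linalg.lstsq(X, ys, rcond=None)
print("fitted (b0, b1, b2):", b)          # ≈ (0, -0.5, 0.5)
```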


2015
Vol 10 (4)
pp. 699-708
Author(s):  
M. Dodangeh
L. N. Vicente
Z. Zhang

Author(s):  
Federico Della Croce
Bruno Escoffier
Marcin Kamiński
Vangelis Th. Paschos

2011
Vol 03 (04)
pp. 457-471
Author(s):  
B. BALAMOHAN
P. FLOCCHINI
A. MIRI
N. SANTORO

In a network environment supporting mobile entities (called robots or agents), a black hole is a harmful site that destroys any incoming entity without leaving any visible trace. The black hole search problem is the task of a team of k > 1 mobile entities, starting from the same safe location and executing the same algorithm, to determine within finite time the location of the black hole. In this paper, we consider the black hole search problem in asynchronous ring networks of n nodes and focus on time complexity. It is known that any algorithm for black hole search in a ring requires at least 2(n - 2) time in the worst case. The best known algorithm achieves this bound with a team of n - 1 agents, with an average time cost of 2(n - 2), equal to the worst case. In this paper, we first show how the same number of agents, at the cost of two extra time units in the worst case, can solve the problem in only [Formula: see text] time on average. We then prove that the optimal average-case complexity of [Formula: see text] can be achieved without increasing the worst case by using 2(n - 1) agents. Finally, we design an algorithm that achieves both asymptotically optimal worst-case and average-case time complexities while employing an optimal team of k = 2 agents, thus improving on earlier results that required O(n) agents.
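As a purely illustrative toy (synchronous and far simpler than the paper's asynchronous algorithms), the sketch below captures one core intuition behind black hole search in a ring: two agents explore in opposite directions, extending their "safe" frontiers one node at a time, and since at most one of them can vanish into the black hole, the frontier of the vanished agent pinpoints its location:

```python
# Toy synchronous black hole localization in an n-node ring. This is an
# illustration of the frontier idea only, not the paper's algorithm (which
# is asynchronous and relies on more refined techniques).
def locate_black_hole(n, black_hole):
    """Node 0 is the safe home; black_hole is any other node's index."""
    safe_cw, safe_ccw = 0, 0              # last node each agent proved safe
    alive = {"A": True, "B": True}
    while alive["A"] and alive["B"]:
        nxt = (safe_cw + 1) % n           # agent A steps clockwise
        if nxt == black_hole:
            alive["A"] = False            # A vanishes without a trace
        else:
            safe_cw = nxt
        if not alive["A"]:
            break
        nxt = (safe_ccw - 1) % n          # agent B steps counterclockwise
        if nxt == black_hole:
            alive["B"] = False
        else:
            safe_ccw = nxt
    # The hole is the first unexplored node in the dead agent's direction.
    return (safe_cw + 1) % n if not alive["A"] else (safe_ccw - 1) % n

assert all(locate_black_hole(10, b) == b for b in range(1, 10))
```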

