TOWARDS POSSIBLE NON-EXTENSIVE THERMODYNAMICS OF ALGORITHMIC PROCESSING — STATISTICAL MECHANICS OF INSERTION SORT ALGORITHM

2008 ◽  
Vol 19 (09) ◽  
pp. 1443-1458 ◽  
Author(s):  
DOMINIK STRZAŁKA ◽  
FRANCISZEK GRABOWSKI

Tsallis entropy, introduced in 1988, is considered to open new possibilities for constructing a generalized thermodynamical basis for statistical physics, extending classical Boltzmann–Gibbs thermodynamics to nonequilibrium states. Over the last two decades this q-generalized theory has been successfully applied to a considerable number of physically interesting complex phenomena. The authors present a new view of the computational complexity analysis of algorithms, using the example of a possible thermodynamical basis of the sorting process and its dynamical behavior. The classical approach to analyzing the resources needed for algorithmic computation assumes that the contact between the algorithm and the input data stream forms a simple system, because only the worst-case time complexity is considered in order to minimize the dependency on specific instances. The article shows, however, that this process can be governed by long-range dependencies whose thermodynamical basis is expressed by the specific shapes of probability distributions. The classical approach cannot describe all properties of the processes that can appear during algorithmic processing (especially the dynamical behavior of algorithms), even if average-case computational complexity is taken into account. The importance of this problem is still neglected, especially in view of two facts. First, computer systems nowadays also work in an interactive mode, and a proper thermodynamical basis is needed to better understand their possible behavior. Second, from a mathematical point of view computers are Turing machines, but in reality they have physical implementations that need energy for processing, so the problem of entropy production appears. That is why a thermodynamical analysis of the possible behavior of the simple insertion sort algorithm is given here.
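
The abstract does not include code; purely as an illustration, the following minimal Python sketch counts the dominant operations of textbook insertion sort over random permutations, the kind of per-instance data from which empirical distributions of algorithmic "work" could be built. Function and variable names are my own, not the authors'.

```python
import random

def insertion_sort_ops(a):
    """Textbook insertion sort; returns (comparisons, shifts) for one input."""
    a = list(a)
    comparisons = shifts = 0
    for i in range(1, len(a)):
        key = a[i]
        j = i - 1
        while j >= 0:
            comparisons += 1
            if a[j] > key:
                a[j + 1] = a[j]   # shift the larger element one slot to the right
                shifts += 1
                j -= 1
            else:
                break
        a[j + 1] = key
    return comparisons, shifts

# Empirical distribution of comparison counts over random permutations of size n.
n, trials = 64, 10_000
counts = [insertion_sort_ops(random.sample(range(n), n))[0] for _ in range(trials)]
print(min(counts), sum(counts) / trials, max(counts))  # for random inputs the mean is close to n*(n-1)/4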

Author(s):  
Krzysztof A. Sikorski

In this chapter we consider the approximation of fixed points of noncontractive functions with respect to the absolute error criterion. In this case the functions may have multiple fixed points and/or whole manifolds of fixed points. We analyze methods based on sequential function evaluations as information. Simple iteration usually does not converge in this case, and the problem becomes much more difficult to solve. We prove that even in the two-dimensional case the problem has infinite worst-case complexity. This means that no methods exist that solve the problem with arbitrarily small error tolerance for some “bad” functions. In the univariate case the problem is solvable, and a bisection envelope method is optimal. These results are in contrast with the solution under the residual error criterion. The problem then becomes solvable, although with exponential complexity, as outlined in the annotations. Therefore, simplicial and/or homotopy continuation methods, and all methods based on function evaluations, exhibit exponential worst-case cost for solving the problem in the residual sense. These results indicate the need for average-case analysis, since for many test functions the existing algorithms computed ε-approximations at a cost polynomial in 1/ε.
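
As a concrete illustration of the univariate case only (a plain bisection sketch, not the chapter's exact envelope construction), the following Python snippet bisects on g(x) = f(x) - x to approximate a fixed point of a self-map of [0, 1] to within an absolute error tolerance; the names are illustrative.

```python
def fixed_point_bisection(f, lo=0.0, hi=1.0, eps=1e-6):
    """Approximate a fixed point of f: [lo, hi] -> [lo, hi] by bisection on g(x) = f(x) - x.

    For a self-map of [lo, hi] we have g(lo) >= 0 and g(hi) <= 0, so a sign change
    brackets a fixed point; each step halves the interval, hence about
    log2((hi - lo) / eps) evaluations suffice for absolute error eps.
    """
    g = lambda x: f(x) - x
    while hi - lo > 2 * eps:
        mid = 0.5 * (lo + hi)
        if g(mid) >= 0:
            lo = mid      # a fixed point lies in [mid, hi]
        else:
            hi = mid      # a fixed point lies in [lo, mid]
    return 0.5 * (lo + hi)

# Example: f(x) = 1 - x is nonexpansive, so simple iteration oscillates,
# but bisection converges to the fixed point 0.5.
print(fixed_point_bisection(lambda x: 1.0 - x))
```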


Information ◽  
2020 ◽  
Vol 11 (11) ◽  
pp. 506
Author(s):  
Huda Chuangpishit ◽  
Konstantinos Georgiou ◽  
Preeti Sharma

The problem of evacuating two robots from the disk in the face-to-face model was first introduced by Czyzowicz et al. [DISC’2014] and has since been extensively studied, along with many variations, with respect to worst-case analysis. We initiate the study of the same problem with respect to average-case analysis, which is also equivalent to designing randomized algorithms for the problem. In particular, we introduce the constrained optimization problem 2EvacF2F, in which one tries to minimize the average-case cost of an evacuation algorithm given that its worst-case cost does not exceed w. The problem is of special interest for practical applications, since a common objective in search-and-rescue operations is to minimize the average completion time given that a certain worst-case threshold is not exceeded, e.g., for safety or limited-energy reasons. Our main contribution is the design and analysis of families of new parameterized evacuation algorithms which solve 2EvacF2F for every w for which the problem is feasible. Notably, the worst-case analysis of the problem has, since its introduction, relied on technical, computer-assisted numerical calculations following tedious robot-trajectory analysis. Part of our contribution is a novel systematic procedure which, given any evacuation algorithm, derives its worst- and average-case performance in a clean and unified way.
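
The paper's algorithms rely on trajectory analysis not reproduced in the abstract; purely to illustrate the worst-case versus average-case distinction, the following Python sketch evaluates a deliberately naive strategy in which both robots move together to the boundary of the unit disk and walk along it until the exit is found. For that strategy the worst-case cost is 1 + 2π while the average-case cost over a uniformly random exit is 1 + π. This is not one of the paper's algorithms.

```python
import math, random

def naive_evacuation_cost(exit_angle):
    """Both robots walk together from the center to angle 0 on the unit circle,
    then traverse the boundary counterclockwise until they reach the exit."""
    arc = exit_angle % (2 * math.pi)          # boundary distance walked
    return 1.0 + arc                          # radius + arc length

trials = 100_000
costs = [naive_evacuation_cost(random.uniform(0, 2 * math.pi)) for _ in range(trials)]
print("average-case ~", sum(costs) / trials)  # ≈ 1 + π
print("worst-case   =", 1 + 2 * math.pi)      # exit located just behind the starting point
```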


2017 ◽  
Vol 2 (3) ◽  
Author(s):  
Silvio Franz ◽  
Giorgio Parisi ◽  
Maxime Sevelev ◽  
Pierfrancesco Urbani ◽  
Francesco Zamponi

Random constraint satisfaction problems (CSPs) have been studied extensively using statistical physics techniques. They provide a benchmark for studying average-case scenarios instead of worst-case ones. The interplay between the statistical physics of disordered systems and computer science has brought new light to the realm of computational complexity theory by introducing the notion of clustering of solutions, related to replica symmetry breaking. However, the class of problems in which clustering has been studied usually involves discrete degrees of freedom: standard random CSPs are random satisfiability problems (aka disordered Ising models) or random coloring problems (aka disordered Potts models). In this work we instead consider problems that involve continuous degrees of freedom. The simplest prototype of these problems is the perceptron. Here we discuss in detail the full phase diagram of the model. In the regions of parameter space where the problem is non-convex, leading to multiple disconnected clusters of solutions, the solution is critical at the SAT/UNSAT threshold and lies in the same universality class as the jamming transition of soft spheres. We show how the critical behavior at the satisfiability threshold emerges, and we compute the critical exponents associated with the approach to the transition from both the SAT and UNSAT phases. We conjecture that there is a large universality class of non-convex continuous CSPs whose SAT-UNSAT threshold is described by the same scaling solution.
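
As a minimal illustration of a continuous CSP (not the replica computation of the paper), the following Python sketch samples a random perceptron instance with N spherical degrees of freedom and M = αN constraints w · ξ^μ ≥ σ, and checks whether a crude projected-gradient heuristic finds a satisfying w; the variable names and the heuristic solver are assumptions of this sketch.

```python
import numpy as np

def random_perceptron_sat(N=100, alpha=1.0, sigma=0.0, steps=2000, lr=0.05, seed=0):
    """Try to satisfy w·xi_mu / sqrt(N) >= sigma for M = alpha*N random patterns, with |w| = sqrt(N)."""
    rng = np.random.default_rng(seed)
    M = int(alpha * N)
    xi = rng.standard_normal((M, N))          # random Gaussian patterns
    w = rng.standard_normal(N)
    for _ in range(steps):
        w *= np.sqrt(N) / np.linalg.norm(w)   # project back onto the sphere
        gaps = xi @ w / np.sqrt(N) - sigma    # per-constraint gaps
        violated = gaps < 0
        if not violated.any():
            return True                       # all constraints satisfied: a solution was found
        w += lr * xi[violated].sum(axis=0) / np.sqrt(N)  # push w toward the violated constraints
    return False                              # heuristic failed (instance may still be SAT)

# For sigma >= 0 the constraints define a convex region on the sphere;
# the non-convex regime studied in the paper corresponds to sigma < 0.
print(random_perceptron_sat(alpha=0.5), random_perceptron_sat(alpha=3.0))
```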


Author(s):  
Adrijan Božinovski ◽  
George Tanev ◽  
Biljana Stojčevska ◽  
Veno Pačovski ◽  
Nevena Ackovska

This paper presents the time complexity analysis of the Binary Tree Roll algorithm. The time complexity is analyzed theoretically and the results are then confirmed empirically. The theoretical analysis consists of finding recurrence relations for the time complexity and solving them using various methods. The empirical analysis consists of exhaustively testing all trees with a given number of nodes and counting the minimum and maximum numbers of steps necessary to complete the roll algorithm. The time complexity is shown, both theoretically and empirically, to be linear in the best case and quadratic in the worst case, whereas the average case is shown to be dominantly linear for trees with a relatively small number of nodes and dominantly quadratic otherwise.
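
The abstract does not reproduce the recurrences themselves; for intuition only, recurrences of the following shape, with constant work per visited node in the best case and work proportional to the remaining subtree size in the worst case, yield the stated linear and quadratic bounds (assumed shapes, not the paper's exact relations):

```latex
T_{\text{best}}(n)  = T_{\text{best}}(n-1)  + c     \quad\Longrightarrow\quad T_{\text{best}}(n)  = \Theta(n)
T_{\text{worst}}(n) = T_{\text{worst}}(n-1) + c\,n  \quad\Longrightarrow\quad T_{\text{worst}}(n) = \Theta(n^{2})
```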


Author(s):  
Adrijan Božinovski ◽  
George Tanev ◽  
Biljana Stojčevska ◽  
Veno Pačovski ◽  
Nevena Ackovska

This paper presents the space complexity analysis of the Binary Tree Roll algorithm. The space complexity is analyzed theoretically and the results are then confirmed empirically. The theoretical analysis consists of determining the amount of memory occupied during the execution of the algorithm and deriving functions for it, in terms of the number of nodes n of the tree, for the worst- and best-case scenarios. The empirical analysis consists of measuring the maximum and minimum amounts of memory occupied during the execution of the algorithm, for all binary tree topologies with a given number of nodes. The space complexity is shown, both theoretically and empirically, to be logarithmic in the best case and linear in the worst case, whereas the average case is shown to be dominantly logarithmic.
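
Assuming, for illustration only, that the dominant memory use is a recursion stack whose depth is bounded by the tree height h (an assumption of this sketch, not a detail stated in the abstract), the stated bounds follow directly:

```latex
S(n) = \Theta(h), \qquad \lceil \log_{2}(n+1) \rceil \le h \le n,
% a balanced tree gives S_{\text{best}}(n) = \Theta(\log n); a degenerate, list-like tree gives S_{\text{worst}}(n) = \Theta(n).
```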


Author(s):  
A. Kalaivani ◽  
K. Swetha

Sorting is a technique used to arrange data in a specific order. A sorting technique rearranges elements in numerical order, either ascending or descending, or arranges words in alphabetical order. In this paper, we propose an efficient sorting algorithm, the Enhanced Bidirectional Insertion Sort, developed from the insertion sort concept. A comparative analysis is carried out between the proposed Enhanced Bidirectional Insertion Sort algorithm and the selection sort and insertion sort algorithms. Compared to insertion sort, the proposed algorithm performs fewer comparisons and achieves better worst-case and average-case computing time. The proposed algorithm also works efficiently for duplicated elements, which is a further improvement, and the results are demonstrated.
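
The abstract does not give the algorithm's pseudocode; the following Python sketch shows one plausible bidirectional variant of insertion sort, inserting each new element from whichever end of the sorted portion is nearer, purely to illustrate the general idea. It is not claimed to be the authors' exact algorithm.

```python
from collections import deque

def bidirectional_insertion_sort(items):
    """Illustrative bidirectional insertion-sort variant (not the paper's exact algorithm).

    The sorted portion is kept in a deque; each new element is inserted by scanning
    from the nearer end, which tends to reduce the number of comparisons
    relative to always scanning from a single end."""
    if not items:
        return []
    sorted_part = deque([items[0]])
    for x in items[1:]:
        if x <= sorted_part[0]:
            sorted_part.appendleft(x)              # new minimum: O(1)
        elif x >= sorted_part[-1]:
            sorted_part.append(x)                  # new maximum: O(1)
        elif x - sorted_part[0] <= sorted_part[-1] - x:
            i = 0                                  # scan from the left end
            while sorted_part[i] < x:
                i += 1
            sorted_part.insert(i, x)
        else:
            i = len(sorted_part) - 1               # scan from the right end
            while sorted_part[i] > x:
                i -= 1
            sorted_part.insert(i + 1, x)
    return list(sorted_part)

print(bidirectional_insertion_sort([5, 1, 4, 1, 5, 9, 2, 6]))  # duplicates are handled as well
```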


2013 ◽  
Vol 10 (4) ◽  
pp. 1-38
Author(s):  
Dieter Schuller ◽  
Ulrich Lampe ◽  
Julian Eckert ◽  
Ralf Steinmetz ◽  
Stefan Schulte

The challenge of optimally selecting services from a set of functionally appropriate ones under Quality of Service (QoS) constraints, the Service Selection Problem, has been extensively addressed in the literature based on deterministic parameters. In practice, however, QoS parameters rather follow a stochastic distribution. In the work at hand, we present an integrated approach which addresses the Service Selection Problem for complex structured as well as unstructured workflows in conjunction with stochastic QoS parameters. Accounting for penalty costs that accrue due to QoS violations, we perform a worst-case analysis, as opposed to an average-case analysis, with the aim of avoiding additional penalties. Even with such conservative computations, QoS violations due to stochastic QoS behavior may still occur, resulting in potentially severe penalties. Our proposed approach significantly reduces the impact of stochastic QoS behavior on total cost.
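
Purely as an illustration of worst-case versus average-case QoS aggregation (not the paper's optimization model), the following Python sketch brute-forces a small sequential workflow: for each task it considers candidate services whose stochastic response times are given as observed samples, and selects the cheapest combination whose worst-case (maximum-sample) end-to-end response time stays within a deadline. All identifiers and the sample data are illustrative assumptions.

```python
from itertools import product

# Candidate services per sequential task: (cost, list of observed response-time samples).
candidates = [
    [(4.0, [1.0, 1.2, 1.1]), (2.0, [1.0, 2.5, 1.4])],   # task 1
    [(3.0, [0.8, 0.9, 1.0]), (1.0, [0.7, 2.0, 0.9])],   # task 2
]
deadline = 4.0

def select_worst_case(candidates, deadline):
    """Cheapest selection whose worst-case (max-sample) total response time meets the deadline."""
    best = None
    for combo in product(*candidates):
        worst_time = sum(max(samples) for _, samples in combo)
        cost = sum(c for c, _ in combo)
        if worst_time <= deadline and (best is None or cost < best[0]):
            best = (cost, combo, worst_time)
    return best

cost, combo, worst_time = select_worst_case(candidates, deadline)
print(cost, worst_time)  # conservative plan: no penalties as long as the observed worst cases hold
```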

