Optimal Time-Space Trade-Offs for Non-Comparison-Based Sorting

2001 ◽  
Vol 8 (2) ◽  
Author(s):  
Rasmus Pagh ◽  
Jakob Pagter

We study the fundamental problem of sorting n integers of w bits on a unit-cost RAM with word size w, and in particular consider the time-space trade-off (product of time and space in bits) for this problem. For comparison-based algorithms, the time-space complexity is known to be Theta(n^2). A result of Beame shows that the lower bound also holds for non-comparison-based algorithms, but no algorithm has met this bound for time below the comparison-based Omega(n lg n) lower bound.

We show that if sorting within some time bound T~ is possible, then time T = O(T~ + n lg* n) can be achieved with high probability using space S = O(n^2/T + w), which is optimal. Given a deterministic priority queue using amortized time t(n) per operation and space n^O(1), we provide a deterministic algorithm sorting in time T = O(n (t(n) + lg* n)) with S = O(n^2/T + w). Both results require that w <= n^(1-Omega(1)).

Using existing priority queues and sorting algorithms, this implies that we can deterministically sort time-space optimally in time Theta(T) for T >= n (lg lg n)^2, and with high probability for T >= n lg lg n.

Our results imply that recent lower bounds for deciding element distinctness in o(n lg n) time are nearly tight.
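The priority-queue-to-sorting reduction invoked above is the standard one: insert all n keys, then extract the minimum n times, for O(n * t(n)) total time with amortized time t(n) per queue operation. A minimal Python sketch of just this reduction (illustrative only; the paper's space-optimal construction is considerably more involved):

```python
import heapq

def pq_sort(items):
    """Sort via a priority queue: n inserts followed by n extract-mins.

    With amortized time t(n) per queue operation this takes O(n * t(n))
    total -- the shape of the reduction invoked above. A binary heap
    gives t(n) = O(lg n); the paper plugs in faster priority queues.
    """
    pq = []
    for x in items:  # n inserts
        heapq.heappush(pq, x)
    return [heapq.heappop(pq) for _ in range(len(items))]  # n extract-mins

assert pq_sort([4, 1, 3, 2]) == [1, 2, 3, 4]
```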

1998 ◽  
Vol 5 (10) ◽  
Author(s):  
Jakob Pagter ◽  
Theis Rauhe

We study the fundamental problem of sorting in a sequential model of computation, and in particular consider the time-space trade-off (product of time and space) for this problem.

Beame has shown a lower bound of Omega(n^2) for this product, leaving a gap of a logarithmic factor up to the previously best known upper bound of O(n^2 log n) due to Frederickson. Since then, no progress has been made towards tightening this gap.

The main contribution of this paper is a comparison-based sorting algorithm which closes this gap by meeting the lower bound of Beame. The time-space product O(n^2) upper bound holds for the full range of space bounds between log n and n/log n. Hence in this range our algorithm is optimal for comparison-based models as well as for the very powerful general models considered by Beame.
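For intuition about where a near-optimal time-space product comes from, here is a minimal Python sketch of the classic multi-pass selection idea (a textbook baseline in the space-bounded setting, not the algorithm of this paper): with a workspace of s elements, each pass scans the input and emits the next s smallest items, giving time O((n^2/s) log s) in space O(s), i.e. a time-space product of O(n^2 log n), which is the order of the previously best known bound cited above.

```python
import heapq

def multipass_sort(items, s):
    """Output all items in sorted order using O(s) working space.

    Each of the ~n/s passes scans the full input and selects the next
    s smallest elements (ties broken by index so duplicates work),
    costing O(n log s) per pass: time O((n^2/s) log s) in space O(s),
    a time-space product of O(n^2 log n) -- not the optimal O(n^2).
    """
    n = len(items)
    out = []
    last = None  # (value, index) of the largest element output so far
    while len(out) < n:
        # Max-heap (realized by negating both tuple fields) holding the
        # s smallest keys strictly above `last`.
        heap = []
        for i, x in enumerate(items):
            if last is not None and (x, i) <= last:
                continue  # already emitted in an earlier pass
            if len(heap) < s:
                heapq.heappush(heap, (-x, -i))
            elif (x, i) < (-heap[0][0], -heap[0][1]):
                heapq.heapreplace(heap, (-x, -i))  # evict current max
        batch = sorted((-v, -i) for v, i in heap)
        out.extend(v for v, _ in batch)
        last = batch[-1]
    return out

assert multipass_sort([5, 3, 3, 8, 1, 9, 2], s=3) == [1, 2, 3, 3, 5, 8, 9]
```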


2000 ◽  
Vol 7 (11) ◽  
Author(s):  
Jakob Pagter

In this report we study the proof employed by Miklós Ajtai [Determinism versus Non-Determinism for Linear Time RAMs with Memory Restrictions, 31st Symposium on Theory of Computing (STOC), 1999] when proving a non-trivial lower bound in a general model of computation for the Hamming Distance problem: given n elements, decide whether any two of them have "small" Hamming distance. Specifically, Ajtai was able to show that any R-way branching program deciding this problem using time O(n) must use space Omega(n lg n).

We generalize Ajtai's original proof, allowing us to prove a time-space trade-off for deciding the Hamming Distance problem in the R-way branching program model for time between n and alpha n lg n / lg lg n, for some suitable 0 < alpha < 1. In particular we prove that if space is O(n^(1-epsilon)), then time is Omega(n lg n / lg lg n).
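For reference, the decision problem itself is simple to state in code. Here is a naive quadratic-time decider over w-bit words (illustrative only: the threshold d standing in for "small" is a parameter here, and the lower bound concerns far more restricted time and space budgets than this O(n^2) pairwise scan uses):

```python
def has_close_pair(elems, d):
    """Decide whether any two of the given integers (viewed as bit
    vectors) have Hamming distance at most d -- the Hamming Distance
    decision problem studied above, with d standing in for "small".
    Naive O(n^2) pairwise scan, far outside the time bounds at issue.
    """
    for i in range(len(elems)):
        for j in range(i + 1, len(elems)):
            if bin(elems[i] ^ elems[j]).count("1") <= d:
                return True
    return False

assert has_close_pair([0b1010, 0b1000, 0b0101], d=1)  # distance-1 pair exists
assert not has_close_pair([0b0000, 0b1111], d=2)      # distance 4 > 2
```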


Author(s):  
S. Lakshmivarahan ◽  
Sudarshan K. Dhall

Ladner and Fischer [1980] were the first to demonstrate the size vs. depth trade-off in parallel prefix circuits. In this chapter, a lower bound from Snir [1986] on (size + depth) for prefix circuits is first derived, and several designs of parallel prefix circuits with optimal (size + depth) trade-offs are then described. In this section, we derive a lower bound from Snir [1986] on the sum of the depth and size of a class of prefix circuits. Let x = {x_1, x_2, ..., x_N} be a set of N variables. Let D be the domain over which the elements of x take their values, and let F_N = {f_1, f_2, ..., f_N} be a set of functions satisfying the following conditions: (SR1) f_i depends only on the variables x_1, x_2, ..., x_i, and (SR2) for each variable x_i, i = 1, 2, ..., N, there exists a value a_i in D such that the set of functions f_j | x_i = a_i = f_j(x_1, x_2, ..., x_{i-1}, a_i, x_{i+1}, ..., x_j), for j = 1, 2, ..., N, contains a family of (N - 1) functions satisfying these conditions. A family F_N satisfying these two conditions is called a self-reducible family of functions. Clearly, f_i(x_1, x_2, ..., x_i) = x_1 o x_2 o ... o x_i, 1 <= i <= N, satisfies these conditions. The following is an example of a self-reducible family of functions. An associative binary operation o is said to be non-trivial if the following conditions are satisfied: (i) let z = x o y; then z depends on both x and y, that is, o is neither a projection nor a constant operation, and (ii) there exists a (right) unit element e of o such that x = x o e.
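To make the definitions concrete, here is a small Python sketch (illustrative, not from the chapter) of the prefix family f_i = x_1 o x_2 o ... o x_i and of condition (SR2), using the non-trivial associative operation o = + with right unit e = 0:

```python
from itertools import accumulate
from operator import add

def prefix(xs, op=add):
    """All prefixes f_i = x_1 o x_2 o ... o x_i for an associative
    operation op: the canonical self-reducible family. (Sequential
    evaluation; the chapter's parallel prefix circuits compute the
    same N outputs while trading size against depth.)
    """
    return list(accumulate(xs, op))

assert prefix([3, 1, 4, 1]) == [3, 4, 8, 9]

# (SR2) illustrated: fixing x_2 to the unit e = 0 yields, after
# discarding f_2, the (N-1)-function prefix family on the remaining
# variables -- exactly the self-reducibility condition.
with_unit = prefix([3, 0, 4, 1])
assert with_unit[:1] + with_unit[2:] == prefix([3, 4, 1])
```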


2016 ◽  
Vol 3 (5) ◽  
pp. 160087 ◽  
Author(s):  
Robert Francis Lynch

How to optimally allocate time, energy and investment in an effort to maximize one's reproductive success is a fundamental problem faced by all organisms. This effort is complicated when the production of each additional offspring dilutes the total resources available for parental investment. Although a quantity–quality trade-off between producing and investing in offspring has long been assumed in evolutionary biology, testing it directly in humans is difficult, partly owing to the long generation time of our species. Using data from an Icelandic genealogy (Íslendingabók) spanning two centuries, I address this issue and analyse the quantity–quality trade-off in humans. I demonstrate that the primary impact of parents on the fitness of their children is the result of resources and/or investment, but not genes. This effect changes significantly across time in response to environmental conditions. Overall, increasing reproduction has negative fitness consequences for offspring, such that each additional sibling reduces an individual's average lifespan and lifetime reproductive success. This analysis provides insights into the evolutionary conflict between producing and investing in children while also shedding light on some of the causes of the demographic transition.


Microbiology ◽  
2004 ◽  
Vol 150 (8) ◽  
pp. 2751-2760 ◽  
Author(s):  
Jan-Ulrich Kreft

The origin of altruism is a fundamental problem in evolution, and the maintenance of biodiversity is a fundamental problem in ecology. These two problems combine with the fundamental microbiological question of whether it is always advantageous for a unicellular organism to grow as fast as possible. The common basis for these three themes is a trade-off between growth rate and growth yield, which in turn is based on irreversible thermodynamics. The trade-off creates an evolutionary alternative between two strategies: high growth yield at low growth rate versus high growth rate at low growth yield. High growth yield at low growth rate is a case of an altruistic strategy because it increases the fitness of the group by using resources economically at the cost of decreased fitness, or growth rate, of the individual. The group-beneficial behaviour is advantageous in the long term, whereas the high growth rate strategy is advantageous in the short term. Coexistence of species requires differences between their niches, and niche space is typically divided into four ‘axes’ (time, space, resources, predators). This neglects survival strategies based on cooperation, which extend the possibilities of coexistence, arguing for the inclusion of cooperation as the fifth ‘axis’. Here, individual-based model simulations show that spatial structure, as in, for example, biofilms, is necessary for the origin and maintenance of this ‘primitive’ altruistic strategy, and that the common belief that growth rate but not yield decides the outcome of competition is based on chemostat models and experiments. This evolutionary perspective on life in biofilms can explain long-known biofilm characteristics, such as the structural organization into microcolonies, the often-observed lack of mixing among microcolonies, and the shedding of single cells, as promoting the origin and maintenance of the altruistic strategy. Whereas biofilms enrich altruists, enrichment cultures, microbiology's paradigm for isolating bacteria into pure culture, select for the highest growth rate.


2012 ◽  
Vol 11 (3) ◽  
pp. 118-126 ◽  
Author(s):  
Olive Emil Wetter ◽  
Jürgen Wegge ◽  
Klaus Jonas ◽  
Klaus-Helmut Schmidt

In most work contexts, several performance goals coexist, and conflicts and trade-offs between them can occur. Our paper is the first to contrast a dual goal for speed and accuracy with a single goal for speed on the same task. The Sternberg paradigm (Experiment 1, n = 57) and the d2 test (Experiment 2, n = 19) were used as performance tasks. In both experiments, speed measures and errors revealed that dual as well as single goals increase performance by enhancing memory scanning. However, the single speed goal triggered a speed-accuracy trade-off, favoring speed over accuracy, whereas this was not the case with the dual goal. In difficult trials, dual goals slowed down scanning processes again so that errors could be prevented. This new finding is particularly relevant for security domains, where both aspects have to be managed simultaneously.


2019 ◽  
Author(s):  
Anna Katharina Spälti ◽  
Mark John Brandt ◽  
Marcel Zeelenberg

People often have to make trade-offs. We study three types of trade-offs: 1) "secular trade-offs" where no moral or sacred values are at stake, 2) "taboo trade-offs" where sacred values are pitted against financial gain, and 3) "tragic trade-offs" where sacred values are pitted against other sacred values. Previous research (Critcher et al., 2011; Tetlock et al., 2000) demonstrated that tragic and taboo trade-offs are not only evaluated by their outcomes, but also by the time it took to make the choice. We investigate two outstanding questions: 1) whether the effect of decision time differs for evaluations of decisions compared to decision makers, and 2) whether moral contexts are unique in their ability to influence character evaluations through decision process information. In two experiments (total N = 1434) we find that decision time affects character evaluations, but not evaluations of the decision itself. There were no significant differences between tragic trade-offs and secular trade-offs, suggesting that the decision's structure may be more important in evaluations than its moral context. Additionally, the magnitude of the effect suggests that decision time may be of less practical use than expected. We thus urge a closer examination of the processes underlying decision time and its perception.


2019 ◽  
Author(s):  
Kasper Van Mens ◽  
Joran Lokkerbol ◽  
Richard Janssen ◽  
Robert de Lange ◽  
Bea Tiemens

BACKGROUND It remains a challenge to predict which treatment will work for which patient in mental healthcare. OBJECTIVE In this study we compare machine learning algorithms that predict, during treatment, which patients will not benefit from brief mental health treatment, and we present trade-offs that must be considered before an algorithm can be used in clinical practice. METHODS Using an anonymized dataset containing routine outcome monitoring data from a mental healthcare organization in the Netherlands (n = 2,655), we applied three machine learning algorithms to predict treatment outcome. The algorithms were internally validated with cross-validation on a training sample (n = 1,860) and externally validated on an unseen test sample (n = 795). RESULTS The performance of the three algorithms did not differ significantly on the test set. With a default classification cut-off at 0.5 predicted probability, the extreme gradient boosting algorithm showed the highest positive predictive value (PPV) of 0.71 (0.61–0.77), with a sensitivity of 0.35 (0.29–0.41) and an area under the curve of 0.78. A trade-off can be made between PPV and sensitivity by choosing different cut-off probabilities. With a cut-off at 0.63, the PPV increased to 0.87 and the sensitivity dropped to 0.17. With a cut-off at 0.38, the PPV decreased to 0.61 and the sensitivity increased to 0.57. CONCLUSIONS Machine learning can be used to predict treatment outcomes based on routine monitoring data. This allows practitioners to choose their own trade-off between being selective and more certain versus inclusive and less certain.
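To illustrate the cut-off trade-off reported in the results, here is a small Python sketch on synthetic data (the scores and labels below are randomly generated and are not the study's data; only the cut-off values are taken from the abstract):

```python
import numpy as np

def ppv_sensitivity(y_true, p_pred, cutoff):
    """PPV and sensitivity of the rule 'flag a patient as non-benefiting
    iff the predicted probability >= cutoff' -- the knob the study turns.
    """
    flagged = p_pred >= cutoff
    tp = np.sum(flagged & (y_true == 1))
    ppv = tp / max(np.sum(flagged), 1)               # how often a flag is right
    sensitivity = tp / max(np.sum(y_true == 1), 1)   # how many cases are caught
    return ppv, sensitivity

rng = np.random.default_rng(0)
y = rng.integers(0, 2, 500)                              # made-up outcomes
p = np.clip(0.3 * y + rng.uniform(0, 0.7, 500), 0, 1)    # made-up model scores
for c in (0.38, 0.50, 0.63):                             # cut-offs from the abstract
    ppv, sens = ppv_sensitivity(y, p, c)
    print(f"cutoff={c:.2f}  PPV={ppv:.2f}  sensitivity={sens:.2f}")
# Raising the cut-off makes flags more reliable (higher PPV) but
# catches fewer true cases (lower sensitivity), and vice versa.
```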

