Average Case Analysis
Recently Published Documents


TOTAL DOCUMENTS: 209 (five years: 10)
H-INDEX: 20 (five years: 1)

2021, Vol. 22 No. 3, Computational... (Special issues)
Author(s): Katarzyna Grygiel, Isabella Larcher

In this paper we present an average-case analysis of closed lambda terms with restricted values of De Bruijn indices, in the model where each occurrence of a variable contributes one to the size. Given a fixed integer k, a lambda term in which all De Bruijn indices are bounded by k has the following shape: it starts with k De Bruijn levels, forming the so-called hat of the term, to which some number of k-colored Motzkin trees are attached. By means of analytic combinatorics, we show that the size of this hat is constant on average and that the average number of De Bruijn levels of k-colored Motzkin trees of size n is asymptotically Θ(√n). Combining these two facts, we conclude that the maximal non-empty De Bruijn level in a size-n lambda term with bounded De Bruijn indices is, on average, also of order √n. On this basis, we provide the average unary profile of such lambda terms.
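To make the objects concrete, here is a minimal sketch of closed lambda terms in De Bruijn notation under the size model in which every variable occurrence contributes one. The constructor names and the convention that abstractions and applications also count as one node each are illustrative assumptions, not taken from the paper.

```python
# Illustrative sketch: De Bruijn-indexed lambda terms, a size measure in which
# every variable occurrence contributes one (abstractions and applications are
# also counted as one node each -- an assumed convention), and checks for
# closedness and for all indices being bounded by a fixed k.

from dataclasses import dataclass
from typing import Union

@dataclass
class Var:          # De Bruijn index, counted from 1
    index: int

@dataclass
class Abs:          # lambda abstraction
    body: "Term"

@dataclass
class App:          # application
    fun: "Term"
    arg: "Term"

Term = Union[Var, Abs, App]

def size(t: Term) -> int:
    """One per variable occurrence, abstraction, and application."""
    if isinstance(t, Var):
        return 1
    if isinstance(t, Abs):
        return 1 + size(t.body)
    return 1 + size(t.fun) + size(t.arg)

def indices_bounded_by(t: Term, k: int) -> bool:
    """True if every De Bruijn index occurring in t is at most k."""
    if isinstance(t, Var):
        return t.index <= k
    if isinstance(t, Abs):
        return indices_bounded_by(t.body, k)
    return indices_bounded_by(t.fun, k) and indices_bounded_by(t.arg, k)

def is_closed(t: Term, depth: int = 0) -> bool:
    """True if no index points above the enclosing abstractions."""
    if isinstance(t, Var):
        return t.index <= depth
    if isinstance(t, Abs):
        return is_closed(t.body, depth + 1)
    return is_closed(t.fun, depth) and is_closed(t.arg, depth)

# Example: \x. \y. x y  ==  Abs(Abs(App(Var(2), Var(1))))
term = Abs(Abs(App(Var(2), Var(1))))
print(size(term), is_closed(term), indices_bounded_by(term, 2))  # 5 True True
```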


Information, 2020, Vol. 11 (11), pp. 506
Author(s): Huda Chuangpishit, Konstantinos Georgiou, Preeti Sharma

The problem of evacuating two robots from the disk in the face-to-face model was first introduced by Czyzowicz et al. [DISC'2014] and has since been extensively studied (along with many variations) with respect to worst-case analysis. We initiate the study of the same problem with respect to average-case analysis, which is equivalent to designing randomized algorithms for the problem. In particular, we introduce the constrained optimization problem 2EvacF2F, in which one minimizes the average-case cost of an evacuation algorithm subject to its worst-case cost not exceeding a bound w. The problem is of special interest for practical applications, since a common objective in search-and-rescue operations is to minimize the average completion time given that a certain worst-case threshold is not exceeded, e.g., for safety or limited-energy reasons. Our main contribution is the design and analysis of families of new parameterized evacuation algorithms that solve 2EvacF2F for every w for which the problem is feasible. Notably, the worst-case analysis of the problem has, since its introduction, relied on technical, computer-assisted numerical calculations following tedious robot-trajectory analysis. Part of our contribution is a novel systematic procedure which, given any evacuation algorithm, derives its worst- and average-case performance in a clean and unified way.
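As a rough illustration of the two objectives involved (this is not the authors' systematic procedure), the toy sketch below estimates the worst-case and average-case cost of a fixed evacuation strategy by discretizing the exit's position on the unit circle; the cost function is a hypothetical placeholder and the exit is assumed uniform on the boundary.

```python
# Toy numerical sketch: worst-case and average-case evacuation cost of a fixed
# algorithm, obtained by sampling the exit's angular position on the circle.
# toy_cost is a hypothetical stand-in, not a real evacuation trajectory.

import math

def toy_cost(theta: float) -> float:
    # Hypothetical: 1 (reach the boundary) plus an arc-dependent search term.
    return 1.0 + min(theta, 2.0 * math.pi - theta) / 2.0

def worst_and_average(cost, samples: int = 100_000):
    thetas = [2.0 * math.pi * i / samples for i in range(samples)]
    costs = [cost(t) for t in thetas]
    return max(costs), sum(costs) / samples

w_max, w_avg = worst_and_average(toy_cost)
print(f"worst-case ~ {w_max:.4f}, average-case ~ {w_avg:.4f}")

# In 2EvacF2F one would minimize the average-case value over a parameterized
# family of algorithms, subject to the worst-case value not exceeding a budget w.
```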


Author(s): Iftikhar Ahmad, Marcus Pirron, Günter Schmidt

Since its introduction in 1985, competitive analysis has been a widely used tool for measuring the performance of online algorithms. Despite its simplicity and popularity, competitive analysis has its own set of drawbacks, which led to the development of other performance measures. However, these measures have seldom been applied to problems in other domains. Recently, Boyar et al. (A comparison of performance measures via online search, Theoretical Computer Science, 2014) studied the online search problem using various performance measures for non-preemptive algorithms. We extend this work by considering preemptive threat-based algorithms and evaluating them using competitive analysis, bijective analysis, average-case analysis, and relative interval analysis. For competitive analysis and average-case analysis, our findings contrast with those of Boyar et al., whereas for bijective and relative interval analysis our findings complement theirs.
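For readers unfamiliar with the setting, here is a small sketch of worst-case versus average-case evaluation on online search. It uses the classical non-preemptive reservation-price rule, not the preemptive threat-based algorithms studied in the paper, and the price range and distribution are assumptions made for illustration.

```python
# Illustrative sketch: the reservation-price rule for online search (accept the
# first price >= sqrt(m*M)), its worst-case competitive ratio sqrt(M/m), and a
# Monte Carlo estimate of its average-case ratio on random price sequences.
# All distributional choices here are illustrative assumptions.

import math
import random

def reservation_price_search(prices, m, M):
    """Accept the first price >= sqrt(m*M); otherwise take the last price."""
    threshold = math.sqrt(m * M)
    for p in prices[:-1]:
        if p >= threshold:
            return p
    return prices[-1]

def average_case_ratio(m=1.0, M=100.0, n=20, trials=10_000, seed=0):
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        prices = [rng.uniform(m, M) for _ in range(n)]
        total += max(prices) / reservation_price_search(prices, m, M)
    return total / trials

print("worst-case competitive ratio:", math.sqrt(100.0 / 1.0))  # sqrt(M/m) = 10
print("estimated average-case ratio:", round(average_case_ratio(), 3))
```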


2019, Vol. 29 (6), pp. 1335-1351
Author(s): C. J. Oates, T. J. Sullivan

This article attempts to place the emergence of probabilistic numerics as a mathematical–statistical research field within its historical context and to explore how its gradual development can be related both to applications and to a modern formal treatment. We highlight in particular the parallel contributions of Sul'din and Larkin in the 1960s and how their pioneering early ideas have reached a degree of maturity in the intervening period, mediated by paradigms such as average-case analysis and information-based complexity. We provide a subjective assessment of the state of research in probabilistic numerics and highlight some difficulties to be addressed by future work.


Author(s): Yansong Gao, Jie Zhang

The fundamental assignment problem seeks welfare-maximizing mechanisms for allocating indivisible items to self-interested agents who report private preferences over the items. The mainstream mechanism, Random Priority, is asymptotically the best mechanism for this purpose when its welfare is compared to the optimal social welfare using the canonical worst-case approximation ratio. Surprisingly, the efficiency loss indicated by the worst-case ratio does not have a constant bound [FFZ:14]. Recently, [DBLP:conf/mfcs/DengG017] showed that when the agents' preferences are drawn from a uniform distribution, the average-case approximation ratio of Random Priority is upper bounded by 3.718. They left open whether a constant ratio holds for general scenarios. In this paper, we answer this question affirmatively by showing that the ratio is bounded by 1/μ when the preference values are independent and identically distributed random variables, where μ is the expectation of the value distribution. This upper bound also improves the result of [DBLP:conf/mfcs/DengG017] for the uniform distribution. Moreover, under mild conditions, the ratio has a constant bound for any independent random values. En route to these results, we develop tools showing that, for most valuation inputs, the efficiency loss is small.
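The following Monte Carlo sketch illustrates the quantity being bounded: it runs Random Priority (random serial dictatorship) on i.i.d. uniform valuations and compares its expected welfare with the expected optimal welfare. The ratio-of-expectations reading of the average-case approximation ratio, the instance size, and the trial count are illustrative assumptions, not the paper's exact setup.

```python
# Illustrative Monte Carlo sketch: Random Priority (random serial dictatorship)
# on i.i.d. uniform valuations, compared against the optimal assignment. The
# reported number is (expected optimal welfare) / (expected mechanism welfare),
# one common reading of the average-case approximation ratio.

import itertools
import random

def random_priority(values, rng):
    """Agents arrive in a random order; each takes its best remaining item."""
    n = len(values)
    remaining = set(range(n))
    welfare = 0.0
    for agent in rng.sample(range(n), n):
        item = max(remaining, key=lambda j: values[agent][j])
        welfare += values[agent][item]
        remaining.remove(item)
    return welfare

def optimal_welfare(values):
    """Brute-force maximum-weight assignment (fine for small n)."""
    n = len(values)
    return max(sum(values[i][perm[i]] for i in range(n))
               for perm in itertools.permutations(range(n)))

def estimate_ratio(n=5, trials=5_000, seed=0):
    rng = random.Random(seed)
    opt_total = rp_total = 0.0
    for _ in range(trials):
        values = [[rng.random() for _ in range(n)] for _ in range(n)]
        opt_total += optimal_welfare(values)
        rp_total += random_priority(values, rng)
    return opt_total / rp_total

print("estimated average-case ratio (uniform values, n=5):",
      round(estimate_ratio(), 3))
```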

