Correction to "Sensitivity analysis for matched pair analysis of binary data: From worst case to average case analysis," by Raiden Hasegawa and Dylan Small; 73, 1424–1432, December 2017

Biometrics ◽  
2019 ◽  
Vol 75 (1) ◽  
pp. 355-355


Information ◽
2020 ◽  
Vol 11 (11) ◽  
pp. 506
Author(s):  
Huda Chuangpishit ◽  
Konstantinos Georgiou ◽  
Preeti Sharma

The problem of evacuating two robots from the disk in the face-to-face model was first introduced by Czyzowicz et al. [DISC’2014] and has been extensively studied (along with many variations) ever since with respect to worst-case analysis. We initiate the study of the same problem with respect to average-case analysis, which is also equivalent to designing randomized algorithms for the problem. In particular, we introduce the constrained optimization problem 2EvacF2F, in which one seeks to minimize the average-case cost of an evacuation algorithm given that its worst-case cost does not exceed a bound w. The problem is of special interest for practical applications, since a common objective in search-and-rescue operations is to minimize the average completion time subject to a worst-case threshold that must not be exceeded, e.g., for safety or limited-energy reasons. Our main contribution is the design and analysis of families of new parameterized evacuation algorithms that solve 2EvacF2F for every w for which the problem is feasible. Notably, the worst-case analysis of the problem has, since its introduction, relied on technical, computer-assisted numerical calculations following tedious analysis of robot trajectories. Part of our contribution is a novel systematic procedure which, given any evacuation algorithm, derives its worst- and average-case performance in a clean and unified way.
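
For intuition, the kind of evaluation the authors automate can be approximated numerically. The sketch below is not the paper's unified procedure, nor one of its parameterized families; it is a minimal simulation of the baseline face-to-face evacuation algorithm of Czyzowicz et al. (both robots move together to the boundary, walk along it in opposite directions, and the finder cuts across a chord to intercept its partner), taking the worst case as the maximum over exit positions and the average case under a uniformly random exit.

```python
import math

def evacuation_time(theta, n_steps=60):
    """Evacuation time of the baseline two-robot algorithm for an exit at
    angle theta in [0, pi]; the other half of the disk is symmetric.

    Both robots move from the center to the boundary point at angle 0
    (1 time unit), then walk along the boundary in opposite directions.
    The finder reaches the exit at time 1 + theta and cuts across a chord
    to intercept its partner; both then return along that chord.
    """
    t_find = 1.0 + theta
    # The interception delay tau solves: chord distance from the exit to
    # the partner's position after tau more time units equals tau.
    # f is non-increasing in tau, so bisection on [0, 2*pi - 2*theta] works.
    def f(tau):
        return 2.0 * math.sin((2.0 * theta + tau) / 2.0) - tau
    lo, hi = 0.0, 2.0 * math.pi - 2.0 * theta
    for _ in range(n_steps):
        mid = (lo + hi) / 2.0
        if f(mid) > 0.0:
            lo = mid
        else:
            hi = mid
    tau = (lo + hi) / 2.0
    # The finder meets its partner at t_find + tau; both then walk the
    # chord of length tau back to the exit.
    return t_find + 2.0 * tau

# Worst case: maximum over exit positions.  Average case: mean over a
# uniform grid on [0, pi], which by symmetry equals the mean over a
# uniformly random exit on the whole boundary.
N = 10_000
times = [evacuation_time(math.pi * i / N) for i in range(N + 1)]
print(f"worst case   ~ {max(times):.4f}")
print(f"average case ~ {sum(times) / len(times):.4f}")
```

The printed worst case should land near the ≈5.74 bound reported for this baseline algorithm; 2EvacF2F then asks for trajectories that trade a larger (but still bounded by w) worst case for a smaller average.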


2013 ◽  
Vol 10 (4) ◽  
pp. 1-38
Author(s):  
Dieter Schuller ◽  
Ulrich Lampe ◽  
Julian Eckert ◽  
Ralf Steinmetz ◽  
Stefan Schulte

The challenge of optimally selecting services from a set of functionally appropriate ones under Quality of Service (QoS) constraints – the Service Selection Problem – has been extensively addressed in the literature based on deterministic parameters. In practice, however, QoS parameters follow a stochastic distribution. In the work at hand, we present an integrated approach that addresses the Service Selection Problem for complex structured as well as unstructured workflows in conjunction with stochastic QoS parameters. Accounting for the penalty costs that accrue due to QoS violations, we perform a worst-case rather than an average-case analysis, with the aim of avoiding additional penalties. Even with such conservative computations, QoS violations due to stochastic QoS behavior may still occur, resulting in potentially severe penalties. Our proposed approach significantly reduces the impact of stochastic QoS behavior on total cost.
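
The worst-case stance the abstract describes can be made concrete with a toy model. The sketch below is not the authors' formulation; the task names, numbers, and the mean-plus-three-sigma worst-case proxy are all illustrative assumptions. It selects one candidate service per task of a sequential workflow, minimizing total cost subject to a conservative end-to-end latency bound derived from each service's stochastic latency.

```python
import itertools

# Candidate services per workflow task: (cost, mean_latency_ms, latency_std_ms).
# All names and numbers are made up for illustration.
CANDIDATES = {
    "billing":  [(4.0, 120, 15), (2.5, 150, 40), (6.0, 100, 5)],
    "shipping": [(3.0, 200, 30), (5.0, 160, 10)],
    "notify":   [(1.0,  50, 20), (2.0,  40, 5)],
}

SLA_MS = 400  # end-to-end latency bound for the sequential workflow

def worst_case_latency(service):
    """Conservative worst-case proxy: mean plus three standard deviations."""
    _, mean, std = service
    return mean + 3 * std

best = None
for combo in itertools.product(*CANDIDATES.values()):
    # Sequential workflow, so per-task latencies add up.
    if sum(worst_case_latency(s) for s in combo) > SLA_MS:
        continue  # would risk SLA penalties even under the conservative estimate
    cost = sum(s[0] for s in combo)
    if best is None or cost < best[0]:
        best = (cost, combo)

if best:
    print(f"cheapest SLA-safe selection costs {best[0]}: {best[1]}")
else:
    print("no selection satisfies the worst-case SLA bound")
```

Brute-force enumeration is exponential in the number of tasks; approaches in this literature typically cast the selection as an integer program instead, but the trade-off is the same: a conservative latency estimate sacrifices some cost optimality to reduce expected penalties.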


2018 ◽  
Vol 25 (1) ◽  
pp. 123-134
Author(s):  
Nodari Vakhania

The computational complexity of an algorithm is traditionally measured for the worst and the average case. The worst-case estimate guarantees a certain worst-case behavior of a given algorithm, but it may be rough, since on "most instances" the algorithm may perform significantly better. The probabilistic average-case analysis claims to derive the average performance of an algorithm, say, for an "average instance" of the problem in question. That instance, however, may be far from the average of the problem instances arising in a given real-life application, and so the average-case analysis may likewise provide an unrealistic estimate. We suggest that, in general, a wider use of probabilistic models could allow a more accurate estimation of algorithm efficiency. For instance, the quality of the solutions delivered by an approximation algorithm may also be estimated in the "average" probabilistic case. Such an approach would estimate the quality of the solutions the algorithm delivers for the problem instances most common in a given application. As we illustrate, probabilistic modeling can also be used to derive an accurate time complexity performance measure, distinct from the traditional probabilistic average-case time complexity measure. Such an approach can be useful, in particular, when the traditional average-case estimate is still rough or cannot be obtained at all.
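
As a concrete illustration of estimating an approximation algorithm's solution quality on the instances common to an application rather than in the worst case, the sketch below (not from the paper; the instance distribution and all parameters are assumptions) compares Graham's LPT scheduling heuristic, whose classical worst-case guarantee is 4/3 − 1/(3m) times optimal, against its empirically observed quality on randomly drawn instances.

```python
import heapq
import random

def lpt_makespan(jobs, m):
    """Makespan of Graham's LPT heuristic on m identical machines:
    sort jobs by decreasing size, always assign to the least-loaded machine."""
    loads = [0.0] * m
    heapq.heapify(loads)
    for p in sorted(jobs, reverse=True):
        heapq.heapreplace(loads, loads[0] + p)  # add job to the min-load machine
    return max(loads)

def makespan_lower_bound(jobs, m):
    # OPT is at least the largest job and at least the average machine load.
    return max(max(jobs), sum(jobs) / m)

random.seed(1)
m, n, trials = 5, 40, 2000
ratios = []
for _ in range(trials):
    # Hypothetical application-specific instance distribution (an assumption):
    # exponentially distributed job sizes.
    jobs = [random.expovariate(1.0) for _ in range(n)]
    ratios.append(lpt_makespan(jobs, m) / makespan_lower_bound(jobs, m))

# Ratios are taken against a lower bound on OPT, so on each instance they
# upper-bound the true approximation ratio.
print(f"worst-case guarantee (4/3 - 1/(3m)): {4/3 - 1/(3*m):.3f}")
print(f"observed average ratio: {sum(ratios) / trials:.3f}")
print(f"observed maximum ratio: {max(ratios):.3f}")
```

On such distributions the observed ratios typically sit far below the worst-case guarantee, which is exactly the gap between worst-case estimation and an application-specific probabilistic estimate that the abstract argues for.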

