Optimizing Complex Service-Based Workflows for Stochastic QoS Parameters

2013 ◽  
Vol 10 (4) ◽  
pp. 1-38
Author(s):  
Dieter Schuller ◽  
Ulrich Lampe ◽  
Julian Eckert ◽  
Ralf Steinmetz ◽  
Stefan Schulte

The challenge of optimally selecting services from a set of functionally appropriate ones under Quality of Service (QoS) constraints – the Service Selection Problem – has been extensively addressed in the literature based on deterministic parameters. In practice, however, QoS parameters rather follow a stochastic distribution. In the work at hand, we present an integrated approach which addresses the Service Selection Problem for complex structured as well as unstructured workflows in conjunction with stochastic QoS parameters. Accounting for penalty costs which accrue due to QoS violations, we perform a worst-case analysis, as opposed to an average-case analysis, with the aim of avoiding additional penalties. Even with such conservative computations, QoS violations due to stochastic QoS behavior may still occur and result in potentially severe penalties. Our proposed approach significantly reduces this impact of stochastic QoS behavior on total cost.
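The contrast between average-case and worst-case (quantile-based) QoS planning can be illustrated with a minimal Monte Carlo sketch. The lognormal response-time model, the penalty value, and all parameters below are assumptions for illustration, not data from the paper:

```python
import math
import random

random.seed(0)

# Assumed stochastic QoS model: lognormal response time for one service.
def sample_response_time(mu=0.0, sigma=0.5):
    return math.exp(random.gauss(mu, sigma))

N = 100_000
samples = sorted(sample_response_time() for _ in range(N))

mean_rt = sum(samples) / N        # average-case estimate
q99_rt = samples[int(0.99 * N)]   # conservative, worst-case-style estimate

PENALTY = 10.0  # assumed penalty cost per SLA violation

# Expected penalty cost if the SLA threshold is set from each estimate:
for label, threshold in [("mean", mean_rt), ("99% quantile", q99_rt)]:
    p_violation = sum(1 for s in samples if s > threshold) / N
    print(f"SLA at {label:>12}: P(violation)={p_violation:.3%}, "
          f"expected penalty={PENALTY * p_violation:.3f}")
```

Promising the high quantile sharply reduces the violation probability, and hence the expected penalty cost, which is the effect the paper's worst-case analysis exploits.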

Information ◽  
2020 ◽  
Vol 11 (11) ◽  
pp. 506
Author(s):  
Huda Chuangpishit ◽  
Konstantinos Georgiou ◽  
Preeti Sharma

The problem of evacuating two robots from the disk in the face-to-face model was first introduced by Czyzowicz et al. [DISC'2014] and has been extensively studied (along with many variations) ever since with respect to worst-case analysis. We initiate the study of the same problem with respect to average-case analysis, which is equivalent to designing randomized algorithms for the problem. In particular, we introduce the constrained optimization problem 2EvacF2F, in which one tries to minimize the average-case cost of an evacuation algorithm given that its worst-case cost does not exceed w. The problem is of special interest for practical applications, since a common objective in search-and-rescue operations is to minimize the average completion time given that a certain worst-case threshold is not exceeded, e.g., for safety or limited-energy reasons. Our main contribution is the design and analysis of families of new parameterized evacuation algorithms which solve 2EvacF2F for every w for which the problem is feasible. Notably, the worst-case analysis of the problem has, since its introduction, relied on technical, computer-assisted numerical calculations following tedious robot-trajectory analysis. Part of our contribution is a novel systematic procedure which, given any evacuation algorithm, derives its worst- and average-case performance in a clean and unified way.
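For intuition, the sketch below numerically evaluates the classic baseline face-to-face strategy from the original line of work: both robots move together to the boundary, walk in opposite directions, and the finder chases the other robot along a straight chord. This is my reconstruction of that baseline, not one of the paper's new algorithms; its worst- and average-case costs over the exit position fall out of the same computation:

```python
import math

def chase_time(x):
    """Extra time d for the finder (at arc distance x from the landing
    point) to intercept the other robot along a straight chord.
    Solves d = 2*sin(x + d/2); f(d) = 2*sin(x + d/2) - d is nonincreasing
    (f'(d) = cos(x + d/2) - 1 <= 0), so bisection on the sign change is safe."""
    f = lambda d: 2.0 * math.sin(x + d / 2.0) - d
    lo, hi = 0.0, 2.0 * math.pi   # f(lo) >= 0 > f(hi)
    for _ in range(60):
        mid = 0.5 * (lo + hi)
        lo, hi = (mid, hi) if f(mid) >= 0.0 else (lo, mid)
    return 0.5 * (lo + hi)

def evacuation_time(x):
    # 1 (reach the boundary) + x (walk to the exit) + d (chase) + d (return).
    d = chase_time(x)
    return 1.0 + x + 2.0 * d

# By symmetry, the exit's arc distance x from the landing point is
# uniform on [0, pi].
K = 20_000
costs = [evacuation_time(math.pi * i / K) for i in range(K + 1)]
print(f"worst-case cost  : {max(costs):.4f}")   # ~5.74 for this baseline
print(f"average-case cost: {sum(costs) / len(costs):.4f}")
```

The same grid evaluation gives both performance measures at once, which is the spirit of the unified procedure the abstract describes.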


2000 ◽  
Vol 8 (3) ◽  
pp. 291-309 ◽  
Author(s):  
Alberto Bertoni ◽  
Marco Carpentieri ◽  
Paola Campadelli ◽  
Giuliano Grossi

In this paper, a genetic model based on the operations of recombination and mutation is studied and applied to combinatorial optimization problems. The results are as follows. First, the equations of the deterministic dynamics in the thermodynamic limit (infinite populations) are derived and, for a sufficiently small mutation rate, the attractors are characterized. Second, a general approximation algorithm for combinatorial optimization problems is designed. The algorithm is applied to the Max Ek-Sat problem, and the quality of the solution is analyzed: it is proved to be optimal for k ≥ 3 with respect to worst-case analysis, and for Max E3-Sat the average-case performance is experimentally compared with that of other optimization techniques.
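The algorithmic ingredients named in the abstract (recombination, mutation, fitness = number of satisfied clauses) can be made concrete with a small finite-population sketch for Max-E3-Sat. This is an illustrative toy, not the paper's infinite-population model; the instance and all parameters are made up:

```python
import random

random.seed(1)

# Random Max-E3-Sat instance: m clauses over n variables; a literal is
# a (variable, polarity) pair. Illustrative instance only.
n, m = 30, 130
clauses = [[(random.randrange(n), random.random() < 0.5) for _ in range(3)]
           for _ in range(m)]

def fitness(assign):
    """Number of satisfied clauses."""
    return sum(any(assign[v] == pol for v, pol in cl) for cl in clauses)

def recombine(a, b):
    """Uniform recombination: each bit taken from either parent."""
    return [random.choice(pair) for pair in zip(a, b)]

def mutate(assign, rate=0.01):
    """Flip each bit independently with the given mutation rate."""
    return [bit ^ (random.random() < rate) for bit in assign]

pop = [[random.random() < 0.5 for _ in range(n)] for _ in range(60)]
for gen in range(200):
    pop.sort(key=fitness, reverse=True)
    elite = pop[:20]   # truncation selection
    pop = elite + [mutate(recombine(random.choice(elite), random.choice(elite)))
                   for _ in range(40)]

best = max(pop, key=fitness)
print(f"satisfied {fitness(best)}/{m} clauses "
      f"(a random assignment satisfies ~{7 * m // 8} in expectation)")
```

The 7/8 baseline in the final comment is the expected fraction of E3 clauses satisfied by a uniformly random assignment, which is exactly the worst-case optimality threshold for k = 3 mentioned in the abstract.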


2018 ◽  
Vol 25 (1) ◽  
pp. 123-134 ◽  
Author(s):  
Nodari Vakhania

The computational complexity of an algorithm is traditionally measured for the worst and the average case. The worst-case estimation guarantees a certain worst-case behavior of a given algorithm, although it might be rough, since on "most instances" the algorithm may perform significantly better. The probabilistic average-case analysis claims to derive the average performance of an algorithm, say, for an "average instance" of the problem in question. That instance may be far away from the average of the problem instances arising in a given real-life application, so the average-case analysis would likewise provide an unrealistic estimate. We suggest that, in general, probabilistic models could be used more widely for a more accurate estimation of algorithm efficiency. For instance, the quality of the solutions delivered by an approximation algorithm may also be estimated in the "average" probabilistic case. Such an approach estimates the quality of the solutions delivered by the algorithm on the problem instances most common for a given application. As we illustrate, probabilistic modeling can also be used to derive an accurate time-complexity performance measure, distinct from the traditional probabilistic average-case time complexity measure. Such an approach could, in particular, be useful when the traditional average-case estimation is still rough or not possible at all.
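One concrete reading of this proposal: instead of bounding an approximation algorithm's ratio analytically, estimate it empirically over the instance distribution of the target application. The sketch below does this for first-fit-decreasing bin packing against the standard lower bound ceil(total size / capacity); the instance distribution is an assumption standing in for "the most common problem instances of a given application":

```python
import math
import random

random.seed(2)

def first_fit_decreasing(items, capacity=1.0):
    """Number of bins opened by the first-fit-decreasing heuristic."""
    bins = []
    for size in sorted(items, reverse=True):
        for i, load in enumerate(bins):
            if load + size <= capacity:
                bins[i] += size
                break
        else:
            bins.append(size)
    return len(bins)

def lower_bound(items, capacity=1.0):
    """Trivial volume lower bound on the optimal number of bins."""
    return math.ceil(sum(items) / capacity)

# Application-specific instance distribution (assumed): 100 item sizes
# drawn uniformly from (0, 0.6).
ratios = []
for _ in range(500):
    items = [random.uniform(0.0, 0.6) for _ in range(100)]
    ratios.append(first_fit_decreasing(items) / lower_bound(items))

print(f"empirical mean ratio : {sum(ratios) / len(ratios):.4f}")
print(f"empirical worst ratio: {max(ratios):.4f}  "
      f"(FFD's analytic worst case is 11/9 * OPT + O(1))")
```

The gap between the empirical mean ratio and the analytic 11/9 worst-case bound is the kind of discrepancy the abstract argues application-specific probabilistic estimation can expose.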


2013 ◽  
Vol 61 (10) ◽  
pp. 2486-2497 ◽  
Author(s):  
Fabian Lim ◽  
Vladimir Stojanovic

In a related work, U-statistics were used for non-asymptotic average-case analysis of random compressed sensing matrices. In this companion paper the same analytical tool is adopted differently: here we perform non-asymptotic worst-case analysis. Simple union bounds are a natural choice for worst-case analyses; however, their tightness is an issue (and has been questioned in previous works). Here we focus on a theoretical U-statistical result which potentially allows us to prove that these union bounds are tight. To our knowledge, this kind of (powerful) result is completely new in the context of CS. The general result applies to a wide variety of parameters and is related to (Stein–Chen) Poisson approximation. In this paper, we consider i) restricted isometries and ii) mutual coherence. For the bounded case, we show that k-th order restricted isometry constants have tight union bounds when the number of measurements satisfies m = O(k^{1.5} log(n/k)). Here we require the restricted isometries to grow linearly in k; however, we conjecture that this result can be improved to allow them to be fixed. Also, we show that mutual coherence (with the standard estimate √((4 log n)/m)) has very tight union bounds. For coherence, the normalization complicates general discussion, so we consider only the Gaussian and Bernoulli cases here.
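The coherence claim is easy to probe numerically: the sketch below draws Gaussian matrices with unit-norm columns and compares the empirical mutual coherence against the √((4 log n)/m) estimate quoted in the abstract. The matrix sizes are arbitrary choices for illustration:

```python
import numpy as np

rng = np.random.default_rng(3)

def mutual_coherence(A):
    """Largest absolute inner product between distinct unit-norm columns."""
    A = A / np.linalg.norm(A, axis=0)   # normalize columns
    G = np.abs(A.T @ A)                 # absolute Gram matrix
    np.fill_diagonal(G, 0.0)            # ignore self-correlations
    return G.max()

for m, n in [(128, 512), (256, 1024), (512, 2048)]:
    A = rng.standard_normal((m, n))     # Gaussian case
    mu = mutual_coherence(A)
    est = np.sqrt(4.0 * np.log(n) / m)
    print(f"m={m:4d}, n={n:5d}: coherence={mu:.4f}, "
          f"sqrt(4 log n / m)={est:.4f}")
```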


Algorithmica ◽  
2021 ◽  
Author(s):  
Jie Zhang

Apart from the principles and methodologies inherited from Economics and Game Theory, studies in Algorithmic Mechanism Design typically employ the worst-case analysis and approximation-scheme design of Theoretical Computer Science. For instance, the approximation ratio, the canonical measure of how well an incentive-compatible mechanism approximately optimizes the objective, is defined in the worst-case sense: it compares the performance of the optimal mechanism against the performance of a truthful mechanism over all possible inputs. In this paper, we take the average-case analysis approach and tackle one of the primary motivating problems in Algorithmic Mechanism Design – the scheduling problem (Nisan and Ronen, in: Proceedings of the 31st Annual ACM Symposium on Theory of Computing (STOC), 1999). One version of this problem, which includes a verification component, was studied by Koutsoupias (Theory Comput Syst 54(3):375–387, 2014), who showed that the problem has a tight approximation-ratio bound of (n+1)/2 for the single-task setting, where n is the number of machines. We show, however, that when the machines' costs for executing the task follow any independent and identical distribution, the average-case approximation ratio of the mechanism given by Koutsoupias is upper bounded by a constant. This positive result asymptotically separates the average-case ratio from the worst-case ratio and indicates that the optimal mechanism devised for a worst-case guarantee also works well on average.
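To make the average-case approximation ratio concrete, the harness below estimates E[ALG/OPT] for a randomized single-task allocation under i.i.d. costs. The allocation rule used here (probability inversely proportional to the reported cost) is a hypothetical stand-in, not the mechanism of Koutsoupias (2014); only the estimation pattern is the point:

```python
import random

random.seed(4)

def mechanism_cost(costs):
    """Hypothetical randomized allocation: machine i receives the task
    with probability proportional to 1/cost_i. NOT the paper's mechanism."""
    weights = [1.0 / c for c in costs]
    total = sum(weights)
    r, acc = random.random() * total, 0.0
    for c, w in zip(costs, weights):
        acc += w
        if r <= acc:
            return c
    return costs[-1]   # numerical fallback

def average_case_ratio(n, trials=50_000):
    """Monte Carlo estimate of E[ALG/OPT] with i.i.d. U(0,1] costs."""
    total = 0.0
    for _ in range(trials):
        costs = [1.0 - random.random() for _ in range(n)]  # in (0, 1]
        total += mechanism_cost(costs) / min(costs)        # OPT = min cost
    return total / trials

for n in (2, 4, 8, 16):
    print(f"n={n:2d}: estimated average-case ratio ~ {average_case_ratio(n):.3f}")
```

A worst-case ratio is a maximum over all cost vectors, whereas the quantity estimated here averages the per-input ratio over the assumed distribution, which is what allows it to stay bounded by a constant even when the worst-case bound grows with n.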


2019 ◽  
Vol 2019 ◽  
pp. 1-13 ◽  
Author(s):  
Jia-Zhen Huo ◽  
Yan-Ting Hou ◽  
Feng Chu ◽  
Jun-Kai He

This paper investigates joint decisions on airline network design and capacity allocation by integrating an uncapacitated single-allocation p-hub median location problem into a revenue management problem. For the situation in which uncertain demand can be captured by a finite set of scenarios, we extend this integrated problem from average profit maximization to a combined average-case and worst-case analysis. We formulate the problem in a two-stage stochastic programming framework that maximizes profit, accounting for the cost of installing the hubs, a weighted sum of the average-case and worst-case transportation costs, and ticket revenue over all scenarios. The model supports flexible decisions by weighting the relative importance of the average-case and worst-case profits. To solve the problem, a genetic algorithm is applied. Computational results demonstrate the effectiveness of the proposed formulation.
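The weighted average/worst objective over scenarios can be sketched directly. The evaluation below scores one candidate hub configuration given per-scenario revenues and transportation costs; all numbers and the weight λ are illustrative assumptions, and the paper optimizes such an objective with a genetic algorithm inside the two-stage stochastic program rather than evaluating a fixed decision:

```python
# Illustrative evaluation of one first-stage decision (a set of hubs)
# under a finite scenario set; all data below are made up.
INSTALL_COST = {0: 90.0, 1: 120.0, 2: 75.0}   # hub -> installation cost

# Each scenario: (probability, ticket revenue, transportation cost)
# for the chosen hub configuration.
SCENARIOS = [
    (0.5, 800.0, 300.0),
    (0.3, 650.0, 420.0),
    (0.2, 500.0, 610.0),
]

def objective(hubs, lam=0.7):
    """Profit with a weighted sum of average- and worst-case transport cost.
    lam = 1.0 recovers pure average-profit maximization; lam = 0.0 is the
    fully risk-averse (worst-case-only) variant."""
    install = sum(INSTALL_COST[h] for h in hubs)
    avg_revenue = sum(p * rev for p, rev, _ in SCENARIOS)
    avg_cost = sum(p * cost for p, _, cost in SCENARIOS)
    worst_cost = max(cost for _, _, cost in SCENARIOS)
    transport = lam * avg_cost + (1.0 - lam) * worst_cost
    return avg_revenue - transport - install

for lam in (1.0, 0.7, 0.0):
    print(f"lambda={lam:.1f}: objective={objective({0, 2}, lam):8.2f}")
```

Sliding λ between 1.0 and 0.0 reproduces the trade-off the abstract describes: the decision maker chooses how much weight the worst-case scenario carries relative to the average.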

