FP/FIFO scheduling: coexistence of deterministic and probabilistic QoS guarantees

2007 ◽  
Vol. 9 no. 1 (Distributed Computing and...) ◽
Author(s):  
Pascale Minet ◽  
Steven Martin ◽  
Leila Azouz Saidane ◽  
Skander Azzaz

In this paper, we focus on applications having quantitative QoS (Quality of Service) requirements on their end-to-end response time (or jitter). We propose a solution allowing the coexistence of two types of quantitative QoS guarantees, deterministic and probabilistic, while providing high resource utilization. Our solution combines the advantages of the deterministic approach and the probabilistic one. The deterministic approach is based on a worst-case analysis. The probabilistic approach uses a mathematical model to obtain the probability that the response time exceeds a given value. We assume that flows are scheduled according to non-preemptive FP/FIFO: the packet with the highest fixed priority is scheduled first, and if two packets share the same priority, the packet that arrived first is scheduled first. We make no particular assumption concerning the flow priorities or the nature of the QoS guarantee requested by each flow. An admission control derived from these results is then proposed, allowing each flow to receive a quantitative QoS guarantee adapted to its QoS requirements. An example illustrates the merits of the coexistence of deterministic and probabilistic QoS guarantees.
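
To make the scheduling discipline concrete, below is a minimal single-link simulation of non-preemptive FP/FIFO as described in the abstract. The Packet fields and the convention that a smaller number means a higher fixed priority are illustrative assumptions, not taken from the paper.

```python
import heapq
from dataclasses import dataclass

@dataclass
class Packet:
    priority: int   # smaller value = higher fixed priority (our convention)
    arrival: float  # arrival time; equal-priority packets are served FIFO
    service: float  # transmission time; non-preemptive once started

def fp_fifo(packets):
    """Simulate non-preemptive FP/FIFO on one output link and return
    (packet, completion_time) pairs in order of service."""
    pending = sorted(packets, key=lambda p: p.arrival)
    ready, done = [], []
    t, i = 0.0, 0
    while i < len(pending) or ready:
        if not ready:                       # link idle: jump to next arrival
            t = max(t, pending[i].arrival)
        while i < len(pending) and pending[i].arrival <= t:
            p = pending[i]
            heapq.heappush(ready, (p.priority, p.arrival, i, p))
            i += 1
        _, _, _, p = heapq.heappop(ready)   # highest priority first, then FIFO
        t += p.service                      # runs to completion (no preemption)
        done.append((p, t))
    return done

# A high-priority packet arriving at t=1 must wait for the low-priority
# packet already in transmission: the non-preemptive behavior.
for p, t in fp_fifo([Packet(2, 0.0, 5.0), Packet(1, 1.0, 2.0)]):
    print(f"priority={p.priority} arrival={p.arrival} done at t={t}")
```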

Author(s):  
P. M. Martino ◽  
G. A. Gabriele

Abstract The proper selection of tolerances is an important part of mechanical design that can have a significant impact on the cost and quality of the final product. Yet, despite their importance, current techniques for tolerance design are rather primitive and often based on experience and trial and error. Better tolerance design methods have been proposed but are seldom used because of the difficulty of formulating the necessary design equations for practical problems. In this paper we propose a technique for the automatic formulation of the design equations, or design functions, based on the use of solid models and variational geometry. A prototype system has been developed which can model conventional and statistical tolerances, and a limited set of geometric tolerances. The prototype system is limited to the modeling of single parts, but can perform both a worst-case analysis and a statistical analysis. Results on several simple parts with known characteristics are presented, which demonstrate the accuracy of the system and the types of analysis it can perform. The paper concludes with a discussion of extending the prototype system to a broader range of geometry and to the handling of assemblies.
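
The contrast between the worst-case and statistical analyses mentioned above can be illustrated on a simple one-dimensional stack-up. The dimensions, tolerances, and the 3-sigma convention below are hypothetical, chosen only for this sketch.

```python
import math
import random

# One-dimensional stack: gap = A - (B + C), with symmetric tolerances.
# All dimensions and tolerances here are hypothetical.
dims  = {"A": (50.0, 0.10), "B": (20.0, 0.05), "C": (29.8, 0.05)}
signs = {"A": +1, "B": -1, "C": -1}

nominal = sum(s * dims[k][0] for k, s in signs.items())

# Worst-case analysis: tolerances accumulate linearly.
wc = sum(dims[k][1] for k in signs)

# Statistical (RSS) analysis: independent normal dimensions with the
# tolerance taken as a 3-sigma bound.
rss = math.sqrt(sum(dims[k][1] ** 2 for k in signs))

print(f"gap = {nominal:.3f} +/- {wc:.3f}  (worst case)")
print(f"gap = {nominal:.3f} +/- {rss:.3f}  (statistical, ~3 sigma)")

# Monte Carlo check of the statistical model.
samples = [sum(s * random.gauss(dims[k][0], dims[k][1] / 3)
               for k, s in signs.items())
           for _ in range(100_000)]
print(f"Monte Carlo mean gap = {sum(samples) / len(samples):.3f}")
```

On these numbers the worst case allows the gap to close completely (0.2 +/- 0.2), while the statistical view predicts a much tighter spread, which is exactly the trade-off the two analyses embody.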


2013 ◽  
Vol 10 (4) ◽  
pp. 1-38
Author(s):  
Dieter Schuller ◽  
Ulrich Lampe ◽  
Julian Eckert ◽  
Ralf Steinmetz ◽  
Stefan Schulte

The challenge of optimally selecting services from a set of functionally appropriate ones under Quality of Service (QoS) constraints – the Service Selection Problem – has been extensively addressed in the literature based on deterministic parameters. In practice, however, QoS parameters rather follow a stochastic distribution. In the work at hand, we present an integrated approach which addresses the Service Selection Problem for complex structured as well as unstructured workflows in conjunction with stochastic QoS parameters. Accounting for the penalty costs that accrue due to QoS violations, we perform a worst-case analysis, as opposed to an average-case analysis, with the aim of avoiding additional penalties. Even with these conservative computations, QoS violations due to stochastic QoS behavior may still occur, resulting in potentially severe penalties. Our proposed approach significantly reduces this impact of stochastic QoS behavior on total cost.
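
As a toy illustration of worst-case service selection, the sketch below picks, for a sequential two-task workflow, the cheapest combination of candidate services whose conservative (high-quantile) latencies still respect an end-to-end bound. Service names, costs, latencies, and the bound are all hypothetical, and realistic instances would use mathematical programming rather than enumeration.

```python
import itertools

# Candidates per task: (name, cost, conservative latency), where the
# latency is a high quantile (e.g. the 99th percentile) standing in
# for the worst case. All numbers are hypothetical.
candidates = {
    "task1": [("s11", 4.0, 120.0), ("s12", 7.0, 80.0)],
    "task2": [("s21", 3.0, 150.0), ("s22", 9.0, 90.0)],
}
END_TO_END_BOUND = 250.0  # QoS constraint on the sequential workflow

def select(candidates, bound):
    """Exhaustive worst-case selection: the cheapest combination whose
    summed conservative latencies respect the end-to-end bound."""
    best = None
    for combo in itertools.product(*candidates.values()):
        latency = sum(c[2] for c in combo)
        cost = sum(c[1] for c in combo)
        if latency <= bound and (best is None or cost < best[0]):
            best = (cost, [c[0] for c in combo], latency)
    return best

print(select(candidates, END_TO_END_BOUND))
# -> (10.0, ['s12', 's21'], 230.0): cheapest worst-case-feasible choice
```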


2000 ◽  
Vol 8 (3) ◽  
pp. 291-309 ◽  
Author(s):  
Alberto Bertoni ◽  
Marco Carpentieri ◽  
Paola Campadelli ◽  
Giuliano Grossi

In this paper, a genetic model based on the operations of recombination and mutation is studied and applied to combinatorial optimization problems. The main results are as follows. First, the equations of the deterministic dynamics in the thermodynamic limit (infinite populations) are derived and, for a sufficiently small mutation rate, the attractors are characterized. Second, a general approximation algorithm for combinatorial optimization problems is designed. The algorithm is applied to the Max Ek-Sat problem and the quality of the solutions is analyzed; it is proved optimal for k≥3 with respect to worst-case analysis. For Max E3-Sat, the average-case performance is experimentally compared with that of other optimization techniques.
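
Purely to illustrate the kind of algorithm at play, here is a toy genetic algorithm with uniform crossover (recombination) and bit-flip mutation applied to a random Max-E3-Sat instance. The population size, rates, and instance are hypothetical, and this sketch is not the paper's model; the 7/8 baseline reflects the fact that a uniformly random assignment satisfies each 3-literal clause with probability 7/8.

```python
import random

def fitness(assign, clauses):
    """Number of satisfied clauses; a clause is a tuple of signed
    literals, e.g. (1, -2, 3) means (x1 or not x2 or x3)."""
    return sum(any((lit > 0) == assign[abs(lit) - 1] for lit in cl)
               for cl in clauses)

def ga_max_sat(clauses, n_vars, pop=60, gens=200, p_mut=0.01, seed=0):
    """Toy recombination + mutation GA for Max-E3-Sat."""
    rng = random.Random(seed)
    popn = [[rng.random() < 0.5 for _ in range(n_vars)] for _ in range(pop)]
    for _ in range(gens):
        popn.sort(key=lambda a: fitness(a, clauses), reverse=True)
        elite = popn[: pop // 2]                  # truncation selection
        children = []
        while len(children) < pop - len(elite):
            a, b = rng.sample(elite, 2)
            child = [x if rng.random() < 0.5 else y
                     for x, y in zip(a, b)]       # uniform crossover
            child = [(not g) if rng.random() < p_mut else g
                     for g in child]              # bit-flip mutation
            children.append(child)
        popn = elite + children
    best = max(popn, key=lambda a: fitness(a, clauses))
    return best, fitness(best, clauses)

# A tiny random E3-Sat instance (hypothetical).
rng = random.Random(1)
n_vars, n_clauses = 20, 80
clauses = [tuple(rng.choice((-1, 1)) * v
                 for v in rng.sample(range(1, n_vars + 1), 3))
           for _ in range(n_clauses)]
best, sat = ga_max_sat(clauses, n_vars)
print(f"{sat}/{n_clauses} clauses satisfied")  # random baseline ~7/8
```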


1990 ◽  
Vol 112 (2) ◽  
pp. 113-121 ◽  
Author(s):  
Woo-Jong Lee ◽  
T. C. Woo

Tolerance, representing a permissible variation of a dimension in an engineering drawing, is synthesized by considering assembly stack-up conditions, based on manufacturing cost minimization. A random variable and its standard deviation are associated with a dimension and its tolerance. This probabilistic approach makes it possible to perform trade-offs between performance and tolerance, rather than the worst-case analysis as it is commonly practiced. Tolerance (stack-up) analysis, as an inner loop in the overall algorithm for tolerance synthesis, is performed by approximating the volume under the multivariate probability density function constrained by nonlinear stack-up conditions with a convex polytope. This approximation makes use of the notion of the reliability index [10] from structural safety. Consequently, the probabilistic optimization problem for tolerance synthesis is simplified into a deterministic nonlinear programming problem. An algorithm is then developed and proven to converge to the global optimum through an investigation of the monotonic relations among tolerance, the reliability index, and cost. Examples from the implementation of the algorithm are given.
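
The role of the reliability index in turning the probabilistic problem into a deterministic one can be sketched on a linear stack-up. The cost model, the 3-sigma convention, and the coarse grid search below are illustrative assumptions; the paper's method is a proper nonlinear program, not a grid search.

```python
import math
from itertools import product

# Linear stack-up: the sum of three dimensions must stay below LIMIT.
# Each tolerance t_i is read as a 3-sigma bound, so sigma_i = t_i / 3.
# Cost model C = sum(a_i / t_i). All numbers are hypothetical.
means = [10.0, 15.0, 24.0]
LIMIT = 49.6
A = [1.0, 1.0, 2.0]    # cost coefficients (tighter tolerance = costlier)
BETA_TARGET = 3.0      # required reliability index

def beta(tols):
    """Reliability index: margin to the limit in standard deviations."""
    sigma = math.sqrt(sum((t / 3) ** 2 for t in tols))
    return (LIMIT - sum(means)) / sigma

def cost(tols):
    return sum(a / t for a, t in zip(A, tols))

# Deterministic surrogate of the probabilistic constraint: search for
# the cheapest tolerance vector whose reliability index meets the target.
grid = [0.1, 0.2, 0.3, 0.4]
feasible = [c for c in product(grid, repeat=3) if beta(c) >= BETA_TARGET]
best = min(feasible, key=cost)
print(best, round(cost(best), 2), round(beta(best), 2))
# -> (0.3, 0.3, 0.4) with cost 11.67 and beta ~3.09
```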


2005 ◽  
Vol 52 (1) ◽  
pp. 3-10 ◽  
Author(s):  
K. Lawson

Purpose – This paper compares and contrasts two approaches to the treatment of pipeline corrosion “risk”: the probabilistic approach and the more traditional, deterministic approach. The paper aims to discuss the merits and potential pitfalls of each approach.

Design/methodology/approach – Provides an outline of each approach. The probabilistic approach to the assessment of pipeline corrosion risks deals with many of the uncertainties that are common to the input data, as well as those regarding the predictive models used. Rather than considering each input parameter as an average value, the approach treats the inputs as a series of probability density functions; their collective use during the assessment yields a risk profile that is quantified on the basis of uncertain data. This differs from the traditional deterministic assessment in that the output is a curve showing how the “risk” of failure increases with time. The pipeline operator simply chooses the level of risk that is acceptable and then devises a strategy to deal with those risks. The traditional (deterministic) approach merely segments the output risks as “high”, “medium” or “low”; a management strategy is then devised based on the selection of a time interval that allows a reasonable prospect of detecting deterioration before the pipeline corrosion allowance is exceeded or the pipeline no longer complies with code. Both approaches are applied to the case of a 16.1 km long, 14 in. main export line in the North Sea.

Findings – The deterministic assessment yielded a worst-case failure probability of “medium” with a corresponding consequence of “high”; classifications that are clearly subjective. The probabilistic assessment quantified pipeline failure probabilities, although notably more effort was required to perform it. Using target probabilities for “high” and “normal” consequence pipeline segments, the indications were that the target (predicted) failure probabilities would be reached within 8.5 to 13 years, depending on how effective corrosion mitigation activities are in practice. Basing pipeline inspections on the outputs of the deterministic assessment would therefore be conservative in this instance, though this need not always be so. That the probabilistic assessment indicates inspections may justifiably be extended beyond those suggested by the deterministic assessment is a clear benefit, since it affords the opportunity to defer expenditure on pipeline inspections to a later date; the converse, however, may also be required. It may therefore be argued that probabilistic assessment provides a superior basis for driving pipeline corrosion management activities, given that the approach deals with the uncertainties in the basic input data.

Originality/value – A probabilistic assessment approach that effectively mirrors pipeline operations provides a superior basis upon which to manage risk, and would therefore likely maximize both safety and business performance.
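
The shape of the probabilistic output described above, a failure-probability curve rising with time from which the operator reads off an acceptable inspection date, can be sketched with a small Monte Carlo model. The distributions for corrosion allowance and corrosion rate below are hypothetical, not the North Sea case data.

```python
import random

def failure_probability(years, n=50_000, seed=0):
    """P(cumulative wall loss exceeds the corrosion allowance after
    `years` in service), by Monte Carlo. Distributions are hypothetical."""
    rng = random.Random(seed)
    fails = 0
    for _ in range(n):
        allowance = rng.gauss(3.0, 0.2)          # mm, as-built allowance
        rate = max(0.0, rng.gauss(0.15, 0.05))   # mm/year, corrosion rate
        if rate * years > allowance:
            fails += 1
    return fails / n

# The operator picks an acceptable probability and reads off the date.
TARGET = 1e-2
for years in range(5, 26, 5):
    p = failure_probability(years)
    flag = "  <- target exceeded" if p > TARGET else ""
    print(f"{years:2d} years: P(failure) ~ {p:.4f}{flag}")
```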


Author(s):  
IMED KACEM

In this paper, we deal with the flexible job shop scheduling problem. We propose an efficient heuristic method for solving the assignment problem and present a worst-case analysis to evaluate the performance of this heuristic. The second aspect of the problem studied is the sequencing of operations. Our approach applies an evolutionary algorithm, based on a set of adapted operators, to solve the sequencing step. Lower bounds for the problem (previously proposed in Ref. 1) are used to evaluate the quality of our method and of the solutions according to the different criteria.
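
A minimal sketch of what an assignment heuristic for the flexible job shop can look like: each operation is routed to the machine whose accumulated workload plus the operation's processing time is smallest. This is a generic load-balancing heuristic with hypothetical data, not necessarily the paper's method; classic list-scheduling bounds (such as Graham's 2 − 1/m ratio for load balancing) give a flavor of the worst-case analysis such heuristics admit.

```python
# times[operation] = {machine: processing_time}; data is hypothetical.
times = {
    "O11": {"M1": 3, "M2": 5},
    "O12": {"M1": 4, "M2": 2, "M3": 6},
    "O21": {"M2": 3, "M3": 3},
    "O22": {"M1": 2, "M3": 4},
}

def greedy_assign(times):
    """Assign each operation to the machine minimizing (current load +
    processing time); returns the assignment and final machine loads."""
    load, assignment = {}, {}
    for op, options in times.items():
        m = min(options, key=lambda mach: load.get(mach, 0) + options[mach])
        assignment[op] = m
        load[m] = load.get(m, 0) + options[m]
    return assignment, load

assignment, load = greedy_assign(times)
print(assignment)  # {'O11': 'M1', 'O12': 'M2', 'O21': 'M3', 'O22': 'M1'}
print(load)        # {'M1': 5, 'M2': 2, 'M3': 3}
```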


2019 ◽  
Vol 25 (4) ◽  
pp. 331-344
Author(s):  
Milad Ebrahimi ◽  
Hamidreza Kazemi ◽  
Majid Ehteshami ◽  
Thomas D. Rockaway

ABSTRACT This study explores the use of probabilistic and deterministic approaches for evaluating the quality of groundwater resources. The proposed methodology first used the probabilistic approach, which included multivariate statistical analysis, to classify the groundwater's physicochemical characteristics. Then, building on the obtained results, the deterministic approach, which included hydrochemistry analyses, was applied for a comprehensive assessment of groundwater quality for different applications. To demonstrate this multidisciplinary approach, a basin located in an arid region was studied. Based on the results of correlation and principal component analyses, along with hierarchical Q-mode cluster analysis, the dissolution of chloride salts was identified within the aquifer. Further application of the deterministic approach revealed degradation of groundwater quality throughout the basin, possibly due to saltwater intrusion. By developing a water quality index and a multi-hazard risk assessment methodology, the suitability of the groundwater for human consumption and irrigation was assessed. The results were compared with two other studies conducted on aquifers under similar arid climate conditions. This comparison indicated that the quality of groundwater resources within arid regions is prone to degradation through salinization. The combined consideration of probabilistic and deterministic approaches provided an effective means for a comprehensive evaluation of groundwater quality, whether across different aquifers or within a single one.
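
For the water quality index step, the sketch below uses the common weighted-arithmetic WQI formulation, WQI = Σ wᵢqᵢ / Σ wᵢ with sub-indices qᵢ = 100·Cᵢ/Sᵢ. The parameters, standards, weights, and measured values are hypothetical, not the study's data.

```python
# Weighted-arithmetic water quality index: WQI = sum(w*q) / sum(w),
# with sub-index q = 100 * measured / standard. All values hypothetical.
params = {
    # name: (measured mg/L, standard mg/L, weight)
    "TDS":      (950.0, 500.0, 4),
    "Chloride": (420.0, 250.0, 3),
    "Sulfate":  (210.0, 250.0, 4),
    "Nitrate":  (18.0,  50.0,  5),
}

def wqi(params):
    num = sum(w * 100.0 * c / s for c, s, w in params.values())
    den = sum(w for _, _, w in params.values())
    return num / den

print(f"WQI = {wqi(params):.1f}")  # values above 100 flag unsuitability
```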


2005 ◽  
Vol 127 (4) ◽  
pp. 404-413 ◽  
Author(s):  
Roland S. Muwanga ◽  
Sri Sreekanth ◽  
Daniel Grigore ◽  
Ricardo Trindade ◽  
Terry Lucas

A probabilistic approach to the thermal design and analysis of cooled turbine blades is presented. The various factors that affect the probabilistic performance of the blade thermal design are grouped into categories, and a select number of factors known to be significant, and whose variability could be assessed, are modeled as random variables. The variability data for these random variables were generated from separate Monte Carlo simulations (MCS) of the combustor, the upstream stator, and the secondary air system. The oxidation life of the blade is used as a measure to evaluate the thermal design as well as the validity of the methods. Two approaches were explored to simulate blade row life variability and compare it with field data from several engine removals. Additionally, a response surface approximation technique was explored to expedite the simulation process. The results indicate that the conventional worst-case analysis is overly conservative, while an analysis based on nominal values can be very optimistic. The potential of a probabilistic approach for predicting the actual variability of blade row life is clearly evident in the results. However, the results also show that, in order to predict blade row life variability adequately, it is important to model the operating condition variability. Probabilistic techniques such as MCS become practical when approximation techniques such as response surface modeling are used to represent the analytical model.
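
The pairing of Monte Carlo simulation with a response surface can be sketched as follows: fit an inexpensive quadratic surrogate to a handful of runs of an expensive life model, then push a large Monte Carlo sample through the surrogate. The stand-in life model, temperature ranges, and distributions below are all hypothetical.

```python
import numpy as np

def expensive_life_model(t_gas, t_cool):
    """Stand-in for an expensive blade thermal/oxidation-life analysis;
    a real model would involve conjugate heat transfer and a life code."""
    metal_t = 0.7 * t_gas + 0.3 * t_cool             # crude metal temperature
    return 1e5 * np.exp(-(metal_t - 1100.0) / 40.0)  # life in hours

rng = np.random.default_rng(0)

# 1) Fit a quadratic response surface to log-life from 30 model runs.
X = rng.uniform([1350.0, 600.0], [1550.0, 750.0], size=(30, 2))
y = np.log([expensive_life_model(tg, tc) for tg, tc in X])

def basis(tg, tc):
    return np.stack([np.ones_like(tg), tg, tc, tg**2, tc**2, tg * tc])

coef, *_ = np.linalg.lstsq(basis(X[:, 0], X[:, 1]).T, y, rcond=None)

# 2) Cheap Monte Carlo through the surrogate (100k samples).
tg = rng.normal(1450.0, 25.0, 100_000)  # gas temperature variability
tc = rng.normal(675.0, 15.0, 100_000)   # coolant temperature variability
life = np.exp(coef @ basis(tg, tc))

print(f"median life        ~ {np.median(life):,.0f} h")
print(f"1st-percentile life ~ {np.percentile(life, 1):,.0f} h")
```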

