Control charts for traffic intensity monitoring of Markovian multiserver queues

2019 ◽ Vol 36 (1) ◽ pp. 354-364 ◽ Author(s): Frederico R. B. Cruz, Roberto C. Quinino, Linda L. Ho

2018 ◽ Vol 2018 ◽ pp. 1-7 ◽ Author(s): Emilio Suyama, Roberto C. Quinino, Frederico R. B. Cruz

Estimators for the parameters of Markovian multiserver queues are presented, based on samples of the number of clients in the system at arbitrary points in time and of their sojourn times. As estimation in queues is a recognizably difficult inferential problem, this study focuses on estimators for the arrival rate, the service rate, and the ratio of these two rates, known as the traffic intensity. Simulations are performed to verify the quality of the estimates for sample sizes up to 400. The research also reports notable new insights, for example, that the maximum likelihood estimator of the traffic intensity is equivalent to its moment estimator. Some limitations of the results are presented, along with a detailed numerical example and topics for future development in this research area.
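The specific estimators are not reproduced in this listing, but the minimal sketch below illustrates how moment-type estimates of the arrival rate, service rate and traffic intensity might be formed from exactly such samples, assuming an M/M/s model, the identity E[number of busy servers] = sρ, and Little's law. The function name and the toy data are purely illustrative and are not taken from the paper.

```python
import numpy as np

def moment_estimates(n_in_system, sojourn_times, s):
    """Illustrative moment-type estimates for an M/M/s queue.

    n_in_system   : numbers of clients seen at arbitrary sampling instants
    sojourn_times : observed sojourn (waiting + service) times
    s             : number of servers
    """
    n = np.asarray(n_in_system, dtype=float)
    w = np.asarray(sojourn_times, dtype=float)

    # E[busy servers] = s * rho  =>  rho_hat = mean(min(N, s)) / s
    rho_hat = np.mean(np.minimum(n, s)) / s

    # Little's law, L = lambda * W  =>  lambda_hat = mean(N) / mean(W)
    lam_hat = n.mean() / w.mean()

    # rho = lambda / (s * mu)  =>  mu_hat = lambda_hat / (s * rho_hat)
    mu_hat = lam_hat / (s * rho_hat)
    return lam_hat, mu_hat, rho_hat

# Example with hypothetical data
lam, mu, rho = moment_estimates([2, 0, 3, 1, 4, 2],
                                [0.8, 1.1, 0.9, 1.3, 0.7, 1.0], s=2)
print(f"lambda ~ {lam:.3f}, mu ~ {mu:.3f}, rho ~ {rho:.3f}")
```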


2016 ◽ Vol 35 (4) ◽ pp. 536-559 ◽ Author(s): Manuel Cabral Morais, António Pacheco

2019 ◽ Vol 34 (1) ◽ pp. 9-18 ◽ Author(s): Marta Santos, Manuel Cabral Morais, António Pacheco

The traffic intensity (ρ) is a vital parameter of queueing systems because it measures the average occupancy of a server. Consequently, it influences their operational performance, namely queue lengths and waiting times. Moreover, since many computer, production and transportation systems are frequently modelled as queueing systems, it is crucial to use control charts to detect changes in ρ. In this paper, we pay particular attention to control charts meant to detect increases in the traffic intensity, namely: a short-memory chart based on the waiting time of the n-th arriving customer; two long-memory charts with more sophisticated control statistics; and the two cumulative sum (CUSUM) charts proposed by Chen and Zhou (2015). We compare the performance of these charts in terms of several run-length-related metrics and under different out-of-control scenarios. Extensive results are provided to give the quality control practitioner a concrete idea of how these charts perform.
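Chen and Zhou's CUSUM statistics are not reproduced here; the sketch below shows only a generic upper one-sided CUSUM applied to successive customer waiting times, to illustrate the kind of long-memory monitoring the abstract refers to. The reference value k, decision interval h, and the simulated exponential waiting times are assumptions for illustration.

```python
import numpy as np

def cusum_signal(waits, target_mean, k, h):
    """Generic upper one-sided CUSUM on successive customer waiting times.

    waits       : observed waiting times, one per arriving customer
    target_mean : in-control mean waiting time
    k           : reference value (allowance)
    h           : decision interval; a signal is raised when C_n > h
    Returns the index of the first signal, or None if no signal occurs.
    """
    c = 0.0
    for i, w in enumerate(waits):
        c = max(0.0, c + (w - target_mean - k))  # accumulate upward deviations
        if c > h:
            return i
    return None

# Hypothetical in-control waiting times followed by an upward shift
rng = np.random.default_rng(1)
in_control = rng.exponential(1.0, size=200)
shifted = rng.exponential(2.0, size=200)
print(cusum_signal(np.concatenate([in_control, shifted]),
                   target_mean=1.0, k=0.5, h=5.0))
```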


2018 ◽ Vol 33 (1) ◽ pp. 1-21 ◽ Author(s): Marta Santos, Manuel Cabral Morais, António Pacheco

This paper describes the application of simple quality control charts to monitor the traffic intensity of single server queues, a still uncommon use of what is arguably the most successful statistical process control tool. These charts play a vital role in the detection of increases in the traffic intensity of single server queueing systems such as the M/G/1, GI/M/1 and GI/G/1 queues. The corresponding control statistics refer solely to a single customer-arrival/departure epoch, as opposed to several such epochs; hence they are termed short-memory charts. We compare the run length (RL) performance of these charts under three out-of-control scenarios, corresponding to increases in the traffic intensity due to: a decrease in the service rate while the arrival rate remains unchanged; an increase in the arrival rate while the service rate is constant; and an increase in the arrival rate accompanied by a proportional decrease in the service rate. These comparisons cover a broad set of interarrival and service time distributions, namely exponential, Erlang, hyper-exponential, and hypo-exponential. Extensive results and striking illustrations are provided to give the quality control practitioner an idea of how these charts perform in practice.
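As an illustration of a short-memory chart whose statistic involves a single arrival/departure epoch, the sketch below monitors each customer's waiting time, obtained from Lindley's recursion for a GI/G/1 queue, against a fixed upper control limit. The limit and the simulated M/M/1 traffic are illustrative assumptions, not values from the paper.

```python
import numpy as np

def shewhart_waiting_time_chart(interarrivals, services, ucl):
    """Short-memory (Shewhart-type) chart: each customer's waiting time,
    obtained from Lindley's recursion for a GI/G/1 queue, is compared with
    an upper control limit ucl.

    Returns the index of the first customer whose waiting time exceeds ucl,
    or None if the chart never signals.
    """
    w = 0.0
    for n, (a, s) in enumerate(zip(interarrivals, services)):
        # Lindley recursion: W_{n+1} = max(0, W_n + S_n - A_{n+1})
        w = max(0.0, w + s - a)
        if w > ucl:
            return n
    return None

# Hypothetical M/M/1 example: the arrival rate rises after customer 500
rng = np.random.default_rng(7)
arr = np.concatenate([rng.exponential(1 / 0.7, 500),
                      rng.exponential(1 / 0.95, 500)])
srv = rng.exponential(1.0, 1000)
print(shewhart_waiting_time_chart(arr, srv, ucl=8.0))
```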


Pflege ◽ 2013 ◽ Vol 26 (2) ◽ pp. 119-127 ◽ Author(s): Jan Kottner, Armin Hauss

Comparative quality measurements and assessments play an increasingly important role in nursing. Quality indicators are affected by systematic and random errors. The theory of statistical process control (SPC) offers a way to deal adequately with random variation when comparing indicators. This article introduces control charts as SPC tools. They are graphical displays of quality indicators over time. Attribute characteristics can be displayed using p-, u- and c-charts. A number of rules exist for identifying special cause variation within the process under consideration. If the chart shows no evidence of non-random variation, the process is assumed to be in a state of "statistical control" (common cause variation). A data point deviating by more than three standard deviations from the mean of all available data points is regarded as the strongest signal of non-random variation. In the context of quality management, control charts are superior to traditional comparisons of means and dispersion for the dynamic measurement and assessment of processes and outcomes.
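As an illustration of the three-sigma rule for attribute data mentioned above, the following sketch computes standard p-chart limits for the fraction of nonconforming units; the monthly counts are hypothetical.

```python
import numpy as np

def p_chart_limits(defectives, sample_sizes):
    """Three-sigma limits for a p-chart (fraction of nonconforming units).

    defectives   : count of nonconforming units in each sample
    sample_sizes : size of each sample
    Returns per-sample (lcl, center, ucl); limits vary when sample sizes do.
    """
    d = np.asarray(defectives, dtype=float)
    n = np.asarray(sample_sizes, dtype=float)
    p_bar = d.sum() / n.sum()                    # overall fraction nonconforming
    sigma = np.sqrt(p_bar * (1 - p_bar) / n)     # per-sample standard error
    lcl = np.clip(p_bar - 3 * sigma, 0.0, None)  # limits cannot fall below 0
    ucl = np.clip(p_bar + 3 * sigma, None, 1.0)  # or exceed 1
    return lcl, p_bar, ucl

# Hypothetical example: adverse-event counts in monthly samples of patients
lcl, center, ucl = p_chart_limits([4, 6, 3, 8, 5], [120, 130, 110, 125, 118])
print(center, ucl)
```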


2010 ◽ Author(s): Thomas H. Stone, I. M. Jawahar, Ken Eastman, Gabi Eissa

2018 ◽ Vol 9 (12) ◽ pp. 1890-1897 ◽ Author(s): K. Rosaiah, B. Srinivasa Rao, J. Pratapa Reddy, C. Chinnamamba

Author(s): Mario Lesina, Lovorka Gotal Dmitrovic

The paper examines the relations among the numbers of small, medium and large companies in the leather and footwear industry in Croatia, as well as among the numbers of their employees, by means of the Spearman and Pearson correlation coefficients. The data were collected over 21 years. The warning zone and the risk zone were determined by means of Statistical Process Control (SPC) for a certain number of small, medium and large companies in the leather and footwear industry in Croatia. Growth models based on externalities, models based on research and development, and the AK models were applied to analyse the research results obtained. Using the correlation coefficients, the paper shows that the relation between the number of large companies and their number of employees is the strongest, i.e. large companies have the best-structured workplaces. The relation between the number of medium companies and the number of their employees is somewhat weaker, while there is no relation for small companies. This is best described by growth models based on externalities, in which growth generates an increase in human capital, i.e. a growth in the level of knowledge and skills in the entire economy and, by extension, in companies at the microeconomic level. These models also recognize a limit of accumulated knowledge after which growth may be expected. The absence of growth in small companies results from an insufficient level of human capital and a failure to reach the limit level that could generate growth. According to the Statistical Process Control (SPC) control charts, as well as the regression models, it is clear that the most cost-effective investment is investment in medium companies. The paper demonstrates the disadvantages of small, medium and large companies in the leather and footwear industry in Croatia. Small companies often emerge too quickly and disappear too easily owing to the employment of administrative staff instead of professional production staff. As the models emphasize, companies need to invest in their employees and employ good production staff. Investment in and support of medium companies not only strengthens companies that have a well-arranged technological process and a good systematization of workplaces, but also helps large companies, as there is a strong correlation between the numbers of medium and large companies.
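Neither the data nor the exact SPC zone definitions are given in the abstract; the sketch below merely illustrates how the two correlation coefficients and simple two-sigma warning / three-sigma risk zones might be computed for such yearly series, using hypothetical counts.

```python
import numpy as np
from scipy import stats

def correlations(x, y):
    """Pearson and Spearman correlation coefficients for two yearly series."""
    pearson_r, _ = stats.pearsonr(x, y)
    spearman_rho, _ = stats.spearmanr(x, y)
    return pearson_r, spearman_rho

def spc_zones(series):
    """Simplified individuals-chart zones: +/-2 sigma as the warning zone,
    +/-3 sigma as the risk (action) zone, with sigma taken as the sample
    standard deviation of the series."""
    m, s = np.mean(series), np.std(series, ddof=1)
    return {"warning": (m - 2 * s, m + 2 * s),
            "risk":    (m - 3 * s, m + 3 * s)}

# Hypothetical yearly counts of medium companies and their employees
companies = [14, 15, 13, 16, 17, 15, 18, 19, 18, 20]
employees = [900, 940, 880, 990, 1030, 960, 1100, 1150, 1120, 1210]
print(correlations(companies, employees))
print(spc_zones(companies))
```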

