interarrival time
Recently Published Documents


TOTAL DOCUMENTS: 138 (five years: 8)
H-INDEX: 18 (five years: 1)

Author(s):  
Sourav Das ◽  
Nitin Awathare ◽  
Ling Ren ◽  
Vinay J. Ribeiro ◽  
Umesh Bellur

Proof-of-Work (PoW) based blockchains typically allocate only a tiny fraction (e.g., less than 1% for Ethereum) of the average interarrival time (I) between blocks for validating the smart contracts present in transactions. In such systems, block validation and PoW mining are typically performed sequentially, the former by CPUs and the latter by ASICs. Even a modest increase in validation time (τ) introduces the well-known Verifier's Dilemma and, as we demonstrate, causes more forking and hurts fairness. Large τ also reduces the tolerance for safety against a Byzantine adversary. Solutions that offload validation to a set of non-chain nodes (a.k.a. off-chain approaches) suffer from trust and performance issues that are non-trivial to resolve. In this paper, we present Tuxedo, the first on-chain protocol that theoretically scales to τ/I ≈ 1 in PoW blockchains. The key innovation in Tuxedo is to perform CPU-based block processing in parallel with ASIC mining. We achieve this by allowing miners to delay validation of the transactions in a block by up to ζ blocks, where ζ is a system parameter. We perform a security analysis of Tuxedo considering all possible adversarial strategies in a synchronous network with maximum end-to-end delay Δ, and demonstrate that Tuxedo achieves security equivalent to known results for longest-chain PoW Nakamoto consensus. Our prototype implementation of Tuxedo atop Ethereum demonstrates that it can scale τ without suffering the harmful effects of naively scaling up τ/I in existing blockchains.
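The delayed-validation idea above can be sketched as a toy scheduling pipeline; this is an illustration of the scheduling principle only, not Tuxedo's actual protocol, and the function name and parameters are hypothetical:

```python
from collections import deque

def mine_with_delayed_validation(num_blocks, zeta):
    """Toy pipeline: the block at height h is appended to the chain as
    soon as its PoW is found, but its transactions are only validated
    once the block is zeta deep, so CPU validation overlaps with the
    ASIC mining of later blocks instead of preceding it."""
    pending = deque()   # blocks mined but not yet validated
    validated = []
    for height in range(num_blocks):
        pending.append(height)        # PoW finishes; chain extends
        if len(pending) > zeta:       # oldest block is now zeta deep
            validated.append(pending.popleft())
    return validated, list(pending)

validated, pending = mine_with_delayed_validation(10, zeta=3)
```

With ζ = 3, the last three mined blocks are still awaiting validation when mining stops, mirroring the protocol's bounded validation lag.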


2021 ◽  
Vol 23 (4) ◽  
pp. 627-635
Author(s):  
Hao Lyu ◽  
Shuai Wang ◽  
Xiaowen Zhang ◽  
Zaiyou Yang ◽  
Michael Pecht

In this paper, a system reliability model subject to Dependent Competing Failure Processes (DCFP) with phase-type (PH) distributions and a changing degradation rate is proposed. Soft failure occurs when the sum of continuous degradation and sudden shock-induced degradation exceeds the soft failure threshold. The interarrival time between two successive shocks and the total number of shocks before hard failure occurs follow a continuous PH distribution and a discrete PH distribution, respectively. The hard failure reliability is calculated using the PH distribution survival function. Because shocks act on the soft failure process, the degradation rate increases: when the number of shocks reaches a specific value, the degradation rate changes. Hard failure is calculated by the extreme shock model, the cumulative shock model, and the run shock model, respectively. The closed-form reliability function is derived by combining the hard and soft failure reliability models. Finally, a Micro-Electro-Mechanical System (MEMS) example demonstrates the effectiveness of the proposed model.
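A minimal Monte Carlo sketch of this kind of DCFP model is below. It substitutes exponential shock interarrivals (the simplest phase-type distribution) for the paper's general PH distributions, uses the extreme shock model for hard failure, and all parameter values are illustrative, not taken from the paper:

```python
import random

def simulate_dcfp(t_end, rate=1.0, shock_lambda=0.5, shock_mu=1.0,
                  soft_threshold=10.0, hard_threshold=3.0,
                  rate_boost=0.5, boost_after=3, n_runs=20000, seed=1):
    """Monte Carlo estimate of survival probability at time t_end.
    Soft failure: linear degradation plus accumulated shock damage
    exceeds soft_threshold. Hard failure (extreme shock model): any
    single shock magnitude exceeds hard_threshold. After boost_after
    shocks the degradation rate increases by rate_boost."""
    rng = random.Random(seed)
    survive = 0
    for _ in range(n_runs):
        t, shocks, damage, rate_now = 0.0, 0, 0.0, rate
        degradation, failed = 0.0, False
        while True:
            gap = rng.expovariate(shock_lambda)   # shock interarrival
            if t + gap > t_end:
                degradation += rate_now * (t_end - t)
                break
            degradation += rate_now * gap
            t += gap
            shocks += 1
            w = rng.expovariate(1.0 / shock_mu)   # shock magnitude
            if w > hard_threshold:                # extreme shock model
                failed = True
                break
            damage += w
            if shocks == boost_after:             # degradation rate change
                rate_now += rate_boost
        if not failed and degradation + damage < soft_threshold:
            survive += 1
    return survive / n_runs

rel = simulate_dcfp(t_end=5.0)
```

The closed-form PH survival function of the paper replaces this sampling loop; the simulation is only a cross-check one could run against it.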


2021 ◽  
Vol 2021 ◽  
pp. 1-9
Author(s):  
Hakan Özaktaş ◽  
Nureddin Kırkavak ◽  
Ayşe Nilay Alpay

Average waiting time is one of the basic performance indicators for a bottleneck zone on a commuter-traffic route. It turns out that, for a single-bottleneck problem, the average waiting time in the queue remains paradoxically unchanged regardless of how fast the queue dissolves. In this study, the paradox is verified theoretically for the deterministic case with constant arrival and departure rates. Consistent results have also been obtained by simulation runs in which the vehicle interarrival time is a random variable. Results are tabulated for interarrival times with uniform, triangular, normal, and exponential distributions, along with a statistical verification of the average waiting time paradox.
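The simulation side of such a study can be sketched with the standard Lindley recursion for a FIFO single-server bottleneck. This sketch computes average waits for random interarrival times with a constant service (departure) rate; the distributions and parameters are illustrative and do not reproduce the paper's tabulated runs:

```python
import random

def avg_wait(interarrivals, service_time):
    """Average waiting time in a FIFO single-server queue via the
    Lindley recursion W_{n+1} = max(0, W_n + S - A_{n+1}), with a
    deterministic service time (constant departure rate)."""
    w, total = 0.0, 0.0
    for a in interarrivals:
        w = max(0.0, w + service_time - a)  # wait of the next vehicle
        total += w
    return total / len(interarrivals)

rng = random.Random(7)
n, service = 100000, 1.0
# interarrival mean 1.25 (utilization 0.8) under two distributions
uni = [rng.uniform(0.5, 2.0) for _ in range(n)]
exp = [rng.expovariate(1 / 1.25) for _ in range(n)]
w_uni, w_exp = avg_wait(uni, service), avg_wait(exp, service)
```

Running the recursion under uniform, triangular, normal, and exponential interarrivals with matched means is exactly the comparison the abstract describes.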


Complexity ◽  
2021 ◽  
Vol 2021 ◽  
pp. 1-24
Author(s):  
Wei Liang Quek ◽  
Ning Ning Chung ◽  
Vee-Liem Saw ◽  
Lock Yue Chew

In this paper, we propose an empirically based Monte Carlo bus-network (EMB) model as a test bed to simulate intervention strategies to overcome the inefficiencies of bus bunching. The EMB model is an agent-based model which utilizes the positional and temporal data of the buses obtained from the Global Positioning System (GPS) to constitute (1) a set of empirical velocity distributions of the buses and (2) a set of exponential distributions of interarrival time of passengers at the bus stops. Monte Carlo sampling is then performed on these two derived probability distributions to yield the stochastic dynamics of both the buses’ motion and passengers’ arrival. Our EMB model is generic and can be applied to any real-world bus network system. In particular, we have validated the model against the Nanyang Technological University’s Shuttle Bus System by demonstrating its accuracy in capturing the bunching dynamics of the shuttle buses. Furthermore, we have analyzed the efficacy of three intervention strategies: holding, no-boarding, and centralized-pulsing, against bus bunching by incorporating the rule set of these strategies into the model. Under the scenario where the buses have the same velocity, we found that all three strategies improve both the waiting and travelling times of the commuters. However, when the buses have different velocities, only the centralized-pulsing scheme consistently outperforms the control scenario where the buses periodically bunch together.
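The passenger side of the EMB model's Monte Carlo sampling can be sketched as drawing interarrival times from an exponential distribution, as the abstract describes; the arrival rate below is illustrative, not a value fitted from the GPS data:

```python
import random

def passenger_arrivals(rate_per_min, horizon_min, rng):
    """Sample passenger arrival times at one bus stop over a horizon by
    accumulating exponentially distributed interarrival times, i.e. a
    homogeneous Poisson arrival process."""
    t, times = 0.0, []
    while True:
        t += rng.expovariate(rate_per_min)
        if t > horizon_min:
            return times
        times.append(t)

rng = random.Random(42)
arrivals = passenger_arrivals(rate_per_min=2.0, horizon_min=60.0, rng=rng)
```

In the full model, one such sampler per bus stop runs alongside velocity draws from the empirical bus-speed distributions to drive the agent-based dynamics.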


2021 ◽  
Vol 36 ◽  
pp. 04001
Author(s):  
Siew Khew Koh ◽  
Ching Herny Chin ◽  
Yi Fei Tan ◽  
Tan Ching Ng

A single-server queueing system with negative customers is considered in this paper. A negative customer removes one positive customer from the head of the queue, if any is present. The interarrival time distribution of the positive customers is assumed to have a rate that tends to a constant as time t tends to infinity. An alternative approach is proposed to derive a set of equations for the stationary probabilities, which are then used to find the stationary queue length distribution. Numerical examples are presented and compared with the results found using the analytical method and a simulation procedure. The advantages of the proposed alternative approach are discussed in this paper.
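A simulation procedure of the kind the abstract compares against can be sketched as an event-driven loop. For simplicity this sketch assumes exponential interarrival and service times throughout (the paper allows a more general positive-customer interarrival distribution), and it restarts service after a head-of-queue removal, which is distributionally harmless only because the service times here are memoryless:

```python
import random

def simulate_g_queue(t_end, lam_pos, lam_neg, mu, seed=3):
    """Single-server queue with negative customers: each negative
    arrival removes the positive customer at the head of the queue, if
    any. Returns the time-average queue length."""
    rng = random.Random(seed)
    t, q, area = 0.0, 0, 0.0
    nxt = {"pos": rng.expovariate(lam_pos),
           "neg": rng.expovariate(lam_neg),
           "dep": float("inf")}
    while t < t_end:
        ev = min(nxt, key=nxt.get)
        area += q * (min(nxt[ev], t_end) - t)
        t = min(nxt[ev], t_end)
        if t >= t_end:
            break
        if ev == "pos":                       # positive arrival
            q += 1
            nxt["pos"] = t + rng.expovariate(lam_pos)
            if q == 1:
                nxt["dep"] = t + rng.expovariate(mu)
        elif ev == "neg":                     # negative arrival
            nxt["neg"] = t + rng.expovariate(lam_neg)
            if q > 0:
                q -= 1                        # head-of-queue removal
                nxt["dep"] = (t + rng.expovariate(mu)) if q else float("inf")
        else:                                 # service completion
            q -= 1
            nxt["dep"] = (t + rng.expovariate(mu)) if q else float("inf")
    return area / t_end

avg_q = simulate_g_queue(t_end=10000.0, lam_pos=1.0, lam_neg=0.3, mu=1.2)
```

The paper's stationary probabilities would be validated against long-run averages like `avg_q` from such a run.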


Author(s):  
Yan Chen ◽  
Ward Whitt

In order to understand queueing performance given only partial information about the model, we propose determining intervals of likely values of performance measures given that limited information. We illustrate this approach for the mean steady-state waiting time in the $GI/GI/K$ queue. We start by specifying the first two moments of the interarrival-time and service-time distributions, and then consider additional information about these underlying distributions, in particular, a third moment and a Laplace transform value. As a theoretical basis, we apply extremal models yielding tight upper and lower bounds on the asymptotic decay rate of the steady-state waiting-time tail probability. We illustrate by constructing the theoretically justified intervals of values for the decay rate and the associated heuristically determined interval of values for the mean waiting times. Without extra information, the extremal models involve two-point distributions, which yield a wide range for the mean. Adding constraints on the third moment and a transform value produces three-point extremal distributions, which significantly reduce the range, producing practical levels of accuracy.
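The two-point distributions underlying the extremal models can be constructed explicitly: for any mass split p there is a unique two-point distribution matching a given mean and variance. This sketch shows the generic family only; the specific extremal choice of p used in the paper is not reproduced here:

```python
import math

def two_point(mean, var, p):
    """Two-point distribution with the given first two moments: mass p
    at the upper support point and 1 - p at the lower one."""
    sd = math.sqrt(var)
    hi = mean + sd * math.sqrt((1 - p) / p)
    lo = mean - sd * math.sqrt(p / (1 - p))
    return [(hi, p), (lo, 1 - p)]

pts = two_point(mean=1.0, var=4.0, p=0.2)
m1 = sum(x * w for x, w in pts)            # first moment
m2 = sum(x * x * w for x, w in pts)        # second moment = var + mean^2
```

Adding a third-moment or transform-value constraint, as the paper does, moves the extremal family from two support points to three, which is what tightens the interval for the mean waiting time.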


Water ◽  
2019 ◽  
Vol 11 (12) ◽  
pp. 2489 ◽  
Author(s):  
Luis Angel Espinosa ◽  
Maria Manuela Portela ◽  
João Dehon Pontes Filho ◽  
Ticiana Marinho de Carvalho Studart ◽  
João Filipe Santos ◽  
...  

The paper presents a study of droughts on Madeira, a small Portuguese Atlantic island. The study aimed at addressing the problem of dependent drought events and at developing a copula-based bivariate cumulative distribution function coupling drought duration and magnitude. Droughts were identified from the Standardized Precipitation Index (SPI) computed at three- and six-month timescales at 41 rain gauges distributed over the island, using rainfall data from January 1937 to December 2016. A moving average (MA) filter was used to remove spurious, short, dependent droughts. Run theory was applied to the smoothed SPI series to extract the drought duration, magnitude, and interarrival time for each drought category. The smoothed series were also used to identify homogeneous regions based on principal component analysis (PCA). The study showed that the MA filter is necessary for an improved probabilistic interpretation of drought analysis in Madeira. It also showed that, despite the island's small area, three distinct regions with different drought temporal patterns can be identified. The copula approach proved that the return period of drought events can differ significantly depending on how the relationship between drought duration and magnitude is accounted for.
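Run-theory extraction of drought events from an SPI series can be sketched as follows; the threshold value and the toy series are illustrative, not the paper's category thresholds or data:

```python
def drought_events(spi, threshold=-1.0):
    """Run theory on a (smoothed) SPI series: a drought is a maximal
    run of values below the threshold. Duration is the run length,
    magnitude the absolute sum of deficits, and interarrival time the
    gap between successive drought onsets. Returns (start, duration,
    magnitude) tuples plus the onset-to-onset interarrival times."""
    events, start = [], None
    for i, v in enumerate(spi):
        if v < threshold and start is None:
            start = i
        elif v >= threshold and start is not None:
            events.append((start, i - start, -sum(spi[start:i])))
            start = None
    if start is not None:      # series ends inside a drought
        events.append((start, len(spi) - start, -sum(spi[start:])))
    onsets = [s for s, _, _ in events]
    interarrival = [b - a for a, b in zip(onsets, onsets[1:])]
    return events, interarrival

spi = [0.2, -1.3, -1.8, 0.1, 0.5, -1.1, -1.4, -1.2, 0.3]
events, gaps = drought_events(spi)
```

The (duration, magnitude) pairs extracted this way are exactly the inputs to the bivariate copula fit, and the interarrival times support the dependent-event analysis.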


Author(s):  
Andrea Lira-Loarca ◽  
Manuel Cobos ◽  
Asunción Baquerizo ◽  
Miguel A. Losada

The design and management of a coastal structure must take into account not only the different levels of damage along its useful life but also the construction, repair, and dismantling costs. It should therefore be addressed as an optimization problem that depends on random multivariate climate variables. In this context, it is essential to develop tools that allow the simulation of storms taking into account all the main maritime variables and their evolution (Borgman, 1969). Most studies focusing on storm characterization and evolution use geometric shapes such as the equivalent triangular storm (Boccotti, 2000; ROM 1.0, 2009) to characterize individual storms. Actual storms, however, have irregular and random histories. In this work, we present a simple and efficient methodology to simulate time series of storm events including several maritime variables. The methodology uses non-stationary parametric distributions (Solari, 2011) to characterize each variable, a vector autoregressive (VAR) model to describe the temporal dependence between variables, and a copula model to link the seasonal dependency of the storm duration and the interarrival time between consecutive storms.
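The copula step linking storm duration to the interarrival time can be sketched with a Gaussian copula. Here exponential marginals stand in for the paper's fitted non-stationary distributions, and the correlation and mean values are illustrative assumptions:

```python
import math
import random

def sample_storm(rho, mean_dur, mean_gap, rng):
    """One draw from a Gaussian copula with correlation rho linking
    storm duration (hours) to the interarrival time before the next
    storm, with exponential marginals applied by inverse-CDF."""
    z1 = rng.gauss(0.0, 1.0)
    z2 = rho * z1 + math.sqrt(1.0 - rho * rho) * rng.gauss(0.0, 1.0)
    phi = lambda z: 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))
    u1, u2 = phi(z1), phi(z2)       # correlated uniforms (the copula)
    duration = -mean_dur * math.log(1.0 - u1)   # exponential marginal
    gap = -mean_gap * math.log(1.0 - u2)
    return duration, gap

rng = random.Random(0)
samples = [sample_storm(-0.5, mean_dur=12.0, mean_gap=200.0, rng=rng)
           for _ in range(5000)]
```

In the full methodology, the marginals would come from the non-stationary fits and the VAR model would supply the within-storm temporal dependence; the copula only handles the duration/interarrival link.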


Author(s):  
Genaro Hernández-Valdez ◽  
Felipe Alejandro Cruz-Pérez ◽  
José Omar Labastida-Cuadra ◽  
Grethell Georgina Pérez-Sanchez

Cognitive radio technology was developed to improve spectral efficiency in mobile communication networks. This is achieved by allowing secondary users to opportunistically and transparently use the white spaces of the primary network. In this paper, the effect of the primary users' service time distribution on the statistics of the white spaces is analyzed. In particular, the first two standardized moments of both the white space duration and the white space interarrival time are found, considering log-normal, Weibull, and Pareto distributions for the primary service time. One of the most relevant results is that, for low (moderate or high) traffic load, the exponential (hyper-exponential) distribution is an excellent option for modelling the white space duration. Characterizing these variables allows us to use the on-off paradigm to capture the primary channel activity and, in this way, evaluate the performance of cognitive radio networks. Results were obtained using discrete event simulation techniques.
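Estimating standardized moments from simulated white-space samples can be sketched as below. As a stand-in for the paper's simulated channel, exponential idle gaps are used (matching the abstract's low-load finding that the exponential fit is excellent); the rate is an illustrative assumption:

```python
import math
import random

def standardized_moments(xs):
    """First two standardized moments of a sample: the coefficient of
    variation (std/mean) and the skewness."""
    n = len(xs)
    m = sum(xs) / n
    var = sum((x - m) ** 2 for x in xs) / n
    sd = math.sqrt(var)
    skew = sum((x - m) ** 3 for x in xs) / (n * sd ** 3)
    return sd / m, skew

rng = random.Random(5)
gaps = [rng.expovariate(0.4) for _ in range(100000)]  # white-space stand-in
cov, skew = standardized_moments(gaps)
```

For an exponential distribution the coefficient of variation is 1 and the skewness is 2, so the sample moments of an empirical white-space trace can be compared against these values to judge the on-off model fit.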

