random variable
Recently Published Documents

TOTAL DOCUMENTS: 2698 (FIVE YEARS: 677)
H-INDEX: 50 (FIVE YEARS: 5)

2022 ◽  
Author(s):  
Ye Xiaoming ◽  
Ding Shijun ◽  
Liu Haibo

Abstract In traditional measurement theory, precision is defined as the dispersion of the measured values and serves as the basis for calculating weights when adjusting measurement data of different qualities; as a result, trueness is completely ignored in the weight allocation. In this paper, following the pure concepts of probability theory, the measured (observed) value is regarded as a constant, the error as a random variable, and the variance as the dispersion of all possible values of an unknown error. A rigorous formula for weight calculation and variance propagation is then derived, which resolves the theoretical difficulty of determining weights in the adjustment of multi-channel observation data of different qualities. The results show that the optimal weights are determined not only by the covariance matrix of the observation errors but also by the adjustment model.
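The weight rule the abstract describes can be illustrated with a standard generalized-least-squares sketch (our own minimal example, not the authors' derivation): for a common mean estimated from observations with error covariance Σ, the optimal weights are proportional to Σ⁻¹·1, and the combined variance propagates as wᵀΣw.

```python
import numpy as np

def optimal_weights(cov):
    """Weights for combining observations with error covariance `cov`
    (generalized least squares for a common mean): w proportional to
    inv(cov) @ ones, normalized to sum to 1."""
    cov = np.asarray(cov, dtype=float)
    ones = np.ones(cov.shape[0])
    w = np.linalg.solve(cov, ones)
    return w / w.sum()

def combined_variance(cov, w):
    """Variance propagation for the weighted estimate: Var = w^T cov w."""
    return float(w @ np.asarray(cov, dtype=float) @ w)

# Two independent observations with error variances 1 and 4:
# inverse-variance weighting gives weights 0.8 and 0.2.
cov = np.diag([1.0, 4.0])
w = optimal_weights(cov)
var = combined_variance(cov, w)   # 0.8^2 * 1 + 0.2^2 * 4 = 0.8
```

With correlated errors the same formula applies, which is where the covariance matrix (rather than individual variances) genuinely matters.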


PLoS ONE ◽  
2022 ◽  
Vol 17 (1) ◽  
pp. e0262316
Author(s):  
Xi Guo ◽  
Abhineet Gupta ◽  
Anand Sampat ◽  
Chengwei Zhai

The COVID-19 pandemic has drastically shifted the way people work. While many businesses can operate remotely, a large number of jobs can only be performed on-site. Moreover, as businesses create plans for bringing workers back on-site, they need tools to assess the risk of COVID-19 for their employees in the workplace. This study aims to fill the gap in risk modeling of COVID-19 outbreaks in facilities like offices and warehouses. We propose a simulation-based stochastic contact network model to assess the cumulative incidence in workplaces. First-generation cases are introduced as a Bernoulli random variable using the local daily new case rate as the success rate. Contact networks are established through randomly sampled daily contacts for each of the first-generation cases, and successful transmissions are established based on a randomized secondary attack rate (SAR). Modification factors are provided for SAR based on changes in airflow, speaking volume, and speaking activity within a facility. Control measures such as mask wearing are incorporated through modifications in SAR. We validated the model by comparing the distribution of cumulative incidence in model simulations against real-world outbreaks in workplaces and nursing homes. The comparisons support the model’s validity for estimating cumulative incidences for short forecasting periods of up to 15 days. We believe that the current study presents an effective tool for providing short-term forecasts of COVID-19 cases for workplaces and for quantifying the effectiveness of various control measures. The open source model code is made available at github.com/abhineetgupta/covid-workplace-risk.
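The structure of the model (Bernoulli introductions, sampled contacts, SAR-driven transmission) can be sketched in a few lines. This is a heavily simplified toy version under our own assumptions (homogeneous mixing, one generation of secondary spread, fixed contact count); the authors' released code is the authoritative implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_incidence(n_employees, daily_case_rate, contacts_per_day,
                       sar, days, n_sims=200):
    """Toy stochastic contact simulation: returns the cumulative
    incidence (total infected) across `n_sims` simulation runs."""
    totals = np.empty(n_sims)
    for s in range(n_sims):
        infected = np.zeros(n_employees, dtype=bool)
        for _ in range(days):
            # First-generation cases: Bernoulli with the community case rate
            new_first = rng.random(n_employees) < daily_case_rate
            for i in np.flatnonzero(new_first & ~infected):
                infected[i] = True
                # Randomly sampled daily contacts; transmission w.p. SAR
                contacts = rng.choice(n_employees, size=contacts_per_day,
                                      replace=False)
                for c in contacts:
                    if c != i and rng.random() < sar:
                        infected[c] = True
        totals[s] = infected.sum()
    return totals

# 100-person facility, 15-day forecast window as in the validation study
totals = simulate_incidence(100, 0.001, 5, 0.05, days=15)
```

Mask-wearing or airflow modifications would enter here as multiplicative factors on `sar`, mirroring the paper's modification-factor approach.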


Author(s):  
Oren Fivel ◽  
Moshe Klein ◽  
Oded Maimon

In this paper we develop the foundation of a new theory for decision trees, based on modeling phenomena with soft numbers. Soft numbers embody the theory of soft logic, which addresses the need to combine real processes and cognitive ones in the same framework. At the same time, soft logic develops a new concept of modeling and dealing with uncertainty: the uncertainty of time and space. It is a language that can speak in two reference frames and also suggests a way to combine them. In classical probability, for continuous random variables there is no distinction between probabilities involving strict and non-strict inequality; moreover, a probability involving equality collapses to zero, without distinguishing among the values at which we would like to evaluate the random variable for comparison. This work presents soft probability, obtained by incorporating soft numbers into probability theory. Soft numbers are a new set of numbers that are linear combinations of multiples of "ones" and multiples of "zeros". In this work, we develop a probability involving equality as a "soft zero" multiple of a probability density function (PDF). We also extend this notion of soft probability to the classical definitions of complements, unions, intersections, and conditional probabilities, and to the expectation, variance, and entropy of a continuous random variable conditioned on lying in a union of disjoint intervals and a discrete set of numbers. This extension provides information about a continuous random variable lying within a discrete set of numbers, such that its probability does not collapse completely to zero. In developing the notion of soft entropy, we found a potential additional soft axis, multiples of 0log(0), which motivates exploring the properties and applications of these new numbers.
We extend soft entropy to the definitions of cross entropy and Kullback–Leibler divergence (KLD), and we find that the soft KLD is a soft number that has no multiple of 0log(0). Based on the soft KLD, we define a soft mutual information, which can be used as a splitting criterion in decision trees for data sets of continuous random variables consisting of single samples and intervals.
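The core algebraic idea, a number with an ordinary part and a "soft zero" part, can be sketched as a two-component value. The class name and representation below are ours, a minimal illustration of how P(X = x) can retain the density f(x) on a second axis instead of collapsing to zero; it is not the authors' formalism.

```python
from dataclasses import dataclass

@dataclass
class Soft:
    """A toy soft number a*1 + b*soft_zero: `real` multiplies "one",
    `soft` multiplies the soft-zero axis (our naming, illustrative)."""
    real: float
    soft: float

    def __add__(self, other):
        return Soft(self.real + other.real, self.soft + other.soft)

def soft_prob_interval(pdf_integral):
    """P(a < X < b): an ordinary probability, no soft component."""
    return Soft(pdf_integral, 0.0)

def soft_prob_point(pdf_value):
    """P(X = x): zero real part, but the density f(x) survives on the
    soft axis rather than the probability collapsing entirely to 0."""
    return Soft(0.0, pdf_value)

# Uniform(0, 1): P(X = 0.3) keeps the density 1.0 on the soft axis,
# and soft probabilities add component-wise.
p = soft_prob_point(1.0) + soft_prob_interval(0.5)
# p == Soft(real=0.5, soft=1.0)
```

Under this representation, two point events with different densities remain distinguishable, which is exactly the distinction the abstract says classical probability loses.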


Entropy ◽  
2022 ◽  
Vol 24 (1) ◽  
pp. 82
Author(s):  
Jean-Marc Girault ◽  
Sébastien Ménigot

Today, the palindromic analysis of biological sequences, based exclusively on the study of “mirror” symmetry properties, is almost unavoidable. However, other types of symmetry, such as those present in friezes, allow us to analyze binary sequences from another point of view. New tools, such as symmetropy and symmentropy, based on new types of palindromes, discriminate binarized 1/f noise sequences better than Lempel–Ziv complexity. These new palindromes, with their new types of symmetry, also allow for better discrimination of binarized DNA sequences. A relative error of 6% in symmetropy is obtained for the HUMHBB and YEAST1 DNA sequences. A factor of 4 between the slopes obtained from linear fits of the local symmentropies for the two DNA sequences shows the discriminative capacity of local symmentropy. Moreover, it is highlighted that a number of these new palindromes of sizes greater than 30 bits are more discriminating than those of smaller sizes, which are comparable to palindromes arising from an independent and identically distributed random variable.
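The Lempel–Ziv complexity used as the baseline in this comparison is the classical 1976 measure: the number of distinct phrases in the exhaustive-history parsing of a sequence. A compact reference implementation (symmetropy and symmentropy themselves are the paper's contribution and are not reproduced here):

```python
def lz76_complexity(s):
    """Lempel-Ziv (1976) complexity: count the phrases produced by
    the exhaustive-history parsing of sequence `s`. Each new phrase
    is extended while it still occurs earlier in the sequence."""
    i, c, n = 0, 0, len(s)
    while i < n:
        l = 1
        # extend the current phrase while s[i:i+l] appears before it
        while i + l <= n and s[i:i + l] in s[:i + l - 1]:
            l += 1
        c += 1          # one more phrase in the exhaustive history
        i += l
    return c

# Canonical example: "0001101001000101" parses into 6 phrases
# 0 . 001 . 10 . 100 . 1000 . 101
c = lz76_complexity("0001101001000101")   # -> 6
```

A binarized 1/f noise or DNA sequence would be passed in as a string of "0"/"1" characters; low complexity indicates strong regularity.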


Author(s):  
Chun Su ◽  
Kui Huang ◽  
Zejun Wen

To improve the probability that an engineering system successfully completes its next mission, it is crucial to implement timely maintenance activities, especially when maintenance time or maintenance resources are limited. Taking a series-parallel system as the object of study, this paper develops a multi-objective imperfect selective maintenance optimization model. During scheduled breaks, potential maintenance actions, ranging from minimal repair to replacement, are implemented on the components. Since the level of a maintenance action is closely related to its cost, an age reduction coefficient and a hazard rate adjustment coefficient are taken into account. Moreover, an improved hybrid hazard rate approach is adopted to describe the reliability improvement of the components, and the mission duration is regarded as a random variable. On this basis, a nonlinear stochastic optimization model is established with dual objectives: minimizing the total maintenance cost while maximizing the system reliability. The fast elitist non-dominated sorting genetic algorithm (NSGA-II) is adopted to solve the model. Numerical experiments are conducted to verify the effectiveness of the proposed approach. The results indicate that the proposed model obtains better scheduling schemes for the maintenance resources and yields more flexible maintenance plans.
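The two coefficients named in the abstract can be made concrete with a small sketch of an imperfect-maintenance hazard model. All numeric coefficients and the linear dependence on maintenance level below are our illustrative assumptions, not the paper's calibrated model.

```python
def weibull_hazard(t, beta=2.0, eta=100.0):
    """Weibull hazard rate h(t) = (beta/eta) * (t/eta)^(beta-1)
    (illustrative baseline for a component)."""
    return (beta / eta) * (t / eta) ** (beta - 1)

def effective_age(age, level, max_level=5):
    """Age reduction coefficient: maintenance at `level`
    (0 = minimal repair, max_level = replacement) rescales the
    component's effective age; replacement resets it to zero."""
    a = 1 - level / max_level
    return a * age

def adjusted_hazard(t, age, level, max_level=5):
    """Hybrid-style sketch: a hazard rate adjustment factor b >= 1
    (larger for cheaper, lower-level actions) applied to the baseline
    hazard evaluated at the reduced effective age."""
    b = 1 + 0.1 * (max_level - level) / max_level
    return b * weibull_hazard(effective_age(age, level, max_level) + t)

# Replacement (level 5) of an 80-hour-old component resets its age;
# minimal repair (level 0) leaves both age and an elevated hazard.
h_minimal = adjusted_hazard(10.0, 80.0, level=0)
h_replace = adjusted_hazard(10.0, 80.0, level=5)
```

In the full selective-maintenance problem, NSGA-II would search over the level chosen for each component subject to the break-duration and cost constraints, trading total cost against mission reliability.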


2022 ◽  
Vol 2022 (1) ◽  
pp. 013302
Author(s):  
Jean-Marc Luck

Abstract We consider non-Hermitian PT-symmetric tight-binding chains where gain/loss optical potentials of equal magnitudes ±iγ are arbitrarily distributed over all sites. The main focus is on the threshold γ_c beyond which PT-symmetry is broken. This threshold generically falls off as a power of the chain length, whose exponent depends on the configuration of optical potentials, ranging between 1 (for balanced periodic chains) and 2 (for unbalanced periodic chains, where each half of the chain experiences a non-zero mean potential). For random sequences of optical potentials with zero average and finite variance, the threshold is itself a random variable, whose mean value decays with exponent 3/2 and whose fluctuations have a universal distribution. The chains yielding the most robust PT-symmetric phase, i.e. the highest threshold at fixed chain length, are obtained by exact enumeration up to 48 sites. This optimal threshold exhibits an irregular dependence on the chain length, presumably decaying asymptotically with exponent 1, up to logarithmic corrections.
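The threshold γ_c can be located numerically for any given potential configuration: build the tight-binding matrix with ±iγ on the diagonal and bisect on the largest imaginary part of the spectrum. This is a generic sketch under our own assumptions (unit hopping, open boundary conditions), not the paper's exact-enumeration code.

```python
import numpy as np

def spectrum_imag(gamma, signs):
    """Largest |Im E| of the chain H = (nearest-neighbour hopping)
    + i*gamma*diag(signs), with signs in {+1, -1} per site."""
    n = len(signs)
    H = (np.diag(np.ones(n - 1), 1) + np.diag(np.ones(n - 1), -1)
         + 1j * gamma * np.diag(signs))
    return np.abs(np.linalg.eigvals(H).imag).max()

def threshold(signs, hi=2.0, tol=1e-6):
    """Bisect for gamma_c: the largest gamma whose spectrum is
    still entirely real (unbroken PT phase)."""
    lo = 0.0
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if spectrum_imag(mid, signs) < 1e-9:
            lo = mid        # still unbroken
        else:
            hi = mid        # PT-symmetry broken
    return lo

# PT-symmetric configuration: +gamma on the left half, -gamma on the
# right (each half sees a non-zero mean potential, the "unbalanced" case)
signs = [1, 1, 1, 1, -1, -1, -1, -1]
gc = threshold(signs)
```

Repeating this over chain lengths would exhibit the power-law decay of γ_c discussed in the abstract.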


2021 ◽  
Author(s):  
Gaowei Xu ◽  
Fae Azhari

The United States National Bridge Inventory (NBI) records element-level condition ratings on a scale of 0 to 9, representing failed to excellent conditions. Current bridge management systems apply Markov decision processes to find optimal repair schemes given the condition ratings. The deterioration models used in these approaches fail to consider the effect of structural age. In this study, a condition-based bridge maintenance framework is proposed where the state of a bridge component is defined using a three-dimensional random variable that depicts the working age, condition rating, and initial age. The proportional hazard model with a Weibull baseline hazard function translates the three-dimensional random variable into a single hazard indicator for decision-making. To demonstrate the proposed method, concrete bridge decks were taken as the element of interest. Two optimal hazard criteria help select the repair scheme (essential repair, general repair, or no action) that leads to minimum annual expected life-cycle costs.
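The decision variable described here, a Weibull baseline hazard scaled by covariates in proportional-hazards form, can be written down directly. The coefficient values and the exact covariate encoding below are our illustrative assumptions; the paper estimates its own parameters from NBI data.

```python
import math

def weibull_baseline_hazard(t, beta, eta):
    """Weibull baseline hazard h0(t) = (beta/eta) * (t/eta)^(beta-1)."""
    return (beta / eta) * (t / eta) ** (beta - 1)

def bridge_hazard(working_age, condition_rating, initial_age,
                  beta=2.5, eta=40.0, g1=0.3, g2=0.05):
    """Proportional hazard sketch: the three-dimensional state
    (working age, condition rating on the 0-9 NBI scale, initial age)
    is mapped to a single hazard indicator. Coefficients g1, g2 are
    illustrative placeholders, not the paper's estimates."""
    covariate = math.exp(g1 * (9 - condition_rating) + g2 * initial_age)
    return weibull_baseline_hazard(working_age, beta, eta) * covariate

# Same ages, worse condition rating -> higher hazard, so the single
# indicator can drive the repair / no-action decision thresholds.
h_good = bridge_hazard(working_age=10, condition_rating=8, initial_age=5)
h_poor = bridge_hazard(working_age=10, condition_rating=4, initial_age=5)
```

The two optimal hazard criteria in the paper would then partition this scalar into "no action", "general repair", and "essential repair" regions.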


2021 ◽  
Vol 19 (6) ◽  
pp. 575-583
Author(s):  
Rasha Atwa ◽  
Rasha Abd- El - Wahab ◽  
Ola Barakat

The stochastic approximation procedure with delayed groups of delayed customers is investigated. The Robbins–Monro stochastic approximation procedure is adjusted to be usable in the presence of delayed groups of delayed customers. Two loss systems are introduced to give an accurate description of the proposed procedure. Customers arrive at fixed time intervals, with the stage of each customer determined by the outcome of the preceding one, and the service time of a customer is assumed to be a discrete random variable. Some applications of the procedure are given and their results analyzed. The analysis shows that the efficiency of the procedure can be increased by minimizing the number of customers in a group, irrespective of their service times, which may take maximum values. Efficiency depends on the maximum service time of a customer and on the number of customers in the group. The most important result is that the efficiency of the procedure increases with the service time distributions as well as with the service times of the customers. This new situation can be applied to increase the number of served customers, whereby the number of served groups is also increased. The results obtained appear to be acceptable. In general, our proposal can be applied to other stochastic approximation procedures to increase production in many fields, such as medicine, computer science, industry, and the applied sciences.
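For reference, the underlying Robbins–Monro procedure that the paper adapts finds the root of a regression function observed only through noise, via x_{n+1} = x_n − a_n (y_n − target) with step sizes a_n → 0. A minimal sketch (without the paper's delay structure):

```python
import random

def robbins_monro(noisy_f, target, x0, n_steps=5000, a=1.0):
    """Robbins-Monro iteration: seek x* with E[noisy_f(x*)] = target,
    using decreasing step sizes a_n = a / (n + 1)."""
    x = x0
    for n in range(n_steps):
        y = noisy_f(x)                     # noisy observation at x
        x -= (a / (n + 1)) * (y - target)  # correct toward the target
    return x

random.seed(1)
# Example: E[2x + noise] = 4 has the root x* = 2
f = lambda x: 2 * x + random.gauss(0, 0.5)
x_star = robbins_monro(f, target=4.0, x0=0.0)
```

In the paper's setting, observations arrive from delayed groups of delayed customers, so the correction at step n uses outcomes generated at earlier stages rather than an immediate evaluation.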


Author(s):  
Nathan Ritchey ◽  
Rajeev Rajaram

We provide methodology and numerical results for the Hattendorff differential equation governing the continuous-time evolution of the variance of L_t^(j), the loss-at-time-t random variable for a multi-state process, given that the state at time t is j.

