Method for Constructing a Confidence Interval for the Variance of a Random Variable

Vestnik NSUEM ◽  
2021 ◽  
pp. 146-155
Author(s):  
A. V. Ganicheva ◽  
A. V. Ganichev

The problem of reducing the number of observations needed to construct a confidence interval for the variance with a given accuracy and reliability is considered. The new method of interval estimation of the variance developed in the article is formulated in three statements and justified by four proven theorems. Formulas are derived for the required number of observations as a function of the accuracy and reliability of the estimate. The results of the calculations are presented in a table and shown in a diagram. The universality and effectiveness of the method are demonstrated: it is universal in that it applies to any probability distribution law, not only the normal law, and its effectiveness is justified by comparing its performance with that of other known methods.
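The article's own interval construction is given by its theorems; as a distribution-free point of comparison, an interval estimate of the variance can be sketched with a percentile bootstrap (function name and sample data are illustrative, not from the article):

```python
import random
import statistics

def bootstrap_variance_ci(sample, level=0.95, n_boot=2000, seed=0):
    """Percentile-bootstrap confidence interval for the variance.

    Works for any underlying distribution law: resample with replacement,
    recompute the sample variance, and take empirical quantiles.
    """
    rng = random.Random(seed)
    n = len(sample)
    boot = sorted(
        statistics.variance([rng.choice(sample) for _ in range(n)])
        for _ in range(n_boot)
    )
    alpha = (1 - level) / 2
    return boot[int(alpha * n_boot)], boot[int((1 - alpha) * n_boot) - 1]

sample = [2.1, 1.9, 2.4, 2.0, 2.2, 1.8, 2.3, 2.5, 1.7, 2.6]
low, high = bootstrap_variance_ci(sample)
```

Shrinking such an interval to a target width at a given confidence level is exactly what drives the required number of observations.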

2020 ◽  
Vol 92 (6) ◽  
pp. 51-58
Author(s):  
S.A. Solovyev

The article describes a method for reliability analysis (probability of non-failure) of structural elements based on p-boxes, and shows an algorithm for constructing two p-boxes. The first p-box is used in the absence of information about the shape of the probability distribution of a random variable. The second p-box is used for a known probability distribution function with inexact (interval) parameters. The reliability-analysis algorithm is presented on a numerical example: the reliability of a flexural wooden beam by the wood-strength criterion. The result of the analysis is an interval bounding the non-failure probability. Recommendations are given for narrowing the reliability bounds, which can reduce epistemic uncertainty. On the basis of the proposed approach, particular reliability-analysis methods can be developed for any structural element. Design equations are given for a comprehensive assessment of structural-element reliability as a system, taking into account all limit-state criteria.
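The second kind of p-box (known distribution shape, interval parameters) is straightforward to sketch: the normal CDF is monotone in each parameter, so its bounds over a parameter box are attained at corner points. The function below is an illustrative sketch under a normal model, not the article's algorithm; all numbers are hypothetical:

```python
import math

def norm_cdf(x, mu, sigma):
    """CDF of N(mu, sigma^2)."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

def pbox_bounds(x, mu_box, sigma_box):
    """Lower/upper CDF bounds at x for a normal with interval parameters.

    The normal CDF is monotone in mu, and monotone in sigma for fixed mu,
    so the extremes over the parameter box occur at its corners.
    """
    corners = [norm_cdf(x, mu, s) for mu in mu_box for s in sigma_box]
    return min(corners), max(corners)

# Interval for the non-failure probability P(strength > demand) at demand level 20:
low_cdf, high_cdf = pbox_bounds(20.0, (24.0, 26.0), (2.0, 3.0))
reliability_interval = (1.0 - high_cdf, 1.0 - low_cdf)
```

The resulting interval of non-failure probability narrows as the parameter intervals narrow, which is the epistemic-uncertainty reduction the article recommends.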


2017 ◽  
Vol 928 (10) ◽  
pp. 58-63 ◽  
Author(s):  
V.I. Salnikov

The initial subjects of study are cumulative sums of measurement errors. It is assumed that the errors follow the normal law, but with a limit on the marginal error Δpred = 2m. It is known that for each sum there is a number of terms nᵢ, corresponding to a confidence interval, at which the value of the sum equals zero. The paradox is that the probability of such an event is zero; therefore, the value nᵢ at which the sum becomes zero cannot be determined. The article proposes to consider instead the event that the sum of errors varies within the limits ±2m, with a confidence level of 0.954. Within the group, all the sums then have a limit error. These tolerances are proposed for use as discrepancy limits in geodesy instead of 2m√nᵢ. The concept of “the law of the truncated normal distribution with Δpred = 2m” is suggested to be introduced.
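The truncated error model can be illustrated by simulation: draw errors from a normal law, reject anything beyond Δpred = 2m, and check how often the group sum stays within the classical tolerance 2m√n. This is a hedged illustration of the setting, not the author's derivation; the parameters are arbitrary:

```python
import math
import random

def truncated_error(m, rng):
    """One measurement error from N(0, m^2) truncated to [-2m, 2m] (rejection sampling)."""
    while True:
        x = rng.gauss(0.0, m)
        if abs(x) <= 2.0 * m:
            return x

def coverage(m=1.0, n=4, trials=20000, seed=0):
    """Fraction of trials in which |sum of n truncated errors| <= 2m*sqrt(n)."""
    rng = random.Random(seed)
    tol = 2.0 * m * math.sqrt(n)
    hits = sum(
        abs(sum(truncated_error(m, rng) for _ in range(n))) <= tol
        for _ in range(trials)
    )
    return hits / trials

cov = coverage()
```

Because truncation removes the tails, each error's variance is below m², so the empirical coverage of the 2m√n tolerance exceeds the untruncated 0.954.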


1980 ◽  
Vol 17 (4) ◽  
pp. 1016-1024 ◽  
Author(s):  
K. D. Glazebrook

A collection of jobs is to be processed by a single machine. The amount of processing required by each job is a random variable with a known probability distribution. The jobs must be processed in a manner which is consistent with a precedence relation but the machine is free to switch from one job to another at any time; such switches are costly, however. This paper discusses conditions under which there is an optimal strategy for allocating the machine to the jobs which is given by a fixed permutation of the jobs indicating in which order they should be processed. When this is so, existing algorithms may be helpful in giving the best job ordering.
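When a fixed permutation is optimal, the objective of expected weighted completion time depends on the random processing times only through their expectations (by linearity of expectation), so the best precedence-feasible order can be found by search over permutations. A brute-force sketch with hypothetical job data:

```python
import itertools

def best_order(jobs, prec, weight):
    """Brute-force the best fixed permutation consistent with a precedence relation.

    jobs:   dict job -> expected processing time
    prec:   set of (a, b) pairs meaning job a must precede job b
    weight: dict job -> weight; objective = sum of weight * expected completion time
    """
    best, best_val = None, float("inf")
    for perm in itertools.permutations(jobs):
        if any(perm.index(a) > perm.index(b) for a, b in prec):
            continue  # violates the precedence relation
        t, val = 0.0, 0.0
        for j in perm:
            t += jobs[j]          # expected completion time of j under this order
            val += weight[j] * t
        if val < best_val:
            best, best_val = perm, val
    return best, best_val

jobs = {"a": 2.0, "b": 1.0, "c": 3.0}
prec = {("a", "c")}
weight = {"a": 1.0, "b": 1.0, "c": 1.0}
order, value = best_order(jobs, prec, weight)
```

For larger instances this enumeration is infeasible, which is where the existing ordering algorithms mentioned above come in.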



2021 ◽  
Author(s):  
Faezeh Ghasemnezhad ◽  
Ommolbanin Bazrafshan ◽  
Mehdi Fazeli ◽  
Mohammad Parvinnia ◽  
Vijay Singh

The Standardized Runoff Index (SRI), one of the well-known hydrological drought indices, may contain uncertainty caused by the choice of distribution function, time scale, and record length of the statistical data. In this study, the uncertainty in SRI estimation was investigated using monthly discharge data of 30- and 49-year record lengths from the Minab dam watershed, southern Iran. Four probability distribution functions (gamma, Weibull, lognormal, and normal) were used to fit the cumulative discharge data at 3-, 6-, 9-, 12-, 24- and 48-month time scales, with their goodness-of-fit and normality evaluated by the Kolmogorov-Smirnov and normality tests, respectively. Using Monte Carlo sampling, 50,000 values were generated for each event and each time scale, and a 95% confidence interval was constructed. The width of the confidence interval was taken as the measure of uncertainty, and the sources of uncertainty were investigated across these factors. It was found that the maximum uncertainty was associated with the normal and lognormal distributions and the minimum uncertainty with the gamma and Weibull distributions. Further, increasing either the time scale or the record length decreased the uncertainty.
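The width-of-confidence-interval notion of uncertainty can be sketched for one of the four candidate fits (lognormal here): the SRI is the standard-normal quantile of the fitted CDF, and a parametric bootstrap gives the interval width. Function names and the discharge record are illustrative, not the study's data:

```python
import math
import random
import statistics
from statistics import NormalDist

def sri(value, mu, sigma):
    """SRI of a discharge value under a lognormal fit: probit of the fitted CDF."""
    p = NormalDist(mu, sigma).cdf(math.log(value))
    p = min(max(p, 1e-9), 1.0 - 1e-9)  # guard the probit against 0 and 1
    return NormalDist().inv_cdf(p)

def sri_ci_width(value, log_flows, level=0.95, n_boot=2000, seed=1):
    """Width of the bootstrap confidence interval of the SRI (parametric resampling)."""
    rng = random.Random(seed)
    n = len(log_flows)
    mu = statistics.fmean(log_flows)
    sigma = statistics.stdev(log_flows)
    vals = []
    for _ in range(n_boot):
        boot = [rng.gauss(mu, sigma) for _ in range(n)]
        vals.append(sri(value, statistics.fmean(boot), statistics.stdev(boot)))
    vals.sort()
    a = (1.0 - level) / 2.0
    return vals[int((1.0 - a) * n_boot) - 1] - vals[int(a * n_boot)]

log_flows = [math.log(10.0 + 0.5 * i) for i in range(30)]  # synthetic 30-year record
width = sri_ci_width(15.0, log_flows)
```

A longer record shrinks the sampling variability of the fitted parameters and hence the interval width, consistent with the study's finding.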


Author(s):  
Lipeng Pan ◽  
Yong Deng

Dempster-Shafer evidence theory can handle imprecise and unknown information, and has therefore attracted wide attention. In many cases the mass function can be translated into a probability, which is useful for extending the applications of D-S evidence theory. However, how to reasonably transform a mass function into a probability distribution is still an open issue. Hence, this paper proposes a new probability transform method based on ordered weighted averaging and entropy difference. The new method calculates weights by ordered weighted averaging and adds the entropy difference as one of the measurement indicators; the transformation with minimum entropy difference is then achieved by adjusting the parameter r of the weight function. Finally, numerical examples are given to show that the new method is more reasonable and effective.
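The details of the OWA-based transform are in the paper; the standard baseline it competes with, the pignistic transformation, is easy to sketch: each focal element's mass is split equally among its member hypotheses. The mass function below is illustrative:

```python
def pignistic(masses):
    """Pignistic probability transform of a Dempster-Shafer mass function.

    masses: dict mapping a focal element (tuple of hypotheses) to its mass;
    each focal element's mass is divided equally among its members.
    """
    p = {}
    for focal, m in masses.items():
        share = m / len(focal)
        for hypothesis in focal:
            p[hypothesis] = p.get(hypothesis, 0.0) + share
    return p

m = {("a",): 0.5, ("a", "b"): 0.3, ("a", "b", "c"): 0.2}
bet_p = pignistic(m)
```

Any reasonable transform, including the paper's, must likewise return a distribution that sums to one over the frame of discernment.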


2016 ◽  
Vol 5 (4) ◽  
pp. 106-113 ◽  
Author(s):  
Tamer El Nashar

The objective of this paper is to examine the impact of inclusive business on internal ethical values and internal control quality from an accounting perspective. The hypothesis is that organizations' growing awareness of the inclusive business approach will significantly affect organizational culture, and thereby ethical values and internal control quality. The analysis applies expected-value and variance tests of a random variable, supported by both discrete and continuous probability distributions, to assess the potential impact of inclusive business. A probability of 85.5% is found for a significant potential impact of inclusive business, with a 100% score on internal ethical values and internal control quality, which would help contribute to sustainable growth, reduce poverty, and improve organizational culture and learning.


2020 ◽  
pp. 52-63
Author(s):  
M. Mullai ◽  
K. Sangeetha ◽  
R. Surya ◽  
G. Madhan Kumar ◽  
R. Jeyabalan ◽  
S. Broumi

This paper studies a single-period neutrosophic inventory problem in a mixed imprecise and uncertain environment. The purpose of the paper is to model demand as a neutrosophic random variable. For this model, a new method is developed for determining the optimal order size in the presence of neutrosophic random variables, with optimality obtained by evaluating the expected value through integration. The newsvendor problem is used to illustrate the proposed model.
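The crisp (non-neutrosophic) newsvendor baseline that such a model generalizes is a one-line critical-fractile rule. A sketch under an assumed normal demand, with purely illustrative parameters:

```python
from statistics import NormalDist

def newsvendor_order(unit_cost, price, salvage, mu, sigma):
    """Classical newsvendor: order the critical fractile
    (price - cost) / (price - salvage) of the demand distribution (normal here)."""
    fractile = (price - unit_cost) / (price - salvage)
    return NormalDist(mu, sigma).inv_cdf(fractile)

q = newsvendor_order(unit_cost=6.0, price=10.0, salvage=2.0, mu=100.0, sigma=20.0)
```

With these numbers the critical fractile is 0.5, so the optimal order equals mean demand; a higher margin pushes the order above the mean.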


Author(s):  
M. Vidyasagar

This chapter provides an introduction to some elementary aspects of information theory, including entropy in its various forms. Entropy refers to the level of uncertainty associated with a random variable (or more precisely, the probability distribution of the random variable). When there are two or more random variables, it is worthwhile to study the conditional entropy of one random variable with respect to another. The last concept is relative entropy, also known as the Kullback–Leibler divergence, which measures the “disparity” between two probability distributions. The chapter first considers convex and concave functions before discussing the properties of the entropy function, conditional entropy, uniqueness of the entropy function, and the Kullback–Leibler divergence.
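These quantities are short to compute; a minimal sketch in Python (base-2 logarithms, so both are measured in bits):

```python
import math

def entropy(p):
    """Shannon entropy H(p) in bits; zero-probability terms contribute nothing."""
    return -sum(x * math.log2(x) for x in p if x > 0)

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q) in bits (requires q[i] > 0 wherever p[i] > 0)."""
    return sum(x * math.log2(x / y) for x, y in zip(p, q) if x > 0)
```

H is concave and maximized by the uniform distribution; D(p || q) is nonnegative and zero exactly when p = q, but it is not symmetric in its arguments.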

