random quantity
Recently Published Documents


TOTAL DOCUMENTS: 21 (five years: 9)
H-INDEX: 2 (five years: 1)

2022 · Vol 24 (1) · pp. 105-118
Author(s): Mervat Mahdy, Dina S. Eltelbany, Hoda Mohammed, …

Entropy measures the amount of uncertainty and dispersion of an unknown or random quantity. The concept was first introduced by Shannon (1948) and is important in many areas: in information theory, entropy measures the amount of information in each message received; in physics, it is the basic concept quantifying the disorder of a thermodynamical system. In this paper, we introduce an alternative measure of entropy, called HN-entropy. Unlike Shannon entropy, this proposed measure of order α and β is more flexible. We then introduce the cumulative residual HN-entropy, the cumulative HN-entropy, and weighted versions. Finally, a comparison between Shannon entropy and HN-entropy, together with numerical results, is presented.
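The abstract does not give the definition of the proposed HN-entropy, but the Shannon (1948) baseline it compares against can be sketched directly. A minimal illustration, assuming a discrete probability distribution given as a list of probabilities:

```python
import math

def shannon_entropy(probs):
    """Shannon (1948) entropy H = -sum(p_i * log2(p_i)), in bits.
    Terms with p_i = 0 contribute nothing, by convention 0*log(0) = 0."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A uniform distribution maximizes uncertainty:
print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))  # → 2.0 bits
# A concentrated distribution carries less uncertainty:
print(shannon_entropy([0.9, 0.05, 0.03, 0.02]))
```

Generalized entropies of order α and β (such as the HN-entropy proposed here) typically reduce to the Shannon form in a limiting case of their parameters; the exact form in this paper is not reproduced in the abstract.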


Mathematics · 2020 · Vol 8 (11) · pp. 1901
Author(s): Pierpaolo Angelini, Fabrizio Maturo

There exist uncertain situations in which a random event is not a measurable set but a point of a linear space, inside which it is possible to study different random quantities characterized by non-parametric probability distributions. We show that if an event is not a measurable set, then it is contained in a closed structure which is not a σ-algebra but a linear space over R. We think of probability as being a mass: it is a mass with respect to problems of statistical sampling, with respect to problems of the social sciences, and, in particular, with regard to economic situations studied by means of the subjective notion of utility. We decompose a random quantity, meant as a geometric entity, inside a metric space, and we likewise decompose its prevision and variance. We use a quadratic metric to obtain the variance of a random quantity. The origin of the notion of variability is not standardized within this context; it always depends on the state of information and knowledge of an individual. We study intrinsic properties of non-parametric probability distributions as well as of the probabilistic indices summarizing them, and we define the notion of α-distance between two non-parametric probability distributions.


TOS forum · 2020 · Vol 2020 (10) · pp. 31
Author(s): Geoff Lyman

The value of a fully statistical sampling theory is that it makes it possible to quantify a measure of a material's intrinsic heterogeneity and, on this basis, provide the entire distribution of the analyte content of potential samples to be extracted from the lot. The analyte content of a sample of a given mass is a random quantity, as samples of nominally equal masses taken from a lot in a given state of comminution will not have exactly the same analyte content. The analyte content of a sample is correctly described as a random variable, and to characterise a random variable completely it is necessary to know either its probability density function or distribution function, or all of its moments (mean, variance, and all the higher moments). The following discussion derives the fundamental sampling variance from a purely mathematical statistics basis, relying on the assumption that the number of particles of any one type (size and analyte content) that fall into a sample taken in a mechanically correct manner (following the principle of equiprobable sampling) follows a Poisson distribution, and that the Poisson distributions of particle numbers are statistically independent. A more fully argued substantiation of this fundamental assumption, partial experimental evidence, a standard statistical introduction to the definition and properties of the Poisson distribution, and the reasons for its use can be found at the end of this article. © Materials Sampling & Consulting 2020
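The Poisson assumption stated above can be illustrated with a small Monte Carlo sketch. The lot composition below (two particle types, their masses, and grades) is entirely hypothetical and not taken from the paper; it only shows how independent Poisson particle counts make the assay of nominally identical samples a random variable:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical two-type lot (all numbers illustrative):
# (expected particle count per sample, particle mass in g, analyte mass fraction)
types = [(2000.0, 0.010, 0.001),   # abundant low-grade gangue
         (8.0,    0.015, 0.500)]   # rare high-grade mineral particles

def sample_assay():
    """One mechanically correct sample: independent Poisson counts per type."""
    total_mass = 0.0
    analyte_mass = 0.0
    for lam, mass, grade in types:
        n = rng.poisson(lam)          # particles of this type captured
        total_mass += n * mass
        analyte_mass += n * mass * grade
    return analyte_mass / total_mass

assays = np.array([sample_assay() for _ in range(20000)])
print(f"mean assay        : {assays.mean():.5f}")
print(f"relative std. dev.: {assays.std() / assays.mean():.3f}")
```

The relative spread of the simulated assays is dominated by the Poisson scatter of the rare high-grade particles, which is the mechanism behind the fundamental sampling variance the article derives analytically.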


2020 · Vol 25 (2) · pp. 33
Author(s): Juan Carlos Cortés López, Marc Jornet Sanz

Kernel density estimation is a non-parametric method to estimate the probability density function of a random quantity from a finite data sample. The estimator consists of a kernel function and a smoothing parameter called the bandwidth. Despite its undeniable usefulness, the convergence rate may be slow in the number of realizations, and discontinuities and peaks of the target density may not be correctly captured. In this work, we analyze the applicability of a parametric method based on Monte Carlo simulation for the density estimation of certain random variable transformations. This approach has important applications in the setting of differential equations with random input parameters.
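The non-parametric estimator described above can be sketched in a few lines. This is a minimal Gaussian-kernel version, with Silverman's rule of thumb as one common bandwidth choice (the paper itself does not specify a kernel or bandwidth):

```python
import math
import random

def gaussian_kde(data, bandwidth=None):
    """Kernel density estimator with a Gaussian kernel.
    If no bandwidth is given, use Silverman's rule of thumb."""
    n = len(data)
    if bandwidth is None:
        mean = sum(data) / n
        sd = math.sqrt(sum((x - mean) ** 2 for x in data) / (n - 1))
        bandwidth = 1.06 * sd * n ** (-1 / 5)   # Silverman's rule
    def estimate(x):
        # Average of Gaussian bumps centered at each observation
        z = sum(math.exp(-0.5 * ((x - xi) / bandwidth) ** 2) for xi in data)
        return z / (n * bandwidth * math.sqrt(2 * math.pi))
    return estimate

random.seed(1)
sample = [random.gauss(0.0, 1.0) for _ in range(500)]
f_hat = gaussian_kde(sample)
print(f"f_hat(0) ≈ {f_hat(0.0):.3f}  (true N(0,1) density at 0 is ≈ 0.399)")
```

The slow convergence the abstract mentions is visible here: even with 500 realizations the estimate at the mode is noticeably biased, which motivates the parametric Monte Carlo alternative the authors analyze.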


2020 · Vol 54 (05) · pp. 148-152
Author(s): Tamila Ahmad Savdumova

Key words: Mathematics, probability theory, statistics, random quantity


Risks · 2020 · Vol 8 (2) · pp. 60
Author(s): Francesco Giuseppe Cordoni, Luca Di Persio, Yilun Jiang

The present paper is devoted to the study of a bank salvage model with a finite time horizon that is subjected to stochastic impulse controls. In our model, the bank's default time is a completely inaccessible random quantity generating its own filtration, thus reflecting the unpredictability of the event itself. In this framework, the main goal is to minimize the total cost of the central controller, which can inject capital to save the bank from default. We address the latter task, showing that the corresponding quasi-variational inequality (QVI) admits a unique viscosity solution, Lipschitz continuous in space and Hölder continuous in time. Furthermore, under mild assumptions on the dynamics, the smooth-fit W_loc^{(1,2),p} property is achieved for any 1 < p < +∞.


2020 · Vol 12 (4) · pp. 95
Author(s): Pierpaolo Angelini

If we study the expected utility function, then we deal with a unified approach to an integrated formulation of decision theory in its two subjective components: utility and probability. We decompose the expected utility function inside an m-dimensional linear space after decomposing a contingent consumption plan viewed as a univariate random quantity. We propose a geometric condition of coherence compatible with all possible attitudes of a consumer in the face of risk. In particular, we consider a risk-neutral consumer and his coherent decisions under uncertainty. The right closed structure for dealing with utility and probability is a linear space in which we study coherent decisions under uncertainty whose goal is the maximization of the prevision of the utility associated with a contingent consumption bundle.


2019 · Vol 11 (3) · pp. 1
Author(s): Pierpaolo Angelini

We propose an original mathematical model, according to a Bayesian approach, explaining uncertainty from a point of view connected with vector spaces. A parameter space can be represented by means of random quantities by accepting the principles of the theory of concordance into the domain of subjective probability. We observe that the metric properties of the notion of α-product mathematically fulfill those of a coherent prevision of a bivariate random quantity. We introduce fundamental metric expressions connected with transformed random quantities representing changes of origin. We obtain a posterior probability law by applying Bayes' theorem in a geometric context connected with a two-dimensional parameter space.

