Probability Theory: Recently Published Documents

Total documents: 3024 (last five years: 460)
H-index: 56 (last five years: 5)

2022
Author(s): Ye Xiaoming, Ding Shijun, Liu Haibo

Abstract In traditional measurement theory, precision is defined as the dispersion of the measured value and serves as the basis for weight calculation when adjusting measurement data of different qualities; as a result, trueness is entirely ignored in the weight allocation. In this paper, following the pure concepts of probability theory, the measured (observed) value is treated as a constant, the error as a random variable, and the variance as the dispersion of all possible values of the unknown error. A rigorous formula for weight calculation and variance propagation is then derived, which resolves the theoretical difficulty of determining weights in the adjustment of multi-channel observation data of different qualities. The results show that the optimal weights are determined not only by the covariance matrix of the observation errors but also by the adjustment model.
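The weighting rule described here, with weights taken from the covariance matrix of the observation errors and propagated through the adjustment model, can be sketched as a generalized least-squares adjustment. This is an illustrative sketch, not the paper's own derivation; the two-channel example values are hypothetical:

```python
import numpy as np

def adjust(A, l, Sigma):
    """Generalized least-squares adjustment.

    A     : design matrix of the adjustment model (n x u)
    l     : vector of observations (n,)
    Sigma : covariance matrix of the observation errors (n x n)

    Returns the estimated parameters and their covariance,
    with the weight matrix taken as P = Sigma^{-1}.
    """
    P = np.linalg.inv(Sigma)            # weights from the error covariance
    N = A.T @ P @ A                     # normal-equation matrix
    x = np.linalg.solve(N, A.T @ P @ l)
    Qx = np.linalg.inv(N)               # variance propagation to the estimates
    return x, Qx

# Two channels measure the same quantity with different precision.
A = np.array([[1.0], [1.0]])
l = np.array([10.02, 9.90])
Sigma = np.diag([0.01**2, 0.05**2])     # channel 1 is far more precise
x, Qx = adjust(A, l, Sigma)
# The estimate is pulled toward the more precise channel.
```

Note how both the error covariance (through P) and the adjustment model (through A) enter the optimal estimate, matching the paper's conclusion.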


2022, pp. 335-362
Author(s): David M. Ruth

Lithosphere, 2022, Vol 2022 (Special 4)
Author(s): Jie Ren, Yuan Wang, Di Feng, Jiakun Gong

Abstract Deep saline aquifers are strongly heterogeneous under natural conditions, which affects the migration of carbon dioxide (CO2) injected into the reservoir. Characterizing the heterogeneity of the rock mass is therefore essential for studying CO2 migration during storage. A method is proposed for constructing heterogeneous models according to whether the available data are sufficient: a wholly heterogeneous model built from sufficient data; a deterministic multifacies heterogeneous model simplified by lithofacies classification; and a random multifacies heterogeneous model derived from a known formation using transition probability theory. Numerical simulations were carried out to study the migration of CO2 injected into these three heterogeneous models. The results show that CO2 migration in heterogeneous deep saline aquifers exhibits pronounced fingering flow and reflects the physical processes of CO2 storage. The migration behavior in the deterministic multifacies model is similar to that in the wholly heterogeneous model, indicating that simplifying the wholly heterogeneous structure into a lithofacies-classification structure is suitable for simulating the storage process. The random multifacies model based on transition probability theory agrees with the depositional development of sedimentary formations and can be used to evaluate CO2 migration in unknown heterogeneous formations. Comparison of the dry-out effect of CO2 across the heterogeneous models further shows that the multifacies characterization weakens this effect through local homogenization of the model in small-scale studies; refining the grid and subdividing the lithofacies of elements in local key areas is necessary to eliminate this error.
The results provide feasible references and suggestions for heterogeneous modeling of areas with missing data and for simplifying large-scale heterogeneous models.
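The random multifacies model built from transition probabilities can be illustrated with a minimal 1-D Markov-chain facies simulation. This is an illustrative sketch only; the facies names and transition matrix below are hypothetical, not taken from the paper:

```python
import numpy as np

# Hypothetical vertical transition probabilities between three lithofacies,
# as would be estimated from a known, well-sampled formation; rows sum to 1.
FACIES = ["sand", "silt", "clay"]
T = np.array([
    [0.80, 0.15, 0.05],   # sand -> sand, silt, clay
    [0.10, 0.70, 0.20],   # silt -> ...
    [0.05, 0.25, 0.70],   # clay -> ...
])

def simulate_column(T, n_cells, start=0, seed=0):
    """Generate a 1-D facies column as a first-order Markov chain."""
    rng = np.random.default_rng(seed)
    states = [start]
    for _ in range(n_cells - 1):
        states.append(rng.choice(len(T), p=T[states[-1]]))
    return np.array(states)

col = simulate_column(T, n_cells=200)
# Each realization is one "random multifacies" structure; an ensemble of such
# columns can populate permeability fields for a CO2 migration model.
```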


Dependability, 2021, Vol 21 (4), pp. 20-25
Author(s): B. P. Zelentsov

The exponential distribution of time to an event, or of the duration of a state, is popular in dependability theory. This distribution is characterized by a rate, a convenient parameter in mathematical models and calculations, and is widely used in the simulation of dependability-related processes. Examples are given to illustrate its applicability. Aim. To improve dependability simulation methods that use the exponential distribution of state durations or times to events. Methods. The assumption of exponentially distributed time between events can be justified or discarded using methods of probability theory and/or mathematical statistics, or on the basis of personal engineering experience. It has been experimentally established that the failure flow in an established mode of operation is stationary, ordinary, and without aftereffect. Such a flow is Poisson, and its distinguishing property is that the time between two consecutive failures is exponentially distributed with a constant rate. This exponential distribution is reasonably extended to the distribution of an item's failure-free time. In other cases, however, the use of the exponential distribution is often not duly substantiated. The methodological approach and the respective conclusions are case-based; a number of experience-based cases are given that show the non-applicability of the exponential distribution. Discussion. Cases are examined in which the applicability or non-applicability of the exponential distribution can be judged from personal experience or from probability theory. However, for events such as completion of recovery, duration of scheduled inspection, or duration of maintenance, such a judgement cannot be made in the absence of personal experience with those events;
the distribution of such durations must be established using statistical methods. The paper refers to the author's publications comparing the frequency of equipment inspections under regular and exponentially distributed periods: some indicators retain their calculated values, while others differ, and the unavailability values under the two ways of defining the inspection frequency differ two-fold. Findings and conclusions. The proposed improvements to the application of the exponential distribution in dependability simulation come down to the requirement that the use of an exponential distribution of time between events be clearly substantiated using methods of probability theory and mathematical statistics. An unknown random distribution cannot be replaced with an exponential one without valid substantiation, and replacing a random time in a subset of states with an exponentially distributed time of constant rate should be accompanied by an error calculation.
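A minimal statistical screen of the kind the paper calls for, checking whether observed durations are plausibly exponential, can be sketched as follows. This is an illustrative check, not the author's method: for an exponential distribution the coefficient of variation (std/mean) equals 1, so a sample CV far from 1 argues against the exponential assumption:

```python
import numpy as np

def exponentiality_check(samples):
    """Rough screen for the exponential assumption.

    Returns the sample mean and coefficient of variation (CV).
    An exponential distribution has CV = 1, so a sample CV far
    from 1 is evidence against replacing the unknown distribution
    with an exponential one.
    """
    samples = np.asarray(samples, dtype=float)
    mean = samples.mean()
    cv = samples.std(ddof=1) / mean
    return mean, cv

rng = np.random.default_rng(42)
t_exp = rng.exponential(scale=100.0, size=5000)         # times between failures
t_fixed = rng.normal(loc=100.0, scale=5.0, size=5000)   # near-regular maintenance durations

_, cv_exp = exponentiality_check(t_exp)      # close to 1: exponential is plausible
_, cv_fixed = exponentiality_check(t_fixed)  # far below 1: clearly not exponential
```

A formal goodness-of-fit test (e.g. Kolmogorov-Smirnov) would be the next step; the CV screen simply catches the near-deterministic durations (inspections, maintenance) for which the exponential assumption fails badly.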


Author(s): O. Demianchuk, A. Babarykina

Purpose. To improve approaches and methods for the calculation and design of energy-efficient sorting humps at railway stations, taking into account the mechanization of their braking positions based on modern car retarders, including energy-efficient designs. Methodology. The research was performed using the methods of hump-calculation theory combined with the tools of mathematical statistics and probability theory. The possible economic effect was assessed through technical and economic calculations using the criterion of reduced annual costs. Findings. The predicted entry speed of a good runner (a freely rolling car) at the braking positions is estimated when calculating their required capacity. An adaptive approach to calculating the required power of the first and second braking positions on the descent part of the sorting hump is proposed. The condition of technological reliability and "survivability" of the cut-speed control system during rolling from the hump is verified with allowance for probabilistic factors. The economic effect of reducing the required number of car retarders at the hump braking positions, and of improving the energy efficiency of sorting complexes, is determined. Practical value. The scientific and practical results obtained, which substantiate reduced energy consumption and increased energy efficiency of sorting stations, can be used in developing new projects and in surveying the parameters of existing sorting complexes, including non-mechanized humps.
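The probabilistic estimate of a good runner's entry speed at a braking position can be sketched with a simple Monte Carlo energy balance. This is a hypothetical illustration only; the formula, parameters, and resistance values are assumptions, not the paper's model:

```python
import numpy as np

G = 9.81  # gravitational acceleration, m/s^2

def entry_speed(v0, drop, run, w_specific, g=G):
    """Speed of a freely rolling car ("good runner") at the braking position.

    Energy balance per unit mass: kinetic energy at entry equals the release
    kinetic energy plus the potential energy of the drop minus the work of
    rolling resistance over the rolled distance.
    v0         : release speed at the hump crest, m/s
    drop       : height difference down to the braking position, m
    run        : rolled distance, m
    w_specific : specific rolling resistance (dimensionless, N per N of weight)
    """
    e = 0.5 * v0**2 + g * drop - g * w_specific * run
    return np.sqrt(np.maximum(e, 0.0) * 2.0)

# Monte Carlo over the scatter of rolling resistance between cars
# (illustrative values only).
rng = np.random.default_rng(1)
w = rng.normal(loc=0.0015, scale=0.0004, size=10000)   # about 1.5 N/kN
v = entry_speed(v0=1.5, drop=3.5, run=350.0, w_specific=w)
v_design = np.percentile(v, 95)   # size the braking position for the fast tail
```

Designing the retarder power for an upper percentile of the entry-speed distribution, rather than the mean, is one way probabilistic factors enter the required-capacity calculation.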


Author(s): L.A. Sladkova, P.A. Grigorev

A large number of domestic and foreign scientists have studied the wear of the surfaces of the excavating parts of earthmoving machines during their interaction with the ground, as well as the problem of soil adhesion to the surfaces of the working excavating parts and methods of combating it. The classical theory of digging is known to be based on empirical dependencies obtained experimentally; these largely reveal the essence of the digging process, but they differ from one another both in their descriptions and in their results. In this article, the authors propose theoretical studies, based on probability theory, of the digging process and of the interaction of the excavating part with the soil. The obtained results are confirmed by previous studies. The developed physical model of the interaction of soil particles with the surface of the excavating part during adhesion reveals the physical cause of the wear of the excavating-part surfaces.


2021
Author(s): Robin Manhaeve, Giuseppe Marra, Thomas Demeester, Sebastijan Dumančić, Angelika Kimmig, ...

There is a broad consensus that both learning and reasoning are essential to achieving true artificial intelligence. This has put the quest for neural-symbolic artificial intelligence (NeSy) high on the research agenda. In the past decade, neural networks have driven great advances in machine learning. Conversely, the two most prominent frameworks for reasoning are logic and probability. While in the past they were studied by separate communities, a significant number of researchers have been working towards their integration, cf. the area of statistical relational artificial intelligence (StarAI). Generally, NeSy systems integrate logic with neural networks. However, probability theory has already been integrated with both logic (cf. StarAI) and neural networks. It therefore makes sense to consider the integration of all three: logic, neural networks, and probability. In this chapter, we first consider these three base paradigms separately. Then, we look at the well-established integrations, NeSy and StarAI. Next, we consider the integration of all three paradigms as Neural Probabilistic Logic Programming, and exemplify it with the DeepProbLog framework. Finally, we discuss the limitations of the state of the art and consider future directions based on the parallels between StarAI and NeSy.
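The probabilistic-logic side of this integration can be illustrated with ProbLog-style exact inference by enumerating possible worlds (weighted model counting). This is a toy sketch of the semantics, not the DeepProbLog implementation; in DeepProbLog the probabilities of such facts would be produced by neural networks:

```python
from itertools import product

# ProbLog-style program: probabilistic facts plus a deterministic rule.
prob_facts = {"burglary": 0.1, "earthquake": 0.2}

def alarm(world):
    """alarm :- burglary.  alarm :- earthquake."""
    return world["burglary"] or world["earthquake"]

def query(prob_facts, rule):
    """Exact inference by enumerating possible worlds.

    Each world assigns true/false to every probabilistic fact; its weight
    is the product of the corresponding probabilities. The query probability
    is the total weight of the worlds in which the rule holds.
    """
    names = list(prob_facts)
    total = 0.0
    for values in product([True, False], repeat=len(names)):
        world = dict(zip(names, values))
        weight = 1.0
        for n in names:
            weight *= prob_facts[n] if world[n] else 1.0 - prob_facts[n]
        if rule(world):
            total += weight
    return total

p = query(prob_facts, alarm)   # P(alarm) = 1 - 0.9 * 0.8 = 0.28
```

Enumeration is exponential in the number of facts; practical systems such as ProbLog compile the program to a more compact representation, but the semantics is the one shown here.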

