Aspects of Modern Systemic Approach (III): Implications of Random Processes in the Study of Dynamic Systems

Author(s):  
Bogdan-Vasile Cioruța

The purpose of this study is to familiarize the reader with the diversity of concepts (notions) and stages of development specific to Probability Theory, with the definition and characterization of random variables, vectors, and processes, and with the most important elements that characterize random processes. Among these we mention the distribution function, the probability density function, the statistical moments of a random process, the temporal averages, and the correlation (and cross-correlation) of a random signal (process). The implications of random processes in the study of dynamical systems are also reviewed, together with a series of applications specific to the analysis of dynamic behavior.
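A minimal sketch (not taken from the paper) of the quantities the abstract enumerates, computed for a simulated random signal; the AR(1) signal model, sample length, and lag range are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative assumption: a zero-mean AR(1) signal stands in for the "random process".
N, phi = 10_000, 0.8
x = np.empty(N)
x[0] = rng.normal()
for n in range(1, N):
    x[n] = phi * x[n - 1] + rng.normal()

# Statistical moments, estimated over the single realization (assuming ergodicity).
mean, var = x.mean(), x.var()
skewness = ((x - mean) ** 3).mean() / var ** 1.5

# Empirical distribution function and a histogram-based density estimate.
grid = np.linspace(x.min(), x.max(), 200)
ecdf = np.searchsorted(np.sort(x), grid, side="right") / N
pdf_hist, edges = np.histogram(x, bins=50, density=True)

# Autocorrelation computed as a temporal average, for a few lags.
def autocorr(sig, lag):
    return np.mean((sig[:-lag] - mean) * (sig[lag:] - mean)) / var if lag else 1.0

print([round(autocorr(x, k), 3) for k in range(6)])  # ≈ phi**k for an AR(1) process
```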

Sensors, 2021, Vol. 21 (11), pp. 3837
Author(s):
Rafael Orellana, Rodrigo Carvajal, Pedro Escárate, Juan C. Agüero

In control and monitoring of manufacturing processes, it is key to understand model uncertainty in order to achieve the required levels of consistency, quality, and economy, among others. In aerospace applications, models need to be very precise and able to describe the entire dynamics of an aircraft. In addition, the complexity of modern real systems has rendered deterministic models impractical, since they cannot adequately represent the behavior of disturbances in sensors and actuators, or tool and machine wear, to name a few. Thus, it is necessary to deal with model uncertainties in the dynamics of the plant by incorporating a stochastic behavior. These uncertainties can also affect the effectiveness of the fault diagnosis methodologies used to increase the safety and reliability of real-world systems. Determining suitable dynamic system models of real processes is essential to obtain effective process control strategies and accurate fault detection and diagnosis methodologies that deliver good performance. In this paper, a maximum likelihood estimation algorithm for uncertainty modeling in linear dynamic systems is developed utilizing a stochastic embedding approach. In this approach, system uncertainties are accounted for as a stochastic error term in a transfer function, and the error-model probability density function is modeled as a finite Gaussian mixture. For the estimation of the nominal model and the probability density function of the error-model parameters, we develop an iterative algorithm based on the Expectation-Maximization algorithm using data from independent experiments. The benefits of our proposal are illustrated via numerical simulations.
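A minimal sketch, not the authors' algorithm: a plain EM fit of a finite Gaussian mixture to scalar error samples, which is the kind of error-model density the abstract describes. The synthetic data, number of components, and initialization are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative assumption: scalar "error-model" samples drawn from a two-component mixture.
errors = np.concatenate([rng.normal(-1.0, 0.3, 400), rng.normal(0.8, 0.5, 600)])

K = 2
w = np.full(K, 1.0 / K)          # mixture weights
mu = rng.choice(errors, K)       # component means
var = np.full(K, errors.var())   # component variances

def gauss(x, m, v):
    return np.exp(-0.5 * (x - m) ** 2 / v) / np.sqrt(2 * np.pi * v)

for _ in range(200):
    # E-step: posterior responsibility of each component for each sample.
    resp = w * gauss(errors[:, None], mu, var)
    resp /= resp.sum(axis=1, keepdims=True)
    # M-step: re-estimate weights, means, and variances.
    Nk = resp.sum(axis=0)
    w = Nk / len(errors)
    mu = (resp * errors[:, None]).sum(axis=0) / Nk
    var = (resp * (errors[:, None] - mu) ** 2).sum(axis=0) / Nk

print(np.round(w, 2), np.round(mu, 2), np.round(np.sqrt(var), 2))
```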


2021, Vol. 48 (3), pp. 91-96
Author(s):
Shigeo Shioda

The consensus achieved in a consensus-forming algorithm is generally not a constant but rather a random variable, even if the initial opinions are the same. In the present paper, we investigate the statistical properties of the consensus in a broadcasting-based consensus-forming algorithm. We focus on two extreme cases: consensus forming by two agents and consensus forming by an infinite number of agents. In the two-agent case, we derive several properties of the distribution function of the consensus. In the infinite-number-of-agents case, we show that if the initial opinions follow a stable distribution, then the consensus also follows a stable distribution. In addition, we derive a closed-form expression for the probability density function of the consensus when the initial opinions follow a Gaussian distribution, a Cauchy distribution, or a Lévy distribution.
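A small numerical sketch of the stability property the abstract invokes. The broadcasting algorithm itself is not specified in the abstract, so as an assumption the consensus is modeled here as the plain average of the initial opinions; under that assumption, Cauchy initial opinions give a Cauchy-distributed consensus.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)

# Assumption for illustration only: the consensus is the plain average of the
# initial opinions (the broadcasting algorithm is not spelled out in the abstract).
n_agents, n_trials = 200, 5000
initial = rng.standard_cauchy((n_trials, n_agents))
consensus = initial.mean(axis=1)

# For a stable (Cauchy) law, the average of n i.i.d. samples has the same
# distribution as a single sample, so the consensus should again look Cauchy.
ks = stats.kstest(consensus, "cauchy")
print(f"KS statistic vs. standard Cauchy: {ks.statistic:.3f}, p-value: {ks.pvalue:.3f}")
```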


1979, Vol. 23 (03), pp. 188-197
Author(s):
Michel K. Ochi

This paper discusses the effect of statistical dependence of the maxima (peak values) of a stationary random process on the magnitude of the extreme values. A theoretical analysis of the extreme values of a stationary normal random process is made, assuming the maxima are subject to the Markov chain condition. For this, the probability distribution function of maxima, as well as the joint probability distribution function of two successive maxima of a normal process having an arbitrary spectral bandwidth, are applied to Epstein's theorem for evaluating the extreme values in a given sample under the Markov chain condition. A numerical evaluation of the extreme values is then carried out for a total of 14 random processes, including nine ocean wave records, with spectral bandwidth parameters ranging from 0.11 to 0.78. From the results of the computations, it is concluded that the Markov concept is applicable to the maxima of random processes whose spectral bandwidth parameter, ɛ, is less than 0.5; that the ratio of the extreme values with and without the Markov concept is nearly constant irrespective of the ɛ-value; and that the former is approximately 10 percent greater than the latter. It is also found that the sample size for which the extreme value reaches a certain level with the Markov concept is much less than that without the Markov concept. For example, the extreme value reaches a level of 4.0 (nondimensional value) in 1100 observations of the maxima with the Markov concept, while it reaches the same level in 3200 observations of the maxima without the Markov concept.
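A simulation sketch, not the paper's analysis (which is analytical and based on Epstein's theorem): it extracts the successive maxima of a simulated narrow-band Gaussian process and estimates the expected extreme value over a given number of observed maxima. The AR(2) process model, its parameters, and the replication count are illustrative assumptions.

```python
import numpy as np
from scipy.signal import lfilter

rng = np.random.default_rng(3)

def narrowband_gaussian(n, phi=0.95, w0=0.2):
    # Illustrative assumption: an AR(2) narrow-band stand-in for an ocean-wave record,
    # normalized to unit standard deviation so peak values are nondimensional.
    a = [1.0, -2 * phi * np.cos(w0), phi ** 2]
    x = lfilter([1.0], a, rng.normal(size=n))
    return x / x.std()

def successive_maxima(x):
    # Local peaks: samples higher than both neighbours.
    mid = x[1:-1]
    return mid[(mid > x[:-2]) & (mid > x[2:])]

n_maxima, extremes = 1100, []        # 1100 maxima, the sample size quoted in the abstract
for _ in range(200):
    peaks = successive_maxima(narrowband_gaussian(300_000))
    extremes.append(peaks[:n_maxima].max())

print(f"mean extreme of {n_maxima} maxima: {np.mean(extremes):.2f} (nondimensional)")
```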


Author(s):  
Robert J Marks II

In this chapter, we present applications of Fourier analysis to probability, random variables, and stochastic processes [1089, 1097, 1387, 1329]. A random variable, X, is the assignment of a number to the outcome of a random experiment. We can, for example, flip a coin and assign an outcome of heads as X = 1 and tails as X = 0. Often the number is equated to the numerical outcome of the experiment, such as the number of dots on the face of a rolled die or the measurement of a voltage in a noisy circuit. The cumulative distribution function is defined by F_X(x) = Pr[X ≤ x] (4.1), and the probability density function is its derivative, f_X(x) = dF_X(x)/dx. Our treatment of random variables focuses on the use of Fourier analysis. Because of this viewpoint, the development we use is unconventional and begins immediately in the next section with a discussion of the properties of the probability density function.
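A short sketch of the Fourier-analysis viewpoint the chapter adopts (my own illustration, not taken from the book): the characteristic function of a random variable is a Fourier transform of its probability density, so an FFT of a Gaussian density should match the known closed form. The grid and parameters are arbitrary choices.

```python
import numpy as np

# Gaussian density on a grid; mu, sigma, and the grid are illustrative choices.
mu, sigma = 0.0, 1.0
N = 2 ** 12
x = np.linspace(-20.0, 20.0, N, endpoint=False)
dx = x[1] - x[0]
f = np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2.0 * np.pi))

# Characteristic function phi(w) = E[exp(i w X)] = integral of f(x) exp(i w x) dx,
# i.e. a Fourier transform of the density, approximated here with the FFT.
w = 2.0 * np.pi * np.fft.fftfreq(N, d=dx)
phi_num = N * np.fft.ifft(f) * dx * np.exp(1j * w * x[0])

# Closed form for a Gaussian: exp(i w mu - (sigma w)^2 / 2).
phi_exact = np.exp(1j * w * mu - 0.5 * (sigma * w) ** 2)

print(np.abs(phi_num - phi_exact).max())  # should be numerically small
```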


2012, Vol. 226-228, pp. 1106-1110
Author(s):
Dong Qin, Xue Qin Zheng, Song Lin Wang

Based on an analysis of original monitoring data, the paper employs the forward and backward cloud algorithms to determine a safety-monitoring index for a concrete dam; the cloud model integrates the randomness and fuzziness of a qualitative concept into its numerical characteristics. From the monitoring data, these numerical characteristics can readily be carried through the "quantitative-qualitative-quantitative" transformation. The generated quantitative values form a cloud diagram in which each droplet represents a raw monitoring observation and also reflects the randomness and fuzziness of the monitored value. Safety-monitoring indexes for different significance levels can then be derived from the probability density function and the certainty (membership) function produced by the cloud algorithm. Practical application shows that the method is suitable and reliable.
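A minimal sketch of a forward normal cloud generator of the kind the paper builds on, in its standard formulation; the numerical characteristics Ex, En, He below are illustrative assumptions, and the backward algorithm and the dam's monitoring data are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(4)

def forward_normal_cloud(Ex, En, He, n_drops=2000):
    """Generate cloud drops (x_i, mu_i) from the numerical characteristics
    Ex (expectation), En (entropy), He (hyper-entropy)."""
    En_i = rng.normal(En, He, n_drops)                # per-drop entropy, carries the fuzziness
    x = rng.normal(Ex, np.abs(En_i))                  # drop positions, carry the randomness
    mu = np.exp(-(x - Ex) ** 2 / (2.0 * En_i ** 2))   # certainty degree of each drop
    return x, mu

# Illustrative numerical characteristics (not taken from the dam's monitoring data).
x, mu = forward_normal_cloud(Ex=10.0, En=1.5, He=0.2)
print(f"drops within [Ex - 3En, Ex + 3En]: {np.mean(np.abs(x - 10.0) <= 3 * 1.5):.2%}")
```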


1969, Vol. 6 (02), pp. 442-448
Author(s):
Lionel Weiss

Suppose Q_1^*, …, Q_n^* are independent, identically distributed random variables, each with probability density function f(x) and cumulative distribution function F(x), where F(1) − F(0) = 1, f(x) is continuous in the open interval (0, 1) and continuous on the right at x = 0 and on the left at x = 1, and there exists a positive C such that f(x) > C for all x in (0, 1). f(0) is defined as f(0+), and f(1) is defined as f(1−).


2006, pp. 55
Author(s):
Valerie Bérenger, Franck Celestini

The goal of this paper is to define a multidimensional poverty score for each household belonging to the same society, in order to answer the following question: is it possible to characterize poverty, as is done for income, by analyzing the nature of an associated probability density function? Based on the "Totally Fuzzy and Relative" (TFR) approach, the proposed method yields individual poverty scores lying between 0 and infinity. We apply the method to data from the 1986-1987 and 1993-1994 French Surveys of Living Conditions. In both cases, the probability density function of the poverty scores follows an exponential distribution characterized by a single parameter. We then examine the relationship between our multidimensional poverty score and income. The intuitive negative correlation is recovered and fully analyzed. In particular, using our poverty score distribution we estimate the poverty line that gives the best agreement between the income-based and the multidimensional measures of poverty.
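A rough sketch of the last step described in the abstract, under stated assumptions: the survey data are replaced by synthetic incomes and scores, the income-based poverty line is taken as 50% of median income (a common convention, not stated in the abstract), and the score-based poverty line is chosen to maximize agreement between the two classifications.

```python
import numpy as np

rng = np.random.default_rng(5)

# Synthetic stand-in data (the French survey data are not reproduced here):
# incomes and poverty scores with the negative correlation described in the paper.
n = 5000
income = rng.lognormal(mean=7.0, sigma=0.5, size=n)
score = rng.exponential(scale=1.0, size=n) * (income.mean() / income)

# An exponential law is fully described by one parameter, estimated from the mean.
lam = 1.0 / score.mean()
print(f"fitted exponential rate: {lam:.3f}")

# Income-based poor: below 50% of median income (assumed convention).
income_poor = income < 0.5 * np.median(income)

# Pick the score threshold whose classification agrees best with the income-based one.
candidates = np.quantile(score, np.linspace(0.5, 0.99, 50))
agreement = [np.mean((score > t) == income_poor) for t in candidates]
best = candidates[int(np.argmax(agreement))]
print(f"score-based poverty line: {best:.2f}, agreement: {max(agreement):.2%}")
```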

