The probabilistic solution of stochastic oscillators with even nonlinearity under Poisson excitation

Open Physics ◽  
2012 ◽  
Vol 10 (3) ◽  
Author(s):  
Siu-Siu Guo ◽  
Guo-Kang Er

Abstract The probabilistic solutions of nonlinear stochastic oscillators with even nonlinearity driven by Poisson white noise are investigated in this paper. The stationary probability density function (PDF) of the oscillator responses governed by the reduced Fokker-Planck-Kolmogorov equation is obtained with the exponential-polynomial closure (EPC) method. Different types of nonlinear oscillators are considered. Monte Carlo simulation is conducted to examine the effectiveness and accuracy of the EPC method in this case. It is found that the PDF solutions obtained with EPC agree well with those obtained with Monte Carlo simulation, especially in the tail regions of the PDFs of oscillator responses. Numerical analysis shows that the mean of the displacement is nonzero and the PDF of the displacement is nonsymmetric about its mean when the oscillator contains even nonlinearity in displacement. Numerical analysis further shows that the mean of the velocity always equals zero and the PDF of the velocity is symmetrically distributed about its mean.
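
As a hedged illustration of the setting (not the authors' EPC code): the toy Monte Carlo run below simulates an oscillator with a quadratic (even) nonlinearity under Poisson-arriving impulses and checks the qualitative findings above, namely a nonzero mean displacement and a near-zero mean velocity. All coefficients and noise parameters are illustrative assumptions, not values from the paper.

```python
# Minimal Monte Carlo sketch of an oscillator with even (quadratic) nonlinearity
# under Poisson white noise.  Parameter values are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
c, k, eps = 0.4, 1.0, 0.2        # damping, linear stiffness, even-nonlinearity coefficient (assumed)
lam, sigma = 20.0, 0.1           # Poisson impulse rate and impulse-magnitude std (assumed)
dt, n_steps, n_paths = 1e-3, 50_000, 1_000

x = np.zeros(n_paths)
v = np.zeros(n_paths)
sum_x = sum_v = 0.0
n_samples = 0
for step in range(n_steps):
    n_imp = rng.poisson(lam * dt, size=n_paths)                    # impulses in this step
    kick = sigma * np.sqrt(n_imp) * rng.standard_normal(n_paths)   # compound Poisson increment
    v += (-c * v - k * x - eps * x**2) * dt + kick
    x += v * dt
    if step >= n_steps // 2:        # discard the transient, then accumulate statistics
        sum_x += x.sum()
        sum_v += v.sum()
        n_samples += n_paths

print("mean displacement:", sum_x / n_samples)   # nonzero: the even nonlinearity skews the PDF
print("mean velocity:    ", sum_v / n_samples)   # approximately zero: symmetric PDF
```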

2013 ◽  
Vol 740-742 ◽  
pp. 393-396
Author(s):  
Maxim N. Lubov ◽  
Jörg Pezoldt ◽  
Yuri V. Trushin

The influence of attractive and repulsive impurities on the nucleation of SiC clusters on the Si(100) surface was investigated. Kinetic Monte Carlo simulations of SiC cluster growth show that increasing the impurity concentration (whether attractive or repulsive) decreases the mean cluster size and raises the nucleation density of the clusters.
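
The toy lattice Monte Carlo below is a rough illustration of the diffusion-and-capture picture behind such simulations, not a reproduction of the authors' kinetic Monte Carlo model or of the reported trends; the lattice size, coverages, and trapping factor are assumed values. Adatoms hop on a square lattice, attractive impurities suppress hops away from adjacent sites, and adatoms that touch another adatom become immobile nuclei.

```python
# Toy lattice Monte Carlo of adatom diffusion with immobile impurities
# (illustrative simplification only; all parameters are assumed).
import numpy as np
from scipy.ndimage import label

rng = np.random.default_rng(1)
L, n_adatoms, n_impurities, n_sweeps = 64, 200, 150, 2000
trap_factor = 0.2          # hop probability next to an attractive impurity (1.0 = no impurity effect)

occ = np.zeros((L, L), dtype=np.int8)          # 0 empty, 1 adatom, 2 impurity
sites = rng.choice(L * L, n_adatoms + n_impurities, replace=False)
rows, cols = np.unravel_index(sites, (L, L))
occ[rows[:n_impurities], cols[:n_impurities]] = 2
occ[rows[n_impurities:], cols[n_impurities:]] = 1
mobile = set(zip(rows[n_impurities:].tolist(), cols[n_impurities:].tolist()))

moves = [(1, 0), (-1, 0), (0, 1), (0, -1)]
for _ in range(n_sweeps):
    for pos in list(mobile):
        if pos not in mobile:                  # may have been captured earlier in this sweep
            continue
        r, c = pos
        dr, dc = moves[rng.integers(4)]
        nr, nc = r + dr, c + dc
        if not (0 <= nr < L and 0 <= nc < L) or occ[nr, nc] != 0:
            continue                           # blocked or off-lattice: stay put
        near_imp = any(0 <= r + a < L and 0 <= c + b < L and occ[r + a, c + b] == 2
                       for a, b in moves)
        if near_imp and rng.random() > trap_factor:
            continue                           # attractive impurity suppresses the hop
        occ[r, c], occ[nr, nc] = 0, 1
        mobile.discard(pos)
        neighbours = [((nr + a), (nc + b)) for a, b in moves
                      if 0 <= nr + a < L and 0 <= nc + b < L and occ[nr + a, nc + b] == 1]
        if neighbours:                         # touched another adatom: nucleate, all become immobile
            for nb in neighbours:
                mobile.discard(nb)
        else:
            mobile.add((nr, nc))

labels_arr, n_islands = label(occ == 1)        # connected adatom islands (lone adatoms included)
sizes = np.bincount(labels_arr.ravel())[1:]
print("islands:", n_islands, " mean island size:", sizes.mean())
```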


2020 ◽  
Vol 3 (3) ◽  
pp. 533
Author(s):  
Josua Guntur Putra ◽  
Jane Sekarsari

One of the keys to success in construction execution is timeliness. In practice, construction is often completed later than originally planned, which is caused by uncertainty in project scheduling. Deterministic scheduling methods use data from previous projects to determine work durations; however, not every project has the same work durations. The PERT method provides a probabilistic approach that can handle these uncertainties, but it does not account for the increase in duration caused by parallel activities. In 2017, the PERT method was extended into the M-PERT method. The purpose of this study is to compare the mean duration and standard deviation of the overall project obtained with the PERT and M-PERT methods, and to compare both against Monte Carlo simulation. The research method is to calculate the mean project duration with PERT, M-PERT, and Monte Carlo simulation, applied to a three-story building project. From the results of the study, the standard deviation obtained was 5.079 for the M-PERT method, 8.915 for the PERT method, and 5.25 for the Monte Carlo simulation. These results show that the M-PERT method provides results closer to the computer simulation than the PERT method does, and the smaller standard deviation indicates that the M-PERT method gives more accurate results.
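
The following sketch uses an invented two-path mini-network (not the paper's project data) to show the effect being discussed: classic PERT evaluates only the critical path, while Monte Carlo sampling of activity durations also captures the delay added where parallel paths merge, which is the effect M-PERT is designed to account for.

```python
# Invented mini-network: two parallel activity chains, each activity given as
# (optimistic, most likely, pessimistic) durations in days.
import numpy as np

rng = np.random.default_rng(2)
path_a = [(4, 6, 10), (3, 5, 9)]
path_b = [(5, 6, 8), (4, 5, 8)]

def pert_chain(acts):
    """Classic PERT mean and variance of a chain: sums of (o + 4m + p)/6 and ((p - o)/6)**2."""
    mean = sum((o + 4 * m + p) / 6 for o, m, p in acts)
    var = sum(((p - o) / 6) ** 2 for o, m, p in acts)
    return mean, var

mean_a, var_a = pert_chain(path_a)
mean_b, var_b = pert_chain(path_b)
if mean_a >= mean_b:                                # PERT reports only the critical path
    pert_mean, pert_std = mean_a, var_a ** 0.5
else:
    pert_mean, pert_std = mean_b, var_b ** 0.5

# Monte Carlo: sample every activity (triangular used as a stand-in for the PERT beta)
n_sim = 100_000
def sample_chain(acts):
    return sum(rng.triangular(o, m, p, n_sim) for o, m, p in acts)

duration = np.maximum(sample_chain(path_a), sample_chain(path_b))  # the project waits for both paths
print(f"PERT (critical path only): mean {pert_mean:.2f} days, std {pert_std:.2f}")
print(f"Monte Carlo (with merge) : mean {duration.mean():.2f} days, std {duration.std():.2f}")
```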


2020 ◽  
Vol 5 (4) ◽  
pp. 64
Author(s):  
Themis Matsoukas

We formulate the statistics of the discrete multicomponent fragmentation event using a methodology borrowed from statistical mechanics. We generate the ensemble of all feasible distributions that can be formed when a single integer multicomponent mass is broken into a fixed number of fragments and calculate the combinatorial multiplicity of all distributions in the set. We define random fragmentation by the condition that the probability of a distribution is proportional to its multiplicity, and obtain the partition function and the mean distribution in closed form. We then introduce a functional that biases the probability of a distribution so as to produce, in a systematic manner, fragment distributions that deviate to any arbitrary degree from the random case. We corroborate the results of the theory by Monte Carlo simulation, and demonstrate examples in which components in sieve cuts of the fragment distribution undergo preferential mixing or segregation relative to the parent particle.
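
A hedged single-component sketch of the random case (a simplification of the paper's multicomponent treatment, with mass, fragment number, and event count assumed): sampling every ordered composition of an integer mass with equal probability makes each size distribution appear with probability proportional to its multinomial multiplicity, and the resulting Monte Carlo mean distribution can be checked against the standard closed form for this single-component case.

```python
# Random fragmentation of an integer mass M into N fragments, with each size
# distribution weighted by its multiplicity (single-component illustration only).
import numpy as np
from math import comb

rng = np.random.default_rng(3)
M, N, n_events = 30, 4, 200_000          # parent mass, fragments per event, Monte Carlo events

counts = np.zeros(M + 1)
for _ in range(n_events):
    # N-1 distinct cut points among the M-1 internal gaps -> a uniform ordered composition
    cuts = np.sort(rng.choice(M - 1, N - 1, replace=False)) + 1
    parts = np.diff(np.concatenate(([0], cuts, [M])))
    np.add.at(counts, parts, 1)          # accumulate fragment-size counts (handles repeats)

mc_mean = counts / n_events              # Monte Carlo mean number of fragments of each size

# standard closed-form mean for the single-component random (multiplicity-weighted) case
exact = np.array([N * comb(M - k - 1, N - 2) / comb(M - 1, N - 1)
                  if 1 <= k <= M - N + 1 else 0.0
                  for k in range(M + 1)])

for k in range(1, 9):
    print(f"size {k}: Monte Carlo {mc_mean[k]:.3f}   closed form {exact[k]:.3f}")
```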


2013 ◽  
Vol 869-870 ◽  
pp. 581-592
Author(s):  
Mauro Arnesano ◽  
Antonio Paolo Carlucci ◽  
Giovanni D'Oria ◽  
Alessio Guadalupi ◽  
Domenico Laforgia

Energy planning based on Mean-Variance theory guides investors in investment decisions, trying to maximize the return and minimize the risk of the investment. However, this theory rests on strong hypotheses and, in addition, its input data are often affected by estimation errors. Moreover, the theory produces poorly diversified portfolios, increasing the return and the risk of the portfolio, and its outputs vary strongly when the inputs are varied. In the first part of the paper, Mean-Variance theory was applied to energy generation in Italy; in particular, the analysis considered the actual energy mix, but also assumed the use of nuclear technology and took into account plausible future improvements of the technologies. In the second part of the paper, a methodology was applied in order to limit the problems of Mean-Variance theory as applied to determining the energy mix. In particular, the input variables were calculated using Monte Carlo simulation, in order to reduce the estimation error, and the Resampled Efficiency™ technique was applied in order to calculate the resulting new “average” efficient frontier. This methodology was applied both without and with limits on the minimum and maximum percentage for every energy generation technology, in order to simulate constraints due, for example, to the technological characteristics of the plants, the availability of the sources, and possibly to regulations, territorial characteristics, and socio-political choices. The application of Mean-Variance theory yielded energy portfolios, alternative to the actual one, characterized by higher expected returns and lower risk. It was also shown that applying the Resampled Efficiency™ technique to data generated with the Monte Carlo simulation effectively tackles the problems of Mean-Variance theory; in this way, the decision maker is helped in making decisions on energy system policy and development. Thanks to this approach, applied in particular to the Italian energy context, it was also possible to evaluate the effectiveness of the modifications introduced to the actual Italian energy mix for achieving the 2020 European Energy Directive targets, in particular concerning the reduction of CO2 levels.
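
The sketch below illustrates the resampling idea in a heavily simplified form: the technologies, return and risk figures are invented, a crude random search over long-only weights stands in for a proper quadratic-programming frontier, and the averaging over resampled inputs is only in the spirit of the Resampled Efficiency™ technique, not its implementation.

```python
# Resampled mean-variance weights on an invented four-technology "energy portfolio".
import numpy as np

rng = np.random.default_rng(4)
techs = ["gas", "hydro", "wind", "solar"]                 # assumed technology set
mu_hat = np.array([0.07, 0.05, 0.06, 0.065])              # estimated "returns" (invented)
sigma = np.array([0.12, 0.06, 0.10, 0.11])                # estimated risks (invented)
corr = np.array([[1.0, 0.2, 0.1, 0.1],
                 [0.2, 1.0, 0.1, 0.1],
                 [0.1, 0.1, 1.0, 0.4],
                 [0.1, 0.1, 0.4, 1.0]])
cov_hat = corr * np.outer(sigma, sigma)
risk_aversion, n_resamples, n_obs, n_candidates = 4.0, 200, 60, 4000

def best_weights(mu, cov):
    """Crude stand-in optimizer: best of many random long-only weight vectors."""
    w = rng.dirichlet(np.ones(len(mu)), size=n_candidates)
    utility = w @ mu - risk_aversion * np.einsum("ij,jk,ik->i", w, cov, w)
    return w[np.argmax(utility)]

avg_w = np.zeros(len(techs))
for _ in range(n_resamples):
    # Monte Carlo perturbation of the inputs: re-estimate mu and cov from simulated data
    sample = rng.multivariate_normal(mu_hat, cov_hat, size=n_obs)
    avg_w += best_weights(sample.mean(axis=0), np.cov(sample, rowvar=False))
avg_w /= n_resamples

for name, w0, wr in zip(techs, best_weights(mu_hat, cov_hat), avg_w):
    print(f"{name:6s}  point-estimate weight {w0:.2f}   resampled average weight {wr:.2f}")
```

Averaging the optimal weights across resampled inputs typically spreads allocation over more technologies than the single point-estimate solution, which is the diversification effect the paper exploits.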


1998 ◽  
Vol 14 (1) ◽  
pp. 165-188 ◽  
Author(s):  
Yutaka Nakamura ◽  
Tsuneyoshi Nakamura

A direct procedure is presented for generating a response spectrum for an arbitrary nonexceedance probability from a prescribed design mean response spectrum. An amplification factor is derived to estimate the maximum response values of an MDOF system for a nonexceedance probability from the mean maximum ones. An efficient stiffness design method for a shear building is developed with the use of its fundamental frequency and translational eigenvector as parameters for adjusting the nonexceedance probability of the seismic drifts to the specified value. The validity and accuracy of the proposed method are demonstrated by a Monte Carlo simulation together with time-history analyses.
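
As a hedged illustration of the idea of an amplification factor relating a nonexceedance-probability response to a mean maximum response (this is not the authors' procedure; the oscillator parameters are assumed, and white noise is a crude stand-in for a real ground-motion ensemble), the following Monte Carlo run estimates the ratio of the 90% fractile to the mean of the peak response of a linear SDOF oscillator.

```python
# Monte Carlo estimate of a nonexceedance "amplification factor" for a linear
# SDOF oscillator under synthetic white-noise excitation (illustration only).
import numpy as np

rng = np.random.default_rng(5)
omega, zeta = 2 * np.pi, 0.05          # natural frequency (1 Hz) and damping ratio (assumed)
dt, duration, n_records, p = 0.01, 20.0, 2000, 0.9

n_steps = int(duration / dt)
u = np.zeros(n_records)
v = np.zeros(n_records)
peak = np.zeros(n_records)
for _ in range(n_steps):
    a_g = rng.standard_normal(n_records)            # synthetic ground acceleration (assumed model)
    acc = -2 * zeta * omega * v - omega**2 * u - a_g
    v += acc * dt                                   # semi-implicit Euler time stepping
    u += v * dt
    np.maximum(peak, np.abs(u), out=peak)           # track the maximum absolute response

mean_peak = peak.mean()
fractile = np.quantile(peak, p)
print(f"mean maximum response         : {mean_peak:.4f}")
print(f"{p:.0%} nonexceedance response  : {fractile:.4f}")
print(f"amplification factor          : {fractile / mean_peak:.3f}")
```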


2001 ◽  
Vol 38 (A) ◽  
pp. 176-187 ◽  
Author(s):  
Mark Bebbington ◽  
David S. Harte

The paper reviews the formulation of the linked stress release model for large scale seismicity together with aspects of its application. Using data from Taiwan for illustrative purposes, models can be selected and verified using tools that include Akaike's information criterion (AIC), numerical analysis, residual point processes and Monte Carlo simulation.
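
The snippet below illustrates only the AIC model-selection mechanic mentioned above on a generic point process; it does not implement the linked stress release model itself, and the simulated rate, time window, and candidate models are assumptions made for illustration.

```python
# Generic AIC comparison (AIC = 2k - 2 ln L) between a constant-rate and a
# log-linear-rate Poisson process fitted to simulated event times.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(6)
T = 100.0
true_rate = lambda t: 0.5 * np.exp(0.02 * t)        # assumed "true" increasing rate

# simulate events by thinning against an upper bound on the rate
lam_max = true_rate(T)
cand = np.sort(rng.uniform(0, T, rng.poisson(lam_max * T)))
events = cand[rng.uniform(0, lam_max, cand.size) < true_rate(cand)]

def neg_loglik(params, model):
    if model == "constant":                         # lambda(t) = exp(a)
        (a,) = params
        lam = np.full(events.size, np.exp(a))
        integral = np.exp(a) * T
    else:                                           # lambda(t) = exp(a + b t)
        a, b = params
        lam = np.exp(a + b * events)
        integral = np.exp(a) * T if abs(b) < 1e-12 else (np.exp(a + b * T) - np.exp(a)) / b
    return -(np.log(lam).sum() - integral)          # inhomogeneous Poisson log-likelihood

fit_const = minimize(neg_loglik, x0=[0.0], args=("constant",))
fit_trend = minimize(neg_loglik, x0=[0.0, 0.01], args=("trend",))
print(f"AIC, constant rate  : {2 * 1 + 2 * fit_const.fun:.1f}")
print(f"AIC, log-linear rate: {2 * 2 + 2 * fit_trend.fun:.1f}   (lower AIC is preferred)")
```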


1979 ◽  
Vol 34 (3) ◽  
pp. 253-267 ◽  
Author(s):  
Ranajit Chakraborty ◽  
Paul A. Fuerst

Summary Some sampling properties related to the mean and variance of the number of alleles and of single-locus heterozygosity are derived to study the effect of variation in the mutation rate of selectively neutral alleles. The correlation between single-locus heterozygosity and the number of alleles is also derived. Monte Carlo simulation is conducted to examine the effect of stepwise mutations. The relevance of these results to estimating the population parameter, 4Neν, is discussed in connection with the neutralist-selectionist controversy over the maintenance of genetic variability in natural populations.
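
A hedged Wright-Fisher Monte Carlo sketch of the quantities involved (not the authors' derivation; the population size and mutation rate are assumed, and the infinite-alleles model is used rather than the stepwise model examined in the paper): it tracks the number of alleles and single-locus heterozygosity, their correlation, and compares mean heterozygosity with the classical prediction θ/(1+θ) for θ = 4Nν.

```python
# Wright-Fisher simulation with infinite-alleles neutral mutation.
import numpy as np

rng = np.random.default_rng(7)
N, mu = 500, 5e-4                 # diploid population size and per-copy mutation rate (assumed)
two_n = 2 * N
theta = 4 * N * mu                # = 1.0 with these assumed values
burn_in, n_gen = 5_000, 20_000

pop = np.zeros(two_n, dtype=np.int64)       # allele label carried by each gene copy
next_label = 1
het, n_alleles = [], []
for gen in range(burn_in + n_gen):
    pop = rng.choice(pop, size=two_n, replace=True)                 # random reproduction
    mutants = rng.random(two_n) < mu
    n_mut = int(mutants.sum())
    pop[mutants] = np.arange(next_label, next_label + n_mut)        # every mutation is a new allele
    next_label += n_mut
    if gen >= burn_in and gen % 50 == 0:                            # thin to reduce autocorrelation
        _, counts = np.unique(pop, return_counts=True)
        p = counts / two_n
        het.append(1.0 - np.sum(p ** 2))
        n_alleles.append(len(counts))

print(f"theta = 4*N*mu              : {theta:.2f}")
print(f"mean heterozygosity         : {np.mean(het):.3f}  (prediction theta/(1+theta) = {theta/(1+theta):.3f})")
print(f"mean number of alleles      : {np.mean(n_alleles):.2f}")
print(f"corr(heterozygosity, alleles): {np.corrcoef(het, n_alleles)[0, 1]:.2f}")
```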


2005 ◽  
Vol 62 (5) ◽  
pp. 1529-1544 ◽  
Author(s):  
Ken-ichi Maruyama ◽  
Yasushi Fujiyoshi

Abstract A stochastic microphysical model of snow aggregation that combines a simple aggregation model with a Monte Carlo method was developed. Explicit treatment of the shape of individual snowflakes in the new model facilitates examination of the structure of snowflakes and of the relationships between the parameters of the generated snowflakes, such as mass versus diameter, in addition to comparisons with observations. In this study, complexities in the shape of snowflakes are successfully simulated, and the understanding of the evolution of their size distribution is advanced. The mean diameter of snow particles evolves more rapidly in the aggregate model than in the sphere model. However, growth rates of the aggregates depend greatly on the collision cross section used in aggregation. The mean mass of snowflakes in the aggregate model grows more slowly than the mass in the sphere model when the sum of the particle cross sections is used as the collision cross section. The mean mass grows more quickly when a circle is used whose radius is the sum of the radii of the two particles. Sensitivity experiments showed that aggregation also depends on the mean and standard deviation of the initial distribution, and on the density of the constituent particles.
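
The following Gillespie-style coagulation Monte Carlo is a hedged illustration of the cross-section comparison mentioned above: spheres with radius proportional to mass^(1/3) stand in for real snowflake geometry, kernel constants and the initial population are assumed, and the times are therefore only comparable between the two kernel choices, not to observations.

```python
# Coagulation Monte Carlo comparing two collision cross-section choices.
import numpy as np

rng = np.random.default_rng(8)

def run(kernel, n0=300, m0=1.0):
    """Coagulate until 10% of particles remain; return event times and mean masses."""
    m = np.full(n0, m0)
    t, times, means = 0.0, [0.0], [m0]
    while m.size > n0 // 10:
        r = m ** (1.0 / 3.0)                          # radius ~ mass^(1/3) (assumed)
        if kernel == "sum_of_sections":
            K = r[:, None] ** 2 + r[None, :] ** 2     # collision area = sum of cross sections
        else:
            K = (r[:, None] + r[None, :]) ** 2        # circle of radius r_i + r_j
        iu = np.triu_indices(m.size, k=1)
        rates = K[iu]
        total = rates.sum()
        t += rng.exponential(1.0 / total)             # waiting time to the next collision
        pick = rng.choice(rates.size, p=rates / total)
        i, j = iu[0][pick], iu[1][pick]
        m[i] += m[j]                                  # merge j into i
        m = np.delete(m, j)
        times.append(t)
        means.append(m.mean())
    return np.array(times), np.array(means)

for kern in ("sum_of_sections", "summed_radii_circle"):
    times, means = run(kern)
    print(f"{kern:20s}: mean mass doubles at t = {times[np.searchsorted(means, 2.0)]:.3f}")
```

Since (r_i + r_j)^2 is never smaller than r_i^2 + r_j^2, the summed-radii circle yields the faster mean-mass growth, consistent with the qualitative finding reported above.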


2017 ◽  
Vol 22 (2) ◽  
pp. 490-514 ◽  
Author(s):  
Michael T. Brannick ◽  
Sean M. Potter ◽  
Bryan Benitez ◽  
Scott B. Morris

We describe a new estimator (labeled Morris) for meta-analysis. The Morris estimator combines elements of both the Schmidt-Hunter and Hedges estimators. The new estimator is compared to (a) the Schmidt-Hunter estimator, (b) the Schmidt-Hunter estimator with variance correction for the number of studies (“k correction”), (c) the Hedges random-effects estimator, and (d) the Bonett unit-weights estimator in a Monte Carlo simulation. The simulation was designed to represent realistic conditions faced by researchers, including population random-effects distributions, numbers of studies, and skewed sample-size distributions. The simulation was used to evaluate the estimators with respect to bias, coverage of the 95% confidence interval of the mean, and root mean square error of estimates of the population mean. We also evaluated the quality of credibility intervals. Overall, the new estimator provides better coverage and slightly better credibility values than other commonly used methods. Thus, it has the advantages of both commonly used approaches without their apparent disadvantages. The new estimator can be implemented easily with existing software; the software used in the study is available online, and an example is included in the appendix in the Supplemental Material available online.
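
The sketch below is in the spirit of the comparison described above, but it is not the paper's simulation and does not reproduce the new Morris estimator: it contrasts only a Schmidt-Hunter-style (sample-size-weighted) mean correlation with a Hedges-style random-effects estimate (DerSimonian-Laird between-study variance on Fisher-z values), under assumed simulation settings.

```python
# Monte Carlo comparison of two common meta-analytic estimators of a mean correlation.
import numpy as np

rng = np.random.default_rng(9)
rho_mean, tau = 0.3, 0.1            # mean and SD of the population correlation distribution (assumed)
k_studies, n_reps = 20, 1000

def simulate_study(rho, n):
    """Sample correlation from a bivariate normal study of size n with true correlation rho."""
    x = rng.multivariate_normal([0, 0], [[1, rho], [rho, 1]], size=n)
    return np.corrcoef(x[:, 0], x[:, 1])[0, 1]

sh_est, he_est = [], []
for _ in range(n_reps):
    n_i = rng.integers(30, 200, size=k_studies)                       # study sample sizes (assumed)
    rho_i = np.clip(rng.normal(rho_mean, tau, k_studies), -0.99, 0.99)
    r_i = np.array([simulate_study(rho, n) for rho, n in zip(rho_i, n_i)])

    # Schmidt-Hunter-style bare-bones estimate: sample-size-weighted mean of r
    sh_est.append(np.average(r_i, weights=n_i))

    # Hedges-style random-effects estimate on Fisher-z with DerSimonian-Laird tau^2
    z_i = np.arctanh(r_i)
    v_i = 1.0 / (n_i - 3)
    w_fixed = 1.0 / v_i
    z_fixed = np.average(z_i, weights=w_fixed)
    q = np.sum(w_fixed * (z_i - z_fixed) ** 2)
    c = w_fixed.sum() - np.sum(w_fixed ** 2) / w_fixed.sum()
    tau2 = max(0.0, (q - (k_studies - 1)) / c)
    he_est.append(np.tanh(np.average(z_i, weights=1.0 / (v_i + tau2))))

sh, he = np.array(sh_est), np.array(he_est)
print(f"true mean correlation     : {rho_mean:.3f}")
print(f"Schmidt-Hunter-style mean : {sh.mean():.3f}  RMSE {np.sqrt(np.mean((sh - rho_mean) ** 2)):.3f}")
print(f"Hedges-style (DL) mean    : {he.mean():.3f}  RMSE {np.sqrt(np.mean((he - rho_mean) ** 2)):.3f}")
```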

