Normal Probability Distribution
Recently Published Documents


TOTAL DOCUMENTS

53
(FIVE YEARS 21)

H-INDEX

6
(FIVE YEARS 2)

2021 ◽  
Vol 2021 ◽  
pp. 1-13
Author(s):  
Florin Pavel

This study focuses on the assessment of the correlation and variability of ground motion amplitudes recorded in the Bucharest area during Vrancea intermediate-depth earthquakes, based on a database of 119 pairs of horizontal components. Empirical models for evaluating peak ground velocity and displacement from spectral accelerations are proposed. The distribution of shear wave velocities from 41 boreholes at specific depths appears to follow a normal probability distribution. The analysis also shows that the variability of peak ground velocities and displacements does not appear to be influenced by earthquake magnitude. In addition, the variability of shear wave velocities at specific depths is smaller than the variability of the spectral amplitudes of the recorded ground motions. The empirical site-amplification factors in the Eurocode 8 draft fail to capture the long-period spectral amplifications observed in the Bucharest area during large-magnitude Vrancea intermediate-depth earthquakes.
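As a minimal sketch of the kind of distribution check the abstract describes, the snippet below fits a normal distribution to shear wave velocities at one depth and applies a Shapiro-Wilk test. The `vs_30m` values are synthetic placeholders, not the paper's borehole data.

```python
# Sketch: checking whether shear wave velocities at a given depth are
# plausibly normal, as the study reports for its 41 boreholes.
# The vs_30m values are synthetic placeholders, not the paper's data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
vs_30m = rng.normal(loc=350.0, scale=40.0, size=41)  # m/s, hypothetical

mu, sigma = stats.norm.fit(vs_30m)        # maximum-likelihood normal fit
w_stat, p_value = stats.shapiro(vs_30m)   # Shapiro-Wilk normality test

print(f"fitted mean = {mu:.1f} m/s, std = {sigma:.1f} m/s")
print(f"Shapiro-Wilk p = {p_value:.3f}")  # p > 0.05: no evidence against normality
```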


Author(s):  
Pavel Yu. Gubin ◽  
Vladislav P. Oboskalov

Currently, heuristic methods based on iterative changes to a set of feasible solutions provide a promising tool for scheduling generation equipment maintenance in power systems. However, the effectiveness of a heuristic method depends significantly on the initial set of possible schedules, that is, on the quality of the method's initialization. The widely used approach of building the initial array of solutions by pseudorandom uniform generation of control variables is only a palliative way to address the problem. This paper proposes an alternative initialization procedure, illustrated by generating-unit maintenance planning with the heuristic differential evolution method. The principle of the method is to obtain the initial set of solutions by using a normal probability distribution to generate pseudorandom deviations from a suboptimal maintenance schedule, which is formed in advance using a directed search method. This approach improves the probabilistic characteristics of the resulting maintenance schedule: in particular, it decreases the median value of the objective function and its coefficient of variation, and it maximizes the probability of obtaining a combination of unit outage moments that fully satisfies the operational constraints.
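A minimal sketch of the described initialization idea follows: normally distributed deviations around a baseline schedule seed the differential evolution population. The baseline schedule, horizon, and spread are illustrative assumptions, not values from the paper.

```python
# Sketch of the proposed initialization: perturb a suboptimal baseline
# schedule with normally distributed deviations to seed the differential
# evolution population. Baseline, sigma, and horizon are hypothetical.
import numpy as np

rng = np.random.default_rng(42)

baseline_start_weeks = np.array([3, 12, 20, 31, 40])  # hypothetical directed-search result
horizon_weeks = 52
population_size = 30
sigma_weeks = 2.0  # spread of deviations around the baseline

population = rng.normal(loc=baseline_start_weeks,
                        scale=sigma_weeks,
                        size=(population_size, baseline_start_weeks.size))
# Round to whole weeks and keep every outage start inside the horizon
population = np.clip(np.rint(population), 0, horizon_weeks - 1).astype(int)

print(population[:5])  # first five candidate schedules (unit outage start weeks)
```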


Author(s):  
O. Nejadseyfi ◽  
H. J. M. Geijselaers ◽  
E. H. Atzema ◽  
M. Abspoel ◽  
A. H. van den Boogaard

In this work, metamodel-based robust optimization is performed using the measured scatter of noise variables. Principal component analysis is used to describe the input noise in terms of linearly uncorrelated principal components. Some of these principal components follow a normal probability distribution; others, however, deviate from it, in which case a multimodal distribution is used for a more accurate description of the material scatter. An analytical method is implemented to propagate the noise distribution through the metamodel and to calculate the statistics of the response accurately and efficiently. The robust optimization criterion as well as the constraint evaluation are adjusted to deal properly with a multimodal response. Two problems, a basketball free throw in windy conditions and the forming of a B-pillar component, are presented to show the effectiveness of the proposed approach and to validate the method. The significance of accounting for the non-normal distribution of input variables using multimodal distributions is investigated, and the analytical calculation of response statistics and the adjustment of the robust optimization problem are presented and discussed.
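The sketch below illustrates the PCA part of this workflow: measured scatter is decorrelated into principal components, resampled, and pushed through a response surface. Note the paper propagates the distribution analytically and fits multimodal laws to non-normal components; plain Monte Carlo with normal components is used here only for brevity, and the scatter data and quadratic metamodel are synthetic stand-ins.

```python
# Sketch: decorrelating measured noise variables with PCA and propagating
# them through a metamodel. The paper computes response statistics
# analytically and handles non-normal components with multimodal fits;
# Monte Carlo with normal components is used here for simplicity.
import numpy as np

rng = np.random.default_rng(1)
scatter = rng.multivariate_normal(mean=[0.0, 0.0],
                                  cov=[[1.0, 0.6], [0.6, 0.5]],
                                  size=500)  # "measured" noise, hypothetical

# PCA via eigendecomposition of the covariance matrix
centered = scatter - scatter.mean(axis=0)
eigvals, eigvecs = np.linalg.eigh(np.cov(centered, rowvar=False))
components = centered @ eigvecs  # linearly uncorrelated principal components

def metamodel(x):
    """Hypothetical quadratic response surface standing in for the real one."""
    return 1.0 + 0.8 * x[:, 0] - 0.3 * x[:, 1] + 0.2 * x[:, 0] ** 2

# Resample each component independently (they are uncorrelated),
# map back to physical space, and evaluate the metamodel.
samples_pc = rng.normal(loc=components.mean(axis=0),
                        scale=components.std(axis=0),
                        size=(10_000, 2))
samples_phys = samples_pc @ eigvecs.T + scatter.mean(axis=0)
response = metamodel(samples_phys)
print(f"response mean = {response.mean():.3f}, std = {response.std():.3f}")
```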


Author(s):  
Wayne D. Cottrell

Aims: Observe driver compliance with daytime headlight requirements along two-lane highways in California and Arizona. Determine overall compliance rates, while identifying any statistical differences between highways. Study Design: Travel along highways with daytime headlight use requirements during daylight hours, recording ambient conditions and compliance. Distinguish between cars, large commercial trucks, and motorcycles, and between manual (low-beam) and automated (very low-beam) headlights. Add supportive information from synergistic research. Place and Duration of Study: California State Routes 4, 18, 74, 247, and U.S. Highway 95 in Arizona, during September 2010 and June and July 2015, over seven data collection days in summer and one on the first day of autumn. Methodology: Calculate average driver headlight compliance rates and deviations at a 95% level of confidence, assuming that compliance follows a normal probability distribution. Results: A total of 758 motor vehicles were observed. Excluding the 104 vehicles observed on a "cloudy" highway, 266 of the 654 drivers were using their headlights (40.7% ± 3.6% compliance). There was no difference between the proportions of compliant drivers on the six highways (95% level of confidence). A total of 66 of 104 drivers used their headlights under cloudy conditions (63.5% ± 9.6% compliance). A Facebook survey of 24 respondents found that 20% of drivers were unaware of daytime headlight zones (DHZs), and an additional 13% were deliberately noncompliant. Interviews of two California Highway Patrol officers revealed that citations for noncompliance were "not popular" among the officers, and that there was some skepticism as to the effectiveness of the requirement. Conclusions: Further observation is needed under cloudy skies to develop a more precise proportion of compliance. The low compliance suggests that the effectiveness of DHZs cannot be truly assessed. Compliance might be improved with enhanced driver education about DHZ existence and purpose, less reluctant enforcement, a revised headlight sign design, and more frequent signing.
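Margins like 40.7% ± 3.6% are consistent with a normal-approximation (Wald) confidence interval for a proportion, matching the abstract's normality assumption. The sketch below reproduces that arithmetic; it yields roughly ±3.8% for the non-cloudy sample, so the study's exact variance treatment may differ slightly.

```python
# Sketch: 95% normal-approximation (Wald) interval for a compliance
# proportion, the kind of calculation behind figures like 40.7% +/- 3.6%.
import math

def wald_interval(successes: int, n: int, z: float = 1.96) -> tuple[float, float]:
    """Point estimate and half-width of the normal-approximation CI."""
    p = successes / n
    half_width = z * math.sqrt(p * (1.0 - p) / n)
    return p, half_width

p, hw = wald_interval(266, 654)  # headlight users on non-cloudy highways
print(f"compliance = {p:.1%} +/- {hw:.1%}")  # ~40.7% +/- 3.8% with this formula

p, hw = wald_interval(66, 104)   # cloudy-highway sample
print(f"compliance = {p:.1%} +/- {hw:.1%}")  # ~63.5% +/- 9.3%
```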


Author(s):  
Matheus Sales Alves ◽  
Fernando José Araújo da Silva ◽  
André Luís Calado Araújo ◽  
Erlon Lopes Pereira

This paper assesses the reliability of waste stabilization ponds (WSP) and proposes an alternative approach to WSP design based on calculating a coefficient of reliability (COR) from an acceptable level of violation of discharge standards. Data were collected from 10 full-scale systems operating in Northeast Brazil. All systems receive predominantly domestic effluent and comprise one facultative pond followed by two maturation ponds in series. Different levels of restriction for effluent discharge were considered for the parameters BOD, COD, total suspended solids, ammonia, and thermotolerant coliforms. The log-normal probability distribution function (PDF) was able to represent the behavior of the effluent concentration data and therefore allowed the COR calculation. The COR was obtained from the coefficient of variation (CV) of the concentrations and the standardized normal variable associated with a 95% probability of non-exceedance. The observed dispersion of the results proved detrimental to the adoption of a single COR value for the evaluated parameters. In addition, the comparison between observed and design/operational concentrations for optimal performance showed that the 95% reliability scenario represents a less achievable target for WSP systems.
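A sketch of the calculation the abstract describes, assuming the classical lognormal coefficient-of-reliability formulation (COR from the CV and the standardized normal variable); the CV value and discharge standard below are illustrative, not the paper's data.

```python
# Sketch of the classical lognormal coefficient-of-reliability (COR)
# calculation: given the coefficient of variation of effluent
# concentrations and a target probability of non-exceedance, derive the
# mean design concentration. The CV and standard are illustrative.
import math
from statistics import NormalDist

def coefficient_of_reliability(cv: float, reliability: float = 0.95) -> float:
    """COR = sqrt(1 + CV^2) * exp(-Z * sqrt(ln(1 + CV^2))) for a lognormal effluent."""
    z = NormalDist().inv_cdf(reliability)  # standardized normal variable, ~1.645 at 95%
    v = math.log(1.0 + cv * cv)
    return math.sqrt(1.0 + cv * cv) * math.exp(-z * math.sqrt(v))

cor = coefficient_of_reliability(cv=0.40)  # hypothetical BOD coefficient of variation
design_mean = cor * 120.0                  # 120 mg/L: illustrative discharge standard
print(f"COR = {cor:.2f}, design mean concentration = {design_mean:.1f} mg/L")
```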


2021 ◽  
pp. 153-169
Author(s):  
Rehan Ahmad Khan Sherwani ◽  
Muhammad Aslam ◽  
Muhammad Ali Raza ◽  
Muhammad Farooq ◽  
Muhammad Abid ◽  
...  

2021 ◽  
Author(s):  
Eliel Alves Ferreira ◽  
João Vicente Zamperion

This study presents the concepts and methods of statistical analysis using Excel in a simple way, aiming for ease of understanding for both undergraduate and graduate students from different areas of knowledge. The Data Analysis Tools in Excel are used throughout, and the book includes many practical examples applying these tools, together with their interpretation, which is of paramount importance. The first chapter deals with introductory concepts: an introduction to Excel, the importance of statistics, and basic definitions, covering population and sample, types of data, and levels of measurement. The second chapter offers a detailed study of descriptive statistics: percentages, the construction of graphs, frequency distributions, measures of central tendency, and measures of dispersion. In the third chapter, notions of probability and the binomial and normal probability distributions are studied. The last chapter covers inferential statistics, starting with the confidence interval, proceeding through hypothesis tests (F, Z, and t tests), and ending with the statistical study of correlation between variables and simple linear regression. The statistical knowledge covered in this book can be useful not only for students but also for professionals who want to improve their knowledge of statistics using Excel.


2021 ◽  
Vol 10 (3) ◽  
pp. 12-20
Author(s):  
Martina Kuncová

The situation on the Czech electricity market is confusing for small customers and households every year. Although information on household and small-business electricity prices is freely available on the Internet (on the web pages of the Electricity Regulation Office), understanding the rules for calculating electricity costs is still not easy for ordinary small consumers. For small entrepreneurs, the question often arises whether tariffs intended for households can be used for their electricity consumption, or whether it is necessary or appropriate to switch to tariffs for small-business consumption. This article analyzes the offers of electricity suppliers in the Czech Republic for the year 2020 under the D25d distribution rate for households and the corresponding C25d rate for entrepreneurs, in order to assess differences in electricity costs and to identify the products and suppliers for which the annual cost of electricity consumption is minimal. Monte Carlo simulation, in which monthly electricity consumption is generated from a normal probability distribution, is used for the analysis together with the basics of multi-criteria decision making, especially the non-dominance testing principle. The results show that differences in annual electricity costs can be around 15% and that the tariff rates for households are cheaper than those for entrepreneurs (here, too, the difference in annual costs can be around 15-20%).
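A minimal sketch of the simulation setup the abstract describes: monthly consumption is drawn from a normal distribution and the annual cost is evaluated for competing tariffs. All prices and consumption parameters below are hypothetical, not the 2020 Czech supplier data used in the article.

```python
# Sketch of the Monte Carlo comparison described above: monthly electricity
# consumption is drawn from a normal distribution and the annual cost is
# evaluated for each tariff. All prices and parameters are hypothetical.
import numpy as np

rng = np.random.default_rng(7)
n_runs, months = 10_000, 12
consumption = rng.normal(loc=250.0, scale=40.0,
                         size=(n_runs, months)).clip(min=0.0)  # kWh per month

tariffs = {
    "household_D25d": {"fixed_czk_month": 120.0, "czk_per_kwh": 4.2},  # hypothetical
    "business_C25d":  {"fixed_czk_month": 150.0, "czk_per_kwh": 4.6},  # hypothetical
}

for name, t in tariffs.items():
    annual = months * t["fixed_czk_month"] + t["czk_per_kwh"] * consumption.sum(axis=1)
    print(f"{name}: mean annual cost = {annual.mean():,.0f} CZK "
          f"(5th-95th pct: {np.percentile(annual, 5):,.0f}-{np.percentile(annual, 95):,.0f})")
```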


Author(s):  
Antonina Ganicheva ◽  

The problem of estimating the number of summands required for a sum of random variables, or a sample mean, to follow a normal distribution law is investigated. The central limit theorem allows many complex applied problems to be solved using the well-developed mathematical apparatus of the normal probability distribution; otherwise, one would have to operate with convolutions of distributions, which can be calculated explicitly only in rare cases. The purpose of this paper is to estimate theoretically the number of terms needed in the central limit theorem for the sum or sample mean to have a normal probability distribution law to a given accuracy. The article proves two theorems and two corollaries, using the method of characteristic functions. The first theorem states the conditions under which the sample mean of independent terms has a normal distribution law to a given accuracy; its corollary establishes the corresponding normal distribution for the sum of independent random variables. The second theorem establishes normal distribution conditions for the sample mean of independent random variables whose mathematical expectations fall in one common interval and whose variances fall in another; its corollary again extends the result to the sum. From the relations proved in the first theorem, a table of the required number of terms in the central limit theorem is calculated to ensure a specified accuracy of approximation of the distribution of the sample mean to the normal law, and a graph of this dependence is constructed; the dependence is well approximated by a polynomial of the sixth degree. The relations and theorems obtained in the article are computationally simple and allow the testing process for evaluating students' knowledge to be controlled. They make it possible to determine the number of experts when making collective decisions in economics and organizational management systems, to conduct optimal sampling-based quality control of products, and to carry out the necessary number of observations and sound diagnostics in medicine.
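The paper answers this question analytically; as a purely empirical illustration of the same question, the sketch below measures how fast the distribution of a sample mean of non-normal summands approaches the normal law as the number of terms grows. The exponential summands and the Kolmogorov-Smirnov distance are illustrative choices, not the paper's method.

```python
# Empirical illustration of the question the paper treats analytically:
# how many terms n are needed before the sample mean of non-normal
# summands is close to a normal law. Closeness is measured here with the
# Kolmogorov-Smirnov distance; exponential summands are illustrative.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
for n in (5, 30, 100):
    means = rng.exponential(scale=1.0, size=(20_000, n)).mean(axis=1)
    # Compare against the normal law with the matching mean and variance
    d, _ = stats.kstest(means, "norm", args=(1.0, 1.0 / np.sqrt(n)))
    print(f"n = {n:3d}: KS distance to normal = {d:.4f}")
```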


2020 ◽  
Vol 1013 ◽  
pp. 114-119
Author(s):  
Azhar Badaoui

The aim of this paper is to evaluate concrete carbonation depth through a probabilistic analysis, focusing specifically on the effect of the randomness of marble powder diameters on reinforced concrete carbonation. Monte Carlo simulations are performed under the assumption that the marble powder diameter (Dmp) is a random variable with a log-normal probability distribution.
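A minimal sketch of that simulation setup: Dmp is sampled from a log-normal distribution and propagated through a carbonation depth model. The square-root-of-time form and the way the carbonation coefficient depends on Dmp are hypothetical placeholders, not the paper's model.

```python
# Sketch of the simulation setup described above: the marble powder
# diameter Dmp is sampled from a log-normal distribution and propagated
# through a carbonation model. The x = K * sqrt(t) form and the Dmp
# dependence of K are hypothetical placeholders, not the paper's model.
import numpy as np

rng = np.random.default_rng(11)
n_sim = 50_000
dmp_um = rng.lognormal(mean=np.log(10.0), sigma=0.35, size=n_sim)  # microns, hypothetical

t_years = 50.0
k_mm_per_sqrt_year = 3.0 * (dmp_um / 10.0) ** 0.25  # illustrative dependence on Dmp
depth_mm = k_mm_per_sqrt_year * np.sqrt(t_years)

print(f"carbonation depth at {t_years:.0f} y: "
      f"mean = {depth_mm.mean():.1f} mm, 95th pct = {np.percentile(depth_mm, 95):.1f} mm")
```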

