INFLUENCE OF THE VARIATION COEFFICIENT ON THE BRICK STRUCTURES RELIABILITY

Author(s):  
Ksenia Olegovna Dubrakova ◽  
Ekaterina Pakhomova ◽  
Viacheslav Aseev

The goal of the research was to study how the coefficient of variation obtained in statistical tests of building structures influences the probability of failure and the reliability of brick building structures. The article focuses on comparing theoretical values from probability theory with their practical application in calculating the behavior of building structures. The scientific novelty lies in comparing data calculated under the normal distribution with values measured in an experiment carried out on brick structures. As a result, a graph of the dependence of the reliability coefficient k on the brick grade was obtained, taking into account various coefficients of variation.
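Under the normal-distribution model used in such calculations, the probability of failure follows directly from the mean resistance and the coefficient of variation. A minimal sketch in Python; the mean resistance, load and coefficient values below are illustrative assumptions, not data from the experiment:

```python
import math

def failure_probability(mean_resistance, cov, load):
    """Probability that a normally distributed resistance falls below the load.

    mean_resistance and load share units (e.g. MPa); cov is the
    coefficient of variation of the resistance.
    """
    sigma = cov * mean_resistance
    z = (load - mean_resistance) / sigma
    # Standard normal CDF expressed through the error function
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# Hypothetical example: mean resistance 10 MPa, applied load 6 MPa
for cov in (0.10, 0.15, 0.20):
    print(f"cov={cov:.2f}  P_f={failure_probability(10.0, cov, 6.0):.2e}")
```

The loop shows the effect the abstract describes: the failure probability grows rapidly with the coefficient of variation even though the mean resistance is unchanged.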

Sensors ◽  
2020 ◽  
Vol 21 (1) ◽  
pp. 31
Author(s):  
Mariusz Specht

Positioning systems are used to determine position coordinates in navigation (air, land and marine). The accuracy of an object’s position is described by the position error, and a statistical analysis can determine its measures, which usually include: Root Mean Square (RMS), twice the Distance Root Mean Square (2DRMS), Circular Error Probable (CEP) and Spherical Probable Error (SEP). It is commonly assumed in navigation that position errors are random and that their distributions are consistent with the normal distribution. This assumption is based on the popularity of the Gauss distribution in science, the simplicity of calculating RMS values for 68% and 95% probabilities, as well as the intuitive perception of randomness in the statistics which this distribution reflects. It should be noted, however, that the necessary conditions for a random variable to be normally distributed include the independence of measurements and identical conditions of their realisation, which is not the case in the iterative method of determining successive positions, the filtration of coordinates or the dependence of the position error on meteorological conditions. In the preface to this publication, examples are provided which indicate that position errors in some navigation systems may not be consistent with the normal distribution. The subsequent section describes basic statistical tests for assessing the fit between the empirical and theoretical distributions (Anderson-Darling, chi-square and Kolmogorov-Smirnov). Next, statistical tests of the position error distributions of very long Differential Global Positioning System (DGPS) and European Geostationary Navigation Overlay Service (EGNOS) campaigns from different years (2006 and 2014) were performed, with the number of measurements per session being 900’000 fixes. In addition, the paper discusses selected statistical distributions that fit the empirical measurement results better than the normal distribution.
Research has shown that the normal distribution is not the optimal statistical distribution for describing the position errors of navigation systems. Distributions that describe navigation positioning system errors more accurately include the beta, gamma, logistic and lognormal distributions.
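The Kolmogorov-Smirnov test mentioned above can be sketched in a few lines of Python. The synthetic error samples below stand in for the DGPS/EGNOS measurements, which are not reproduced here, and the distribution parameters are assumptions chosen only to contrast a normal with a skewed (lognormal-like) error population:

```python
import math
import random
import statistics

def normal_cdf(x, mu, sigma):
    """CDF of N(mu, sigma) via the error function."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

def ks_statistic(sample, mu, sigma):
    """One-sample Kolmogorov-Smirnov distance between the sample and N(mu, sigma)."""
    xs = sorted(sample)
    n = len(xs)
    d = 0.0
    for i, x in enumerate(xs):
        f = normal_cdf(x, mu, sigma)
        d = max(d, abs(f - i / n), abs((i + 1) / n - f))
    return d

random.seed(1)
gauss_errors = [random.gauss(0.0, 1.0) for _ in range(2000)]              # normal errors
lognorm_errors = [math.exp(random.gauss(0.0, 0.5)) for _ in range(2000)]  # skewed errors

d_gauss = ks_statistic(gauss_errors,
                       statistics.mean(gauss_errors), statistics.pstdev(gauss_errors))
d_lognorm = ks_statistic(lognorm_errors,
                         statistics.mean(lognorm_errors), statistics.pstdev(lognorm_errors))
print(f"D(gauss)={d_gauss:.3f}  D(lognormal)={d_lognorm:.3f}")
```

A larger D signals a worse fit to the fitted normal model; the skewed sample produces a visibly larger statistic, mirroring the paper's finding that skewed families fit position errors better than the normal distribution.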


Minerals ◽  
2021 ◽  
Vol 11 (5) ◽  
pp. 465
Author(s):  
Cezary Polakowski ◽  
Magdalena Ryżak ◽  
Agata Sochan ◽  
Michał Beczek ◽  
Rafał Mazur ◽  
...  

Particle size distribution is an important soil parameter; therefore, precise measurement of this characteristic is essential. The application of the widely used laser diffraction method for soil analysis continues to be a subject of debate. The precision of this method, proven on homogeneous samples, has been implicitly extended to soil analyses, but this has not been sufficiently well confirmed in the literature thus far. The aim of this study is to supplement the information available on the precision of the method in terms of the reproducibility of soil measurement and whether that reproducibility is characterized by a normal distribution. To estimate the reproducibility of the laser diffraction method, thirteen various soil samples were characterized and the results were analysed statistically. The acquired coefficient of variation was lowest for silt (3.44%) and highest for sand (23.28%). Five of the thirteen tested samples were characterized by a normal distribution. The fraction content of the other eight samples was not normally distributed, but the extent of this phenomenon varied between soils. Although the laser diffraction method is repeatable, the measurement of soil particle size distribution can have limited reproducibility. The main cause seems to be small amounts of sand particles. The error can be amplified by the construction of the dispersion unit. Non-parametric statistical tests should be used by default for soil laser diffraction analyses.
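The reported coefficients of variation can be reproduced with a short helper. The replicate fraction contents below are hypothetical, chosen only to mirror the silt-versus-sand contrast described in the abstract:

```python
import statistics

def coefficient_of_variation(values):
    """CV in percent: sample standard deviation relative to the mean."""
    return 100.0 * statistics.stdev(values) / statistics.mean(values)

# Hypothetical replicate fraction contents (%) for one soil sample
silt_replicates = [41.2, 40.8, 41.5, 39.9, 41.0]
sand_replicates = [8.1, 10.4, 6.9, 9.8, 12.3]

print(f"silt CV = {coefficient_of_variation(silt_replicates):.2f}%")
print(f"sand CV = {coefficient_of_variation(sand_replicates):.2f}%")
```

Because the sand fraction is small, the same absolute scatter translates into a much larger relative (CV) error, which is the reproducibility problem the study highlights.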


2021 ◽  
Vol 10 (8) ◽  
pp. e51310817726
Author(s):  
Isabelle Vital Ortiz ◽  
Paula Vanessa Pedron Oltramari ◽  
Graziela Hernandes Volpato ◽  
Thais Maria Freire Fernandes Poleti ◽  
Victor de Miranda Ladewig ◽  
...  

The present research aimed to assess how occlusal contacts change during the first 6 months of orthodontic treatment with fixed appliances and clear aligners. A sample of 40 patients was divided into 2 groups: Clear Aligners (CA) and Fixed Appliance (FA). To register occlusal contacts, patients were positioned and instructed on how to bite in habitual maximum intercuspation. Registrations were performed monthly during the first 6 months of treatment and recorded in an occlusogram. A parametric test was applied to evaluate the data, since they presented a normal distribution according to the Shapiro-Wilk test. For inter- and intragroup data analysis, an ANOVA test was performed at a 5% significance level. Statistical tests were executed in the Jamovi software (Jamovi Stats, Version 1.2, Sydney, Australia). There was a reduction in the number of occlusal contacts for individuals in both the CA and FA groups. The reduction was more significant in the first 3 months for the FA group and between the 3rd and 4th months for the CA group. Therefore, the type of orthodontic appliance had no significant influence on occlusal contacts.
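The ANOVA step can be sketched as a plain one-way F statistic. The occlusal-contact counts below are invented for illustration and do not come from the study sample:

```python
import statistics

def one_way_anova_f(*groups):
    """F statistic for a one-way ANOVA across k independent groups."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand_mean = statistics.mean([x for g in groups for x in g])
    # Between-group and within-group sums of squares
    ss_between = sum(len(g) * (statistics.mean(g) - grand_mean) ** 2 for g in groups)
    ss_within = sum((x - statistics.mean(g)) ** 2 for g in groups for x in g)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Hypothetical occlusal contact counts at one time point, CA vs FA group
ca_contacts = [30, 28, 29]
fa_contacts = [22, 21, 23]
print(f"F = {one_way_anova_f(ca_contacts, fa_contacts):.2f}")
```

In practice the F value would be compared against the F distribution at the 5% significance level, as the study does in Jamovi.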


1990 ◽  
Vol 24 (1) ◽  
pp. 1-4 ◽  
Author(s):  
R. Sanz Sampelayo ◽  
J. Fonolla ◽  
F. Gil Extremera

A study was carried out to examine the distribution of individual weights in Helix aspersa snails, the aims being to establish the best estimate of the ponderal growth and also to obtain a model growth curve. Four groups of 20 snails from the same clutch were analysed and kept under experimental conditions from birth up to 6 months. The variability of their individual weights within groups was studied by calculating the coefficients of variation every 15 days. At the same time, the assumed normal distribution of those weights was being tested. The coefficients of variation increased with age and the assumed normal distribution of individual weights had to be rejected. By means of a log transformation of the original data, a model growth curve was constructed, and was used to assess the possibility of estimating age from weight. We finally reached the conclusion that median weight, rather than the mean, would be a better measure of central tendency to use until it is possible to obtain selected populations. The difficulty of estimating age from weight is emphasized.
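The effect of the log transformation on skewed weight data can be illustrated with simulated lognormal weights; the parameters are assumptions for illustration, not values fitted to the Helix aspersa data:

```python
import math
import random
import statistics

random.seed(7)
# Hypothetical right-skewed snail weights (g): lognormal on the raw scale
weights = [math.exp(random.gauss(0.0, 0.6)) for _ in range(500)]

mean_w = statistics.mean(weights)
median_w = statistics.median(weights)
print(f"raw scale: mean={mean_w:.3f}  median={median_w:.3f}")  # mean pulled up by the tail

# After a log transform the data are roughly symmetric, so mean and median agree
log_weights = [math.log(w) for w in weights]
print(f"log scale: mean={statistics.mean(log_weights):.3f}  "
      f"median={statistics.median(log_weights):.3f}")
</n```

The gap between mean and median on the raw scale, and its disappearance after the transform, is the pattern behind the study's recommendation to report the median weight.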


2018 ◽  
Author(s):  
Daniel Mortlock

Mathematics is the language of quantitative science, and probability and statistics are the extension of classical logic to real world data analysis and experimental design. The basics of mathematical functions and probability theory are summarized here, providing the tools for statistical modeling and assessment of experimental results. There is a focus on the Bayesian approach to such problems (i.e., Bayesian data analysis); therefore, the basic laws of probability are stated, along with several standard probability distributions (e.g., binomial, Poisson, Gaussian). A number of standard classical tests (e.g., p values, the t-test) are also defined and, to the degree possible, linked to the underlying principles of probability theory. This review contains 5 figures, 1 table, and 15 references. Keywords: Bayesian data analysis, mathematical models, power analysis, probability, p values, statistical tests, statistics, survey design
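As a minimal illustration of the Bayesian approach the review describes, the conjugate Beta-Binomial update yields a posterior in closed form; the flat prior and the 7-of-10 data below are illustrative choices:

```python
def beta_binomial_update(a, b, k, n):
    """Posterior Beta(a', b') after observing k successes in n binomial trials,
    starting from a Beta(a, b) prior on the success probability."""
    return a + k, b + (n - k)

# Flat prior Beta(1, 1); hypothetical data: 7 successes in 10 trials
a_post, b_post = beta_binomial_update(1, 1, 7, 10)
posterior_mean = a_post / (a_post + b_post)
print(a_post, b_post, round(posterior_mean, 3))
```

The posterior mean (8/12 ≈ 0.667) sits between the prior mean (0.5) and the observed frequency (0.7), which is the basic shrinkage behavior of Bayesian estimates.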


1988 ◽  
Vol 71 (1) ◽  
pp. 41-43
Author(s):  
Octave J Francis ◽  
George M Ware ◽  
Allen S Carman ◽  
Gary P Kirschenheuter ◽  
Shia S Kuan

Abstract Data were gathered, during a study on the development of an automated system for the extraction, cleanup, and quantitation of mycotoxins in corn, to determine if it was scientifically sound to reduce the analytical sample size. Five, 10, and 25 g test portions were analyzed and statistically compared with 50 g test portions of the same composites for aflatoxin concentration variance. Statistical tests used to determine whether the 10 and 50 g sample sizes differed significantly showed a satisfactory observed variance ratio (Fobs) of 2.03 for computations of pooled standard deviations; paired t-test values of 0.952, 1.43, and 0.224 were computed for each of the 3 study samples. The results meet acceptable limits, since each sample’s t-test result is less than the published value of |t|, which is 1.6909 for the test conditions. The null hypothesis is retained since the sample sizes do not give significantly different values for the mean analyte concentration. The percent coefficients of variation (CVs) for all samples tested were within the expected range. In addition, the variance due to sample mixing was evaluated using radioisotope-labeled materials, yielding an acceptable CV of 22.2%. The variance due to the assay procedure was also evaluated and showed an aflatoxin B1 recovery of 78.9% and a CV of 11.4%. Results support the original premise that a sufficiently ground and blended sample would produce an analyte variance for a 10 g sample that was statistically comparable with that for a 50 g sample.
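The paired t computation behind the comparison can be sketched as follows; the aflatoxin concentrations are hypothetical stand-ins for paired 10 g and 50 g test portions, not the study's data:

```python
import math
import statistics

def paired_t(x, y):
    """Paired t statistic for two equal-length measurement series."""
    d = [a - b for a, b in zip(x, y)]
    return statistics.mean(d) / (statistics.stdev(d) / math.sqrt(len(d)))

# Hypothetical aflatoxin concentrations (ng/g) from paired 10 g and 50 g portions
small_portion = [20.1, 19.8, 20.5, 19.9, 20.3]
large_portion = [20.0, 19.9, 20.2, 20.1, 20.0]
t = paired_t(small_portion, large_portion)
print(f"|t| = {abs(t):.3f} vs critical value 1.6909")
```

As in the study, a |t| below the critical value means the two sample sizes give statistically indistinguishable mean concentrations, so the null hypothesis is retained.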


1973 ◽  
Vol 53 (2) ◽  
pp. 177-183 ◽  
Author(s):  
W. STANEK

pH values were measured on peat samples taken from a water-logged peatland in Ontario, from April 1970 to April 1971, by 14 procedures: on fresh peat and groundwater, in their natural state; and on combinations of hand-squeezed, air-dried, and oven-dried peat, each rewetted to liquid limit with either distilled H2O, N/100 CaCl2∙2H2O, N/10 KCl, or N/10 CaCl2∙2H2O. Groundwater showed the highest mean pH (4.0), followed by hand-squeezed peat rewetted with distilled H2O (3.8), then fresh peat (3.6). In comparison with fresh peat, air and oven drying lowered the mean pH value by 0.1 and 0.2 units, rewetting with N/100 CaCl2∙2H2O, by 0.4; N/10 KCl, by 0.5; and N/10 CaCl2∙2H2O, by 0.6 units approximately. The coefficients of variation and the confidence limits showed, for practical application, that all methods were equally reliable and that pH determined at any time of the year validly characterized a site.
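The mean, coefficient of variation and confidence limits reported for each method can be computed as below; the pH readings and the t critical value (two-sided 95%, 9 df) are illustrative assumptions:

```python
import math
import statistics

def mean_cv_ci(values, t_crit=2.262):
    """Mean, CV (%) and a 95% confidence interval for the mean.

    t_crit = 2.262 is the two-sided 95% Student t value for 9 df,
    matching the 10 readings used below.
    """
    m = statistics.mean(values)
    s = statistics.stdev(values)
    half = t_crit * s / math.sqrt(len(values))
    return m, 100.0 * s / m, (m - half, m + half)

# Hypothetical fresh-peat pH readings spread over the year
ph_readings = [3.5, 3.6, 3.7, 3.6, 3.5, 3.7, 3.6, 3.6, 3.5, 3.7]
mean_ph, cv_ph, ci_ph = mean_cv_ci(ph_readings)
print(f"mean pH {mean_ph:.1f}, CV {cv_ph:.1f}%, 95% CI {ci_ph[0]:.2f}-{ci_ph[1]:.2f}")
```

A narrow confidence interval like this one is what supports the study's conclusion that a pH determined at any time of year validly characterizes the site.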


2019 ◽  
Vol 29 (8) ◽  
pp. 2179-2197
Author(s):  
Hua He ◽  
Wan Tang ◽  
Tanika Kelly ◽  
Shengxu Li ◽  
Jiang He

Measures of substance concentration in urine, serum or other biological matrices often have an assay limit of detection. When concentration levels fall below the limit, the exact measures cannot be obtained. Instead, the measures are censored, as only the partial information that the levels are under the limit is known. Assuming the concentration levels are from a single population with a normal distribution, or follow a normal distribution after some transformation, Tobit regression models, or censored normal regression models, are the standard approach for analyzing such data. However, in practice, it is often the case that the data exhibit more censored observations than would be expected under the Tobit regression models. One common cause is the heterogeneity of the study population, caused by the existence of a latent group of subjects who lack the substance measured. For such subjects, the measurements will always be under the limit. If a censored normal regression model is appropriate for modeling the subjects with the substance, the whole population follows a mixture of a censored normal regression model and a degenerate distribution for the latent class. While there are some studies on such mixture models, a fundamental question about testing whether such mixture modeling is necessary, i.e. whether such a latent class exists, has not been studied yet. In this paper, three tests, the Wald test, the likelihood ratio test and the score test, are developed for testing the existence of such a latent class. Simulation studies are conducted to evaluate the performance of the tests, and two real data examples are employed to illustrate the tests.
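The excess censoring caused by a latent substance-free class can be demonstrated by simulation; the limit of detection, mixture proportion and normal parameters below are assumed purely for illustration:

```python
import math
import random

def normal_cdf(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

random.seed(3)
LOD = 1.0                             # assay limit of detection (assumed)
mu, sigma, p_latent = 2.0, 1.0, 0.3   # assumed normal parameters and latent-class share

sample = []
for _ in range(10_000):
    if random.random() < p_latent:
        sample.append(None)                        # substance-free: always censored
    else:
        v = random.gauss(mu, sigma)
        sample.append(v if v >= LOD else None)     # censored below the LOD

observed_censored = sum(v is None for v in sample) / len(sample)
tobit_expected = normal_cdf((LOD - mu) / sigma)    # censoring rate under a pure Tobit model
print(f"observed {observed_censored:.3f} vs Tobit prediction {tobit_expected:.3f}")
```

The observed censoring rate far exceeds the Tobit prediction, which is exactly the discrepancy the paper's Wald, likelihood ratio and score tests are designed to detect.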


2018 ◽  
Vol 931 ◽  
pp. 1249-1254 ◽  
Author(s):  
Vadim N. Kabanov

The article proposes solutions to topical issues that arise in the construction of buildings and structures, among them: ensuring high reliability in calculating the duration of work in construction projects, and determining the minimum boundary of the actual intensity of work in the construction of building structures. Objective: to offer a simple option for assessing the reliability of the construction process. Within the framework of the research, the following tasks were solved: construction processes were classified according to their degree of mechanization, and a simple method of probability theory was proposed to determine the reliability value. Methods of forming the general set of initial values suggest using observation and timing. Processing of the statistical values was performed on a model of the accumulated probability curve. As a result of the research, construction processes are considered as fully mechanized, non-mechanized (manual), and processes in which machines and workers are jointly occupied. A practical example is given of obtaining productivity values from a given reliability value on the curve of accumulated probabilities. The conclusion is made about the advantages of the described approach, which consist in the simplicity and low laboriousness of constructing the model and performing the calculations.
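Reading a productivity value at a given reliability off the accumulated-probability curve amounts to an empirical-quantile lookup; below is a sketch with hypothetical timing observations (the shift outputs are invented for illustration):

```python
import math

def productivity_at_reliability(observations, reliability):
    """Read the accumulated-probability (empirical CDF) curve.

    Returns an observed intensity that is met or exceeded by at least
    a `reliability` fraction of the observations.
    """
    xs = sorted(observations)
    n = len(xs)
    idx = max(0, n - math.ceil(reliability * n))
    return xs[idx]

# Hypothetical timing observations: output per shift for one process
output_per_shift = [12, 15, 14, 13, 16, 14, 15, 13, 12, 17]
print(productivity_at_reliability(output_per_shift, 0.9))  # met in 9 of 10 shifts
print(productivity_at_reliability(output_per_shift, 0.5))  # met in half of shifts
```

Raising the required reliability lowers the productivity value that can be guaranteed, which is the trade-off the article's accumulated-probability model makes explicit.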


2017 ◽  
Vol 10 ◽  
pp. 107-119
Author(s):  
A.S. Guimarães ◽  
J.M.P.Q. Delgado ◽  
V.P. de Freitas

Salt damage can significantly affect the service life of numerous building structures, both historical and contemporary. Therefore, various conservation methods have been developed for the consolidation and protection of porous building materials exposed to salt attack. As any successful treatment of salt damage requires a multidisciplinary attitude, many different factors such as salt solution transport and crystallization, the presence and origin of salts in masonry, and salt-induced deterioration are to be taken into account. The importance of pre-treatment investigations is discussed as well; in combination with knowledge of salt and moisture transport mechanisms, they can give useful indications regarding treatment options. Another important cause of building pathologies is rising damp, a phenomenon that is particularly severe when salts are present in the water. The treatment of rising damp in historic building walls is a very complex procedure. At the Laboratory of Building Physics (LFC-FEUP), a wall-base hygro-regulated ventilation system was developed. This patented system, HUMIVENT, has been submitted to laboratory monitoring and in situ validation, and a simplified numerical model was developed to facilitate its practical application. With the practical application of scientific and technological knowledge from Building Physics in mind, this paper presents the design of the system (geometry, ventilation rate and hygrothermal device), the detailing and technical specification of its different components, and information about its implementation in three types of buildings: a church, a museum and a residential building.

