Stochastic Properties of Confidence Ellipsoids after Least Squares Adjustment, Derived from GUM Analysis and Monte Carlo Simulations

Mathematics ◽  
2020 ◽  
Vol 8 (8) ◽  
pp. 1318 ◽  
Author(s):  
Wolfgang Niemeier ◽  
Dieter Tengen

In this paper, stochastic properties are discussed for the final results of an innovative approach to uncertainty assessment for network computations, which can be characterized as a two-step approach: in the first step, the raw measuring data and all possible influencing factors were analyzed, applying uncertainty modeling in accordance with GUM (Guide to the Expression of Uncertainty in Measurement). In the second step, Monte Carlo (MC) simulations were set up for the complete processing chain, i.e., for simulating all input data and performing the adjustment computations. The input datasets were generated by pseudo-random numbers, and pre-set probability distribution functions were assumed for all these variables. The main extension here is an analysis of the stochastic properties of the final results, which are point clouds for the station coordinates. According to Cramér's central limit theorem and Hagen's elementary error theory, there is some justification for expecting these coordinate variations to follow a normal distribution. The statistical tests applied for normality confirmed this assumption. This result allows us to derive confidence ellipsoids from these point clouds and to continue with quality assessment and a more detailed analysis of the results, similar to the procedures well known in classical network theory. This approach and the check on normality are applied to the local tie network of Metsähovi, Finland, where terrestrial geodetic observations are combined with Global Navigation Satellite System (GNSS) data.
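As a minimal sketch of the final analysis step described above (not the authors' code), the snippet below takes a hypothetical Monte Carlo point cloud for one station, checks each coordinate component for normality with a Shapiro-Wilk test, and derives a 95% confidence ellipsoid from the sample covariance via a chi-square quantile; all numbers are illustrative assumptions.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
# Hypothetical stand-in for the MC point cloud of one station's coordinates (n runs x 3 axes).
cloud = rng.multivariate_normal(mean=[0.0, 0.0, 0.0],
                                cov=[[4e-6, 1e-6, 0.0],
                                     [1e-6, 3e-6, 0.0],
                                     [0.0,  0.0,  2e-6]],
                                size=5000)

# Check each coordinate component for normality (Shapiro-Wilk as one possible test).
for axis, name in enumerate("XYZ"):
    stat, p = stats.shapiro(cloud[:2000, axis])   # Shapiro-Wilk is limited to moderate n
    print(f"{name}: W = {stat:.4f}, p = {p:.3f}")

# 95% confidence ellipsoid: semi-axes from the eigen-decomposition of the sample covariance.
cov = np.cov(cloud, rowvar=False)
evals, evecs = np.linalg.eigh(cov)
k = stats.chi2.ppf(0.95, df=3)                    # quantile for 3 degrees of freedom
semi_axes = np.sqrt(k * evals)                    # lengths of the ellipsoid semi-axes [m]
print("95% ellipsoid semi-axes [m]:", semi_axes)
```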

2021 ◽  
Vol 12 (33) ◽  
pp. 34-49
Author(s):  
Iuliia Pinkovetskaia ◽  
Yulia Nuretdinova ◽  
Ildar Nuretdinov ◽  
Natalia Lipatova

One of the urgent tasks in many modern scientific studies is the comparative analysis of indicators that characterize large sets of similar objects located in different regions. Given the significant differences between the regions compared, this analysis should be carried out using relative indicators. The objective of the study was to use normal distribution density functions to model empirical data describing the compared sets of objects located in different regions. The methodological approach was based on the Chebyshev and Lyapunov theorems. The research results focus on the main stages of constructing the normal distribution functions and the corresponding histograms, as well as on determining the parameters of these functions. The work possesses a degree of originality, since it addresses such questions as the justification of the necessary information base; the performance of computational experiments and the development of alternative options for generating normal distribution density functions; the comprehensive evaluation of the quality of the obtained functions through three statistical tests (Pearson, Kolmogorov-Smirnov and Shapiro-Wilk); and the identification of patterns that characterize the distribution of indicators across the sets of objects considered. Examples of empirical data models are given, using distribution functions to estimate the share of innovative firms in the total number of firms in the regions of Russia.
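The quality checks named above can be illustrated with a short, hypothetical sketch (not the authors' code): fit a normal density to synthetic regional shares and apply the three tests mentioned (Pearson chi-square on binned counts, Kolmogorov-Smirnov and Shapiro-Wilk); the data values and bin count are assumptions.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Hypothetical regional shares (%) standing in for the empirical indicator values.
shares = rng.normal(loc=9.5, scale=2.0, size=80)

mu, sigma = shares.mean(), shares.std(ddof=1)     # parameters of the fitted normal density

# Shapiro-Wilk test
w, p_sw = stats.shapiro(shares)

# Kolmogorov-Smirnov test against the fitted normal
d, p_ks = stats.kstest(shares, "norm", args=(mu, sigma))

# Pearson chi-square test on binned counts vs. expected normal frequencies
counts, edges = np.histogram(shares, bins=8)
cdf = stats.norm.cdf(edges, mu, sigma)
expected = len(shares) * np.diff(cdf)
expected *= counts.sum() / expected.sum()          # rescale so totals match exactly
chi2, p_chi = stats.chisquare(counts, expected, ddof=2)  # 2 estimated parameters

print(f"Shapiro-Wilk p = {p_sw:.3f}, KS p = {p_ks:.3f}, chi-square p = {p_chi:.3f}")
```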


2017 ◽  
Vol 11 (2) ◽  
Author(s):  
Wolfgang Niemeier ◽  
Dieter Tengen

In this article, first ideas are presented to extend the classical concept of geodetic network adjustment by introducing a new method for uncertainty assessment as a two-step analysis. In the first step, the raw data and possible influencing factors are analyzed using uncertainty modeling according to GUM (Guide to the Expression of Uncertainty in Measurement). This approach is well established in metrology, but rarely adopted within geodesy. The second step consists of Monte Carlo simulations (MC simulations) for the complete processing chain, from raw input data and pre-processing to adjustment computations and quality assessment. To perform these simulations, possible realizations of the raw data and the influencing factors are generated, using probability distributions for all variables and the established concept of pseudo-random number generators. The final result is a point cloud which represents the uncertainty of the estimated coordinates; a confidence region can be assigned to these point clouds as well. This concept may replace the common concept of variance propagation and the quality assessment of adjustment parameters by means of their covariance matrix. It allows a new way of uncertainty assessment in accordance with the GUM concept for uncertainty modelling and propagation. As a practical example, the local tie network at the "Metsähovi Fundamental Station", Finland, is used, where classical geodetic observations are combined with GNSS data.
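A toy illustration of the two-step idea (assumed values throughout, not the authors' network): standard uncertainties are assigned to the raw observations (step 1), and each Monte Carlo run simulates one realization of the observations and performs a least-squares adjustment (step 2), so that the spread of the estimates forms the point cloud described above.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy network: two unknown heights observed by three height differences.
# Design matrix A and "true" observations l_true (purely illustrative values).
A = np.array([[1.0, 0.0],     # benchmark -> point 1
              [0.0, 1.0],     # benchmark -> point 2
              [-1.0, 1.0]])   # point 1 -> point 2
l_true = np.array([10.000, 12.500, 2.500])
sigma_l = np.array([0.002, 0.002, 0.003])   # assumed standard uncertainties (GUM step 1)

# Step 2: Monte Carlo runs, each with freshly simulated observations and a LS adjustment.
runs = 10000
estimates = np.empty((runs, 2))
for i in range(runs):
    l = l_true + rng.normal(0.0, sigma_l)            # pseudo-random realization of the raw data
    x, *_ = np.linalg.lstsq(A, l, rcond=None)        # least-squares adjustment
    estimates[i] = x

print("empirical mean of the estimates:", estimates.mean(axis=0))
print("empirical covariance of the point cloud:\n", np.cov(estimates, rowvar=False))
```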


Mathematics ◽  
2021 ◽  
Vol 9 (8) ◽  
pp. 817
Author(s):  
Fernando López ◽  
Mariano Matilla-García ◽  
Jesús Mur ◽  
Manuel Ruiz Marín

A novel general method for constructing nonparametric hypothesis tests based on the field of symbolic analysis is introduced in this paper. Several existing tests based on symbolic entropy, which have been used for testing central hypotheses in several branches of science (particularly in economics and statistics), are particular cases of this general approach. This family of symbolic tests relies on few assumptions, which increases the general applicability of any symbolic-based test. Additionally, as a theoretical application of the method, we construct and put forward four new statistics to test for the null hypothesis of spatiotemporal independence, for which very few tests exist in the specialized literature. The new tests were evaluated by means of several Monte Carlo experiments. The results highlight the outstanding performance of the proposed tests.
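As a rough illustration of the symbolic approach (not one of the four statistics proposed in the paper), the sketch below symbolizes a univariate series by ordinal patterns, computes the symbolic (Shannon) entropy, and compares it against a Monte Carlo null obtained by shuffling the series; the series length, embedding dimension and replication count are assumptions.

```python
import numpy as np
from math import factorial

def symbolize(x, m=3):
    """Map each overlapping window of length m to its ordinal pattern (a permutation)."""
    return [tuple(np.argsort(x[i:i + m])) for i in range(len(x) - m + 1)]

def symbolic_entropy(x, m=3):
    """Shannon entropy of the ordinal-pattern frequencies."""
    symbols = np.array(symbolize(x, m))
    _, counts = np.unique(symbols, axis=0, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log(p))

rng = np.random.default_rng(7)
x = rng.normal(size=500)                      # hypothetical series under test

h_obs = symbolic_entropy(x)
# Monte Carlo null: entropy of shuffled series (shuffling destroys any serial dependence).
h_null = np.array([symbolic_entropy(rng.permutation(x)) for _ in range(500)])
p_value = np.mean(h_null <= h_obs)            # low entropy would indicate structure
print(f"H = {h_obs:.4f}, max = {np.log(factorial(3)):.4f}, MC p = {p_value:.3f}")
```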


Sensors ◽  
2020 ◽  
Vol 21 (1) ◽  
pp. 31
Author(s):  
Mariusz Specht

Positioning systems are used to determine position coordinates in navigation (air, land and marine). The accuracy of an object's position is described by the position error, and a statistical analysis can determine its measures, which usually include: Root Mean Square (RMS), twice the Distance Root Mean Square (2DRMS), Circular Error Probable (CEP) and Spherical Error Probable (SEP). It is commonly assumed in navigation that position errors are random and that their distribution is consistent with the normal distribution. This assumption is based on the popularity of the Gauss distribution in science, the simplicity of calculating RMS values for 68% and 95% probabilities, as well as the intuitive perception of randomness in the statistics which this distribution reflects. It should be noted, however, that the necessary conditions for a random variable to be normally distributed include the independence of measurements and identical conditions of their realisation, which is not the case with the iterative method of determining successive positions, the filtration of coordinates or the dependence of the position error on meteorological conditions. In the preface to this publication, examples are provided which indicate that position errors in some navigation systems may not be consistent with the normal distribution. The subsequent section describes basic statistical tests for assessing the fit between the empirical and theoretical distributions (Anderson-Darling, chi-square and Kolmogorov-Smirnov). Next, statistical tests of the position error distributions of very long Differential Global Positioning System (DGPS) and European Geostationary Navigation Overlay Service (EGNOS) campaigns from different years (2006 and 2014) were performed, with 900,000 fixes per measurement session. In addition, the paper discusses selected statistical distributions that fit the empirical measurement results better than the normal distribution. The research has shown that the normal distribution is not the optimal statistical distribution to describe the position errors of navigation systems. The distributions that describe navigation positioning system errors more accurately include the beta, gamma, logistic and lognormal distributions.
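A hedged sketch of the accuracy measures and goodness-of-fit checks discussed above, applied to simulated (not DGPS/EGNOS) 2-D position errors; the error magnitudes, sample sizes and the gamma candidate for the radial error are assumptions.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
# Hypothetical 2-D position errors [m]; real DGPS/EGNOS residuals would be loaded instead.
dx = rng.normal(0.0, 0.8, size=100_000)
dy = rng.normal(0.0, 1.1, size=100_000)
r = np.hypot(dx, dy)                           # horizontal radial error

rms = np.sqrt(np.mean(dx**2 + dy**2))
drms2 = 2.0 * rms                              # 2DRMS (~95% for near-circular errors)
cep = np.quantile(r, 0.50)                     # Circular Error Probable (50th percentile)
print(f"RMS = {rms:.2f} m, 2DRMS = {drms2:.2f} m, CEP = {cep:.2f} m")

# Goodness of fit of one coordinate component against the normal distribution.
ad = stats.anderson(dx[:5000], dist="norm")
d, p_ks = stats.kstest((dx[:5000] - dx.mean()) / dx.std(ddof=1), "norm")
print(f"Anderson-Darling A2 = {ad.statistic:.3f}, KS p = {p_ks:.3f}")

# Candidate alternative for the radial error: fit and test a gamma distribution.
a, loc, scale = stats.gamma.fit(r[:5000])
_, p_gamma = stats.kstest(r[:5000], "gamma", args=(a, loc, scale))
print(f"gamma fit KS p = {p_gamma:.3f}")
```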


Minerals ◽  
2021 ◽  
Vol 11 (5) ◽  
pp. 465
Author(s):  
Cezary Polakowski ◽  
Magdalena Ryżak ◽  
Agata Sochan ◽  
Michał Beczek ◽  
Rafał Mazur ◽  
...  

Particle size distribution is an important soil parameter, so precise measurement of this characteristic is essential. The application of the widely used laser diffraction method to soil analysis continues to be a subject of debate. The precision of this method, proven on homogeneous samples, has been implicitly extended to soil analyses, but this has not been sufficiently well confirmed in the literature thus far. The aim of this study is to supplement the information available on the precision of the method in terms of the reproducibility of soil measurements and to determine whether this reproducibility is characterized by a normal distribution. To estimate the reproducibility of the laser diffraction method, thirteen different soil samples were characterized and the results were analysed statistically. The coefficient of variation obtained was lowest for silt (3.44%) and highest for sand (23.28%). Five of the thirteen tested samples were characterized by a normal distribution; the fraction content of the remaining eight samples was not normally distributed, although the extent of this phenomenon varied between soils. Although the laser diffraction method is repeatable, the measurement of soil particle size distribution can have limited reproducibility. The main cause appears to be small amounts of sand particles, and the error can be amplified by the construction of the dispersion unit. Non-parametric statistical tests should therefore be used by default for soil laser diffraction analyses.
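A small, hypothetical sketch of the kind of analysis implied above (not the authors' code): compute the coefficient of variation across replicate measurements per soil sample, test the replicate scatter for normality with Shapiro-Wilk, and fall back to a rank-based (Kruskal-Wallis) comparison when normality is rejected; the sample names and values are invented.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
# Hypothetical replicate measurements of sand content (%) for three soil samples.
replicates = {
    "soil_A": rng.normal(35.0, 1.2, size=10),
    "soil_B": rng.normal(12.0, 2.8, size=10),
    "soil_C": rng.lognormal(mean=1.0, sigma=0.4, size=10),
}

for name, values in replicates.items():
    cv = 100.0 * values.std(ddof=1) / values.mean()    # coefficient of variation [%]
    _, p = stats.shapiro(values)                       # normality of the replicate scatter
    verdict = "normal" if p > 0.05 else "non-normal"
    print(f"{name}: CV = {cv:5.2f}%, Shapiro-Wilk p = {p:.3f} ({verdict})")

# If normality is rejected, compare samples with a rank-based test instead of ANOVA.
h, p_kw = stats.kruskal(*replicates.values())
print(f"Kruskal-Wallis H = {h:.2f}, p = {p_kw:.3f}")
```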


1991 ◽  
Vol 46 (4) ◽  
pp. 351-356
Author(s):  
Bernd M. Rode

Monte Carlo simulations of a system of 200 water and 24 NaCl molecules at six different densities in the range from 0.003 g/cm3 to 0.999 g/cm3 and at T = 125 °C and 225 °C were performed to obtain some insight into cluster formation, which should precede and determine the formation of aerosol structures and has possibly played some role in prebiotic atmospheric chemistry. Solute hydration occurs even at very low concentrations, mainly in the form of hydrated molecules ("contact ion pairs"). At higher densities, larger cluster structures are observed, leading rather continuously to the structure of the supersaturated 7.1 M NaCl solution at the same temperature. Radial distribution functions, coordination numbers and particle interaction energies are discussed with respect to the simulation parameters density and temperature.
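For readers unfamiliar with the quantities discussed, the following generic sketch (not the paper's simulation code) shows how a radial distribution function g(r) and a running coordination number can be extracted from one snapshot of particle coordinates in a periodic box; the particle count, box size and positions are arbitrary assumptions.

```python
import numpy as np

rng = np.random.default_rng(11)
# Hypothetical snapshot: N particles in a cubic box with periodic boundaries.
N, L = 224, 30.0                      # particle count and box edge length [Angstrom]
pos = rng.uniform(0.0, L, size=(N, 3))

# Minimum-image pair distances.
diff = pos[:, None, :] - pos[None, :, :]
diff -= L * np.round(diff / L)
dist = np.sqrt((diff**2).sum(axis=-1))
d = dist[np.triu_indices(N, k=1)]

# g(r): normalize the pair-distance histogram by the ideal-gas expectation.
r_max, nbins = L / 2, 100
counts, edges = np.histogram(d, bins=nbins, range=(0.0, r_max))
r = 0.5 * (edges[:-1] + edges[1:])
shell_vol = 4.0 / 3.0 * np.pi * (edges[1:]**3 - edges[:-1]**3)
rho = N / L**3
g = counts / (0.5 * N * rho * shell_vol)

# Running coordination number: integral of rho * g(r) * 4*pi*r^2 dr.
dr = edges[1] - edges[0]
coord = np.cumsum(rho * g * 4.0 * np.pi * r**2 * dr)
print("g(r) in the first bins:", np.round(g[:5], 2))
print("running coordination number at r_max:", round(coord[-1], 1))
```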


2002 ◽  
Vol 4 (3) ◽  
pp. 183-190 ◽  
Author(s):  
W. Hitzl ◽  
G. Grabner

The comparison of different methods of keratoprosthesis (KP) with regard to their long-term success, as far as visual acuity is concerned, is difficult: this is the case both because a standardized reporting method agreed upon by all research groups has not been put forward, much less accepted, and because the quality of life for the patient depends not only on the level of visual acuity, but also quite significantly on the "survival time" of the implant. Therefore, an analysis of a single series of patients with Osteo–Odonto–Keratoprosthesis (OOKP) was performed. Statistical analysis methods used by others for similar groups of surgical procedures have included descriptive statistics, survival analysis and ANOVA; these methods comprise comparisons of empirical densities or distribution functions and empirical survival curves. It is the objective of this paper to provide an inductive statistical method that avoids the problems of descriptive techniques and survival analysis. This statistical model meets four important standards: (1) the efficiency of a surgical technique can be assessed within an arbitrary time interval by a new index (VAT-index); (2) possible autocorrelations in the data are taken into consideration; (3) the efficiency is not only stated by a point estimator, but 95% point-wise confidence limits are also computed based on the Monte Carlo method; and (4) the efficiency of a specific method is illustrated by line and range plots for quick inspection and can also be used for the comparison of other surgical techniques such as refractive, glaucoma and retinal surgery.
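The Monte Carlo confidence limits mentioned in point (3) can be illustrated generically (the VAT-index itself is not reproduced here): for a hypothetical set of visual-acuity trajectories, a per-time-point index is computed and 95% point-wise limits are obtained by resampling patients; all data and the choice of index are assumptions.

```python
import numpy as np

rng = np.random.default_rng(13)
# Hypothetical follow-up data: visual acuity (decimal) for 40 patients at 6 time points.
n_patients, times = 40, np.array([0, 6, 12, 24, 36, 48])   # months after surgery
acuity = np.clip(rng.normal(0.5, 0.2, size=(n_patients, len(times)))
                 - 0.02 * np.arange(len(times)), 0.0, 1.0)

# Index per time point (here simply the mean acuity; the paper's VAT-index is not reproduced).
index = acuity.mean(axis=0)

# Monte Carlo (bootstrap over patients) 95% point-wise confidence limits.
B = 5000
boot = np.empty((B, len(times)))
for b in range(B):
    sample = acuity[rng.integers(0, n_patients, size=n_patients)]
    boot[b] = sample.mean(axis=0)
lower, upper = np.percentile(boot, [2.5, 97.5], axis=0)

for t, m, lo, hi in zip(times, index, lower, upper):
    print(f"month {t:2d}: index = {m:.2f}  95% limits [{lo:.2f}, {hi:.2f}]")
```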


2017 ◽  
Vol 29 (4) ◽  
pp. 1267-1278 ◽  
Author(s):  
Marco Del Giudice

Statistical tests of differential susceptibility have become standard in the empirical literature and are routinely used to adjudicate between alternative developmental hypotheses. However, their performance and limitations have never been systematically investigated. In this paper I employ Monte Carlo simulations to explore the functioning of three commonly used tests proposed by Roisman et al. (2012). The simulations showed that critical tests of differential susceptibility require considerably larger samples than standard power calculations would suggest. The results also showed that existing criteria for differential susceptibility based on the proportion of interaction index (i.e., values between .40 and .60) are especially likely to produce false negatives and are highly sensitive to assumptions about interaction symmetry. As an initial response to these problems, I propose a revised test based on a broader window of proportion of interaction index values (between .20 and .80). Additional simulations showed that the revised test outperforms existing tests of differential susceptibility, considerably improving detection with little effect on the rate of false positives. I conclude by noting the limitations of a purely statistical approach to differential susceptibility and by discussing the implications of the present results for the interpretation of published findings and the design of future studies in this area.
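A minimal sketch of the kind of Monte Carlo power simulation described above (not Roisman et al.'s tests or the proportion of interaction index): data are generated from a crossover environment-by-susceptibility interaction model, an ordinary regression with an interaction term is fitted, and the detection rate is estimated for several sample sizes; the effect size and sample sizes are assumptions.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(21)

def simulate_power(n, b_int=0.3, reps=2000, alpha=0.05):
    """Share of replications in which the environment x susceptibility interaction is significant."""
    hits = 0
    for _ in range(reps):
        env = rng.normal(size=n)               # environmental quality
        sus = rng.normal(size=n)               # susceptibility marker
        # Crossover model: zero main effects, only the interaction carries the signal.
        y = b_int * env * sus + rng.normal(size=n)
        X = np.column_stack([np.ones(n), env, sus, env * sus])
        beta, res, *_ = np.linalg.lstsq(X, y, rcond=None)
        dof = n - X.shape[1]
        sigma2 = res[0] / dof
        se = np.sqrt(sigma2 * np.linalg.inv(X.T @ X).diagonal())
        t = beta[3] / se[3]                    # t statistic of the interaction term
        p = 2 * stats.t.sf(abs(t), dof)
        hits += p < alpha
    return hits / reps

for n in (100, 300, 600):
    print(f"n = {n}: estimated power = {simulate_power(n):.2f}")
```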

