The variability of the flatfoot frequency depending on the diagnostic criteria and the method of statistical analysis

2019 · Vol 7 (2) · pp. 41-50
Author(s):  
Vladimir M. Kenis
Alyona J. Dimitrieva
Andrei V. Sapogovskiy

Background. The reported frequency of flatfoot in children varies from 0.6% to 77.9%. This wide range is associated with the lack of uniform diagnostic criteria and methods of statistical analysis. Aim. This study aimed to demonstrate the variability of flatfoot frequency in the same population when different footprint indices and methods of statistical analysis are used. Material and methods. The study included 317 school-age children; children with orthopedic and foot pathology were excluded. The main evaluation methods were clinical examination, computer plantography with calculation of footprint indices (Staheli index, Chippaux–Smirak index, Clarke’s angle, podometric index, arch height index), and statistical analysis (descriptive statistics with the Kolmogorov–Smirnov and Shapiro–Wilk normality tests, and classification of values either by the normal-distribution rule with a double standard deviation or by quartile assessment). Results. Under the normal-distribution rule (with a double standard deviation), the flatfoot frequency estimated from plantar footprint indices varied from 1.6% to 4.8% in 7–17-year-old children, and from medial footprint indices, from 1.28% to 2.8% in the same age range. The quartile assessment method yielded a flatfoot frequency of 5.85%–28.33% with plantar footprint indices and 5.7%–15.43% with medial footprint indices. Conclusion. Depending on the plantographic indices and the method of statistical analysis, the estimated frequency of a flattened longitudinal arch in a population may differ significantly: the frequency of flatfoot determined from indices calculated on the medial footprint is 1.7–1.8 times lower than that determined on the plantar footprint, and the frequency determined by the double-standard-deviation criterion is 5.5–5.9 times lower than that determined by quartile assessment.
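
A minimal sketch of the two classification rules contrasted above, on hypothetical Staheli-index values (all numbers are illustrative, and the exact quartile rule is not spelled out in the abstract; flagging values beyond the upper quartile is one plausible reading):

```python
import numpy as np

# Hypothetical Staheli-index values for 317 school-age children
# (a higher index corresponds to a flatter longitudinal arch).
rng = np.random.default_rng(42)
staheli = rng.normal(loc=0.65, scale=0.12, size=317)

# Rule 1: normal-distribution criterion -- flag values beyond mean + 2 SD.
mean, sd = staheli.mean(), staheli.std(ddof=1)
freq_2sd = np.mean(staheli > mean + 2 * sd)

# Rule 2: quartile assessment -- here read as flagging values above the
# upper quartile; the abstract does not spell out the exact rule.
q3 = np.percentile(staheli, 75)
freq_quartile = np.mean(staheli > q3)

print(f"2-SD rule:     {freq_2sd:.1%}")
print(f"quartile rule: {freq_quartile:.1%}")
```

Even on the same simulated sample, the two rules differ by roughly an order of magnitude, which is the effect the study documents.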

2019 · Vol 91 (3)
Author(s):  
Damian Grzesiak
Jarosław Plichta

The aim of this paper is to answer the question of how welding distortions are distributed. The MIG method was used to make 31 butt welds of 0H18N9 sheet metal, 6 mm thick and 150×350 mm in size. All joints were made with constant welding process parameters. Statistical analysis of the distribution and the Kolmogorov–Smirnov test were used. The analysis showed that the distribution of welding deformations is normal, which justifies the use of experiment planning methods and of average values. The relatively high standard deviation, however, makes it necessary to take the geometrical parameters of the joint into account.
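
A minimal sketch of the kind of normality check described, assuming a hypothetical vector of 31 measured distortions (the actual measurements are not given in the abstract). Note that when the mean and standard deviation are estimated from the same sample, the Lilliefors variant of the Kolmogorov–Smirnov test is the stricter choice:

```python
import numpy as np
from scipy import stats

# Hypothetical angular distortions (degrees) of the 31 butt welds.
rng = np.random.default_rng(7)
distortions = rng.normal(loc=1.8, scale=0.4, size=31)

# Kolmogorov-Smirnov test against a normal distribution fitted to the sample.
mu, sd = distortions.mean(), distortions.std(ddof=1)
stat, p_value = stats.kstest(distortions, "norm", args=(mu, sd))
print(f"KS statistic = {stat:.3f}, p = {p_value:.3f}")
# p > 0.05 -> no evidence against normality at the 5% level.
```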


2020 · Vol 4 (9)
Author(s):  
Megan Wang

Basketball has existed for almost 130 years and has become one of the most popular sports worldwide, affecting millions of lives through national and global tournaments. As public interest in and enthusiasm for sports competition grow, the role of sports analytics will become more prominent. Hence, this paper combines relevant statistical knowledge with typical basketball competition cases from the NBA to expound the application of statistics in sports competition. The paper first examines the importance of the normal distribution (also called the Gaussian distribution) in statistics through its probability density function and the function's graph. The function has two parameters: the mean, which locates the peak, and the standard deviation, which measures the spread about the mean [1]. By compiling datasets of past team and individual basketball performances and calculating their means and standard deviations, the paper constructs normal distribution graphs using the R programming language. Finally, the paper examines the Real Plus-Minus value and its importance in basketball.
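
A minimal Python equivalent of the computation the paper describes in R; the scoring data below are hypothetical:

```python
import numpy as np
import matplotlib.pyplot as plt

# Hypothetical points-per-game values for one player across ten games.
points = np.array([22, 18, 31, 25, 19, 27, 24, 30, 21, 26])
mu, sd = points.mean(), points.std(ddof=1)

# Normal probability density function with the sample mean and SD.
x = np.linspace(mu - 4 * sd, mu + 4 * sd, 200)
pdf = np.exp(-((x - mu) ** 2) / (2 * sd ** 2)) / (sd * np.sqrt(2 * np.pi))

plt.plot(x, pdf)
plt.axvline(mu, linestyle="--")  # the mean locates the peak
plt.xlabel("Points per game")
plt.ylabel("Density")
plt.show()
```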


2014 · Vol 55 (1) · pp. 129-140
Author(s):  
Anna J. Kwiatkowska
Ewa Symonides

Homogeneity of the Leucobryo-Pinetum phytocoenose was assessed on the grounds of the agreement of the frequency distributions of the total species diversity (H) and evenness (e) indices with the normal distribution. It was confirmed that: 1) the empirical frequency distributions of H and e fitted the normal distribution only at some quadrat sizes; 2) the values of the mean, standard deviation, and coefficient of variation were non-linear functions of quadrat size; 3) the mean H and e values calculated for small quadrats (1 and 2 m²) differed from those calculated for medium (4 and 8 m²) and large (16 and 32 m²) quadrats; 4) the quadrat size at which the frequency distributions of both indices were symmetrical determined the scale of spatial differentiation of the phytocoenose, at which it was homogeneous.
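
The abstract does not define the indices; assuming H is the Shannon diversity index and e is Pielou's evenness, a minimal sketch of testing their frequency distributions for normality at one quadrat size (species counts are simulated):

```python
import numpy as np
from scipy import stats

def shannon_h(counts: np.ndarray) -> float:
    """Shannon diversity H = -sum(p * ln p) over non-zero species proportions."""
    p = counts / counts.sum()
    p = p[p > 0]
    return float(-np.sum(p * np.log(p)))

# Hypothetical species-count matrix: one row per quadrat, one column per species.
rng = np.random.default_rng(0)
counts = rng.poisson(3, size=(50, 12))

H = np.array([shannon_h(row) for row in counts])
e = H / np.log((counts > 0).sum(axis=1))  # evenness e = H / ln(richness)

# Normality of the frequency distributions of H and e at this quadrat size.
print(stats.shapiro(H))
print(stats.shapiro(e))
```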


Author(s):  
Claudia Kimie Suemoto
Catherine Lee
Felipe Fregni

This chapter discusses the first step in statistical analysis: descriptive statistics and data classification. This is the researcher's first contact with the data, and it is very important for understanding the characteristics of the sample, including the presence of missing data and outliers. To perform this step, the investigator needs to know the types of variables (i.e., numerical and categorical) and the measures used to summarize them. In the case of numerical variables, it is also important to check whether the data follow a normal distribution. Researchers usually use the mean and standard deviation for numerical variables, and absolute and relative frequencies for categorical ones. In addition to summary measures, different graphs are used to represent the data. Understanding the data is a critical step in choosing the statistical test, as discussed in the next chapters.
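
A minimal sketch of these summaries on a hypothetical sample, using pandas and scipy:

```python
import pandas as pd
from scipy import stats

# Hypothetical sample with one numerical and one categorical variable.
df = pd.DataFrame({
    "age": [34, 45, 29, 61, 50, 38, 42, 55, 47, 33],
    "sex": ["F", "M", "F", "F", "M", "M", "F", "M", "F", "M"],
})

# Numerical variable: mean and standard deviation, plus a normality check.
print(df["age"].mean(), df["age"].std())
print(stats.shapiro(df["age"]))  # p > 0.05 -> no evidence against normality

# Categorical variable: absolute and relative frequencies.
print(df["sex"].value_counts())
print(df["sex"].value_counts(normalize=True))
```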


2020 · Vol 2020 (17) · pp. 34-1–34-7
Author(s):  
Matthew G. Finley
Tyler Bell

This paper presents a novel method for accurately encoding 3D range geometry within the color channels of a 2D RGB image that allows the encoding frequency, and therefore the encoding precision, to be determined uniquely for each coordinate. The proposed method can thus be used to balance encoding precision against file size by encoding geometry along a normal distribution: encoding more precisely where the density of data is high and less precisely where it is low. Alternative distributions may be followed to produce encodings optimized for specific applications. In general, the nature of the proposed encoding method is such that the precision of each point can be freely controlled or derived from an arbitrary distribution, making the method suitable for a wide range of applications.
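
A simplified sketch of the core idea, not the authors' exact encoding: the per-pixel encoding frequency follows a normal density over the depth values, so dense depth regions receive a higher frequency (finer precision). Phase-based range encodings of this family typically also reserve a channel for the coarse base information needed to unwrap the phase; all names and constants below are illustrative:

```python
import numpy as np

# Hypothetical depth map normalized to [0, 1].
rng = np.random.default_rng(0)
depth = rng.normal(0.5, 0.1, size=(480, 640)).clip(0.0, 1.0)

# Per-pixel encoding frequency from a normal density over depth:
# high frequency (more precision) where depth values are dense.
mu, sd = depth.mean(), depth.std()
density = np.exp(-((depth - mu) ** 2) / (2 * sd ** 2))
f_min, f_max = 4.0, 64.0
freq = f_min + (f_max - f_min) * density / density.max()

# Phase-encode depth into two color channels at the per-pixel frequency;
# a third channel would carry the coarse information needed for decoding.
r = 0.5 + 0.5 * np.sin(2.0 * np.pi * freq * depth)
g = 0.5 + 0.5 * np.cos(2.0 * np.pi * freq * depth)
rgb = np.stack([r, g, np.zeros_like(r)], axis=-1)
```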


2017 · Vol 928 (10) · pp. 58-63
Author(s):  
V.I. Salnikov

The initial subject of study is consistent sums of measurement errors. The errors are assumed to follow the normal law, but with a limit on the marginal error, Δpred = 2m. It is known that for each number of terms n_i there is a confidence interval within which the value of the sum equals zero. The paradox is that the probability of this exact event is zero; therefore, it is impossible to determine the value of n_i at which the sum becomes zero. The article proposes instead to consider the event that a sum of errors varies within the ±2m limits, with a confidence level of 0.954. Within the group, all the sums then have a limit error. It is proposed to use these tolerances for discrepancies in geodesy instead of 2m√n_i. The concept of “the law of the truncated normal distribution with Δpred = 2m” is suggested to be introduced.
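
A short numerical illustration of the quoted confidence level and the proposed truncated law, with m taken as the unit standard error (a sketch, using scipy):

```python
from scipy import stats

m = 1.0  # unit standard error of a single measurement

# For an untruncated normal error, P(|error| <= 2m) is approximately 0.954,
# the confidence level quoted above.
print(stats.norm.cdf(2) - stats.norm.cdf(-2))  # ~0.9545

# "Law of the truncated normal distribution with delta_pred = 2m":
# a normal distribution cut off at +/- 2m.
trunc = stats.truncnorm(-2, 2, loc=0.0, scale=m)
print(trunc.std())  # truncation shrinks the effective SD below m
```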


Author(s):  
Tim Lindsey
Simon Butt

This book explains Indonesia’s complex legal system and how it works. Covering a wide range of substantive topics from public to private law, including commercial, criminal, and constitutional law, it is the first comprehensive survey of Indonesian law in English. Offering clear answers to practical problems of current law, each chapter sets out relevant laws and leading court decisions, accompanied by an explanation of how the law works in practice, with an analytical critique. The book begins with an account of Indonesia’s Constitution and the key state agencies, before moving to the lawmaking process, decentralization, the judicial system and court procedure, and the legal profession (advocates, notaries, and legal aid). Part II covers traditional customary law (adat), land law, and environmental law, including forest law. Part III focuses on criminal law and procedure, including investigation, arrest, trial, sentencing, and appeals. It also covers human rights law and the law on corruption. Part IV deals with civil law, and covers civil liability, contracts, companies and other business vehicles, labour, foreign investment, taxation, insolvency, banking, competition, and media law. The book concludes in Part V with an account of Indonesia’s complex family law and inheritance system for both Muslims and non-Muslims. The book has an extensive glossary of legal terms, and detailed tables of legislation and court decisions, designed as unique resources for lawyers, policymakers, and researchers.


Author(s):  
Baoliang Chen
Peng Liu
Feiyun Xiao
Zhengshi Liu
Yong Wang

Quantitative assessment is crucial for the evaluation of human postural balance, and the force plate system is the key quantitative method for balance assessment. The purpose of this study is to review the important concepts in balance assessment and to analyze the experimental conditions, parameter variables, and scope of application of force plate technology. Because there is a wide range of balance assessment tests and a variety of commercial force plate systems to choose from, there is room for further improvement in the test details and evaluation variables of balance assessment. The recommendations presented in this article form the foundation and key part of postural balance assessment; they focus on the type of force plate, the subject's foot posture, and the choice of assessment variables, further enriching the content of posturography. To promote a more reasonable force-plate-based balance assessment method, further methodological research and a stronger consensus are still needed.
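
One of the most common force-plate variables is the center of pressure (COP). A minimal sketch of its computation and a simple sway summary, assuming the plate origin lies at the plate surface; the signals are simulated stand-ins:

```python
import numpy as np

# Hypothetical force-plate signals sampled at 100 Hz during quiet standing:
# vertical force Fz (N) and moments about the x and y axes (N*m).
rng = np.random.default_rng(1)
n = 3000  # 30 s at 100 Hz
Fz = 700.0 + rng.normal(0, 5, n)
Mx = rng.normal(0, 2, n)
My = rng.normal(0, 2, n)

# Center of pressure, with the plate origin at the plate surface.
cop_x = -My / Fz
cop_y = Mx / Fz

# A common summary variable: COP path length (total sway distance, m).
path_length = np.sum(np.hypot(np.diff(cop_x), np.diff(cop_y)))
print(f"COP path length: {path_length:.3f} m")
```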


Entropy · 2021 · Vol 23 (4) · p. 421
Author(s):  
Dariusz Puchala
Kamil Stokfiszewski
Mykhaylo Yatsymirskyy

In this paper, the authors analyze in more detail an image encryption scheme, proposed in their earlier work, which preserves input image statistics and can be used in connection with the JPEG compression standard. The image encryption process takes advantage of fast linear transforms parametrized with private keys and is carried out prior to the compression stage in a way that does not alter those statistical characteristics of the input image that are crucial for the subsequent compression. This feature makes the encryption process transparent to the compression stage and enables the JPEG algorithm to maintain its full compression capabilities even though it operates on encrypted image data. The main advantage of the considered approach is that the JPEG algorithm can be used without any modifications as part of an encrypt-then-compress image processing framework. The paper includes a detailed mathematical model of the examined scheme, allowing theoretical analysis of the impact of the image encryption step on the effectiveness of the compression process. A combinatorial and statistical analysis of the encryption process is also included, allowing its cryptographic strength to be evaluated. In addition, the paper considers several practical use-case scenarios with different characteristics of the compression and encryption stages. The final part of the paper contains additional results of experimental studies on the general effectiveness of the presented scheme. The results show that, for a wide range of compression ratios, the considered scheme performs comparably to the JPEG algorithm alone (that is, without the encryption stage) in terms of quality measures of the reconstructed images. Moreover, the results of the statistical analysis, as well as those obtained with generally approved quality measures for image cryptographic systems, demonstrate the high strength and efficiency of the scheme's encryption stage.
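
The scheme itself relies on keyed fast linear transforms; as a simplified stand-in that shares the key property (the statistics JPEG relies on within each 8×8 block are untouched, so an unmodified encoder compresses the encrypted image about as well as the original), here is a keyed block-permutation sketch:

```python
import numpy as np
from PIL import Image

def encrypt_blocks(gray: np.ndarray, key: int, block: int = 8) -> np.ndarray:
    """Keyed permutation of 8x8 blocks; within-block statistics are preserved."""
    h = (gray.shape[0] // block) * block
    w = (gray.shape[1] // block) * block
    blocks = (gray[:h, :w]
              .reshape(h // block, block, w // block, block)
              .swapaxes(1, 2)
              .reshape(-1, block, block))
    perm = np.random.default_rng(key).permutation(len(blocks))
    return (blocks[perm]
            .reshape(h // block, w // block, block, block)
            .swapaxes(1, 2)
            .reshape(h, w))

# Encrypt first, then hand the result to an unmodified JPEG encoder.
gray = np.asarray(Image.open("input.png").convert("L"))
Image.fromarray(encrypt_blocks(gray, key=12345)).save("encrypted.jpg", quality=90)
```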


2018 · Vol 16 (2) · pp. 142-153
Author(s):  
Kristen M Cunanan
Alexia Iasonos
Ronglai Shen
Mithat Gönen

Background: In the era of targeted therapies, clinical trials in oncology are rapidly evolving: patients across multiple diseases are now enrolled and treated according to their genomic mutation(s). In such trials, known as basket trials, the different disease cohorts form the different baskets for inference. Several approaches have been proposed in the literature to efficiently use information from all baskets while simultaneously screening to find the individual baskets in which the drug works. Most proposed methods are developed in a Bayesian paradigm that requires specifying a prior distribution for a variance parameter, which controls the degree to which information is shared across baskets. Methods: A common approach used to capture the correlated binary endpoints across baskets is Bayesian hierarchical modeling. We evaluate a Bayesian adaptive design in the context of a non-randomized basket trial and investigate three popular prior specifications: an inverse-gamma prior on the basket-level variance, and a uniform prior and a half-t prior on the basket-level standard deviation. Results: Our simulation study shows that the inverse-gamma prior is highly sensitive to the input hyperparameters. When the prior mean of the variance parameter is set near zero, this can lead to unacceptably high false-positive rates in some scenarios. Thus, use of this prior requires a fully comprehensive sensitivity analysis before implementation. Alternatively, a prior that places sufficient mass in the tail, such as the uniform or half-t prior, displays desirable and robust operating characteristics over a wide range of prior specifications, with the caveat that the upper bound of the uniform prior and the scale parameter of the half-t prior must be larger than 1. Conclusion: Based on the simulation results, we recommend that those designing basket trials that implement hierarchical modeling avoid prior distributions that place the majority of the density mass near zero for the variance parameter. Priors with this property force the model to share information regardless of the true efficacy configuration of the baskets; many commonly used inverse-gamma prior specifications have this undesirable property. We recommend instead the more robust uniform prior or half-t prior on the standard deviation.
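
A minimal sketch of the kind of Bayesian hierarchical model evaluated here, written in PyMC with the recommended half-t prior on the basket-level standard deviation; the data and hyperparameter values are hypothetical:

```python
import numpy as np
import pymc as pm

# Hypothetical basket trial: responders out of patients in five disease baskets.
responders = np.array([3, 1, 6, 0, 2])
patients = np.array([10, 12, 15, 8, 11])

with pm.Model() as basket_model:
    mu = pm.Normal("mu", mu=0.0, sigma=10.0)            # shared mean log-odds
    sigma = pm.HalfStudentT("sigma", nu=3, sigma=10.0)  # half-t prior on the SD
    theta = pm.Normal("theta", mu=mu, sigma=sigma, shape=len(patients))
    pm.Binomial("y", n=patients, p=pm.math.invlogit(theta), observed=responders)
    idata = pm.sample(2000, tune=1000, random_seed=0)
```

The half-t's heavy tail lets the basket-level standard deviation move away from zero when the baskets genuinely differ, which is exactly the robustness property the authors recommend.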

