statistical concept
Recently Published Documents

TOTAL DOCUMENTS: 83 (five years: 23)
H-INDEX: 10 (five years: 0)

2022 · Vol 24 (1) · pp. 273-287
Author(s): Nikita R. Nikam, Yogita M. Kolekar

Some ancient medications were used to make the hair-care herbal shampoo powder. Organoleptic evaluation, powder characterisation, foam tests, and physical evaluation were performed on Tulsi, Shikakai, Henna, Bahera, Amla, Neem, and Brahmi. The present investigations will help set standards and assessment criteria, which will undoubtedly aid in standardising the quality and purity of these herbal shampoo powders once the selected drugs are used together. We optimised the formula using Design of Experiments under the Quality by Design approach. This paper illustrates a broad theoretical and practical view of advanced screening design, together with the statistical concepts of regression analysis, Pareto charts, residual diagnostics, main effect plots, interaction effect plots, design space, and multiple response prediction.
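As a minimal illustration of the screening-design workflow the abstract names (not the authors' actual design or data), the sketch below builds a two-level full factorial over three hypothetical formulation factors, simulates a foam-volume response, estimates main effects, and ranks them Pareto-style by magnitude:

```python
import numpy as np
from itertools import product

# Hypothetical factors and simulated responses, for illustration only.
factors = ["Shikakai %", "Amla %", "Neem %"]
design = np.array(list(product([-1, 1], repeat=3)))  # 2^3 coded runs

rng = np.random.default_rng(0)
# Simulated foam volume: Shikakai dominates, Amla has a smaller effect.
response = 100 + 8 * design[:, 0] + 3 * design[:, 1] + rng.normal(0, 1, 8)

# Main effect of each factor = mean(high-level runs) - mean(low-level runs)
effects = {
    name: response[design[:, i] == 1].mean() - response[design[:, i] == -1].mean()
    for i, name in enumerate(factors)
}

# Pareto-style ranking: largest absolute effect first
for name, eff in sorted(effects.items(), key=lambda kv: -abs(kv[1])):
    print(f"{name}: {eff:+.2f}")
```

In a real Quality by Design study the effects would then be screened against a significance threshold (e.g., via a Pareto chart with a reference line) before building the design space.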


2021 · Vol 2 (2) · pp. 89-97
Author(s): Sandheep Sugathan, Lilli Jacob

Background: To describe various measures of effect size, how each can be calculated, and the scenarios in which each measure can be applied. Methods: Researchers can display effect size measures in research articles that evaluate the difference between the means of continuous variables in different groups, or the difference in proportions of outcomes between groups of individuals. When a p-value alone is displayed in a research article, without the effect size, readers may not get a correct picture of the effect of the independent variable on the outcome variable. Results: Effect size is a statistical concept that measures the actual difference between groups, or the strength of the relationship between two variables, on a numeric scale. Conclusion: Effect size measures in scientific publications can communicate the actual difference between groups or the estimated association between variables, not just whether the association or difference is statistically significant. Researchers can make their findings more interpretable by displaying a suitable measure of effect size. Effect size measures also help researchers perform meta-analyses by combining data from multiple research articles.
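One of the most common effect size measures for the difference between two group means is Cohen's d; a minimal sketch (with made-up data, not the article's) of its standard pooled-SD form:

```python
import numpy as np

# Cohen's d for two independent groups, using the pooled standard deviation.
def cohens_d(a, b):
    a, b = np.asarray(a, float), np.asarray(b, float)
    na, nb = len(a), len(b)
    pooled_var = ((na - 1) * a.var(ddof=1) + (nb - 1) * b.var(ddof=1)) / (na + nb - 2)
    return (a.mean() - b.mean()) / np.sqrt(pooled_var)

# Hypothetical outcome measurements in two groups.
group_a = [5.1, 4.9, 5.6, 5.8, 5.2]
group_b = [4.2, 4.5, 4.1, 4.8, 4.3]
print(f"Cohen's d = {cohens_d(group_a, group_b):.2f}")
```

Reporting d alongside the p-value lets the reader judge whether a statistically significant difference is also practically meaningful.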


2021 · Vol 3 (3)
Author(s): Pim Cuijpers

Background: Most meta-analyses use the 'standardised mean difference' (effect size) to summarise the outcomes of studies. However, the effect size has important limitations that need to be considered. Method: After a brief explanation of the standardised mean difference, limitations are discussed and possible solutions in the context of meta-analyses are suggested. Results: When using the effect size, three major limitations have to be considered. First, the effect size is still a statistical concept: small effect sizes may have considerable clinical meaning while large effect sizes may not. Second, specific assumptions of the effect size may not be correct. Third, and most importantly, it is very difficult to explain the meaning of the effect size to non-researchers. As possible solutions, the use of the 'binomial effect size display' and the number-needed-to-treat are discussed. Furthermore, I suggest the use of binary outcomes, which are often easier to understand. However, it is not clear what the best binary outcome is for continuous outcomes. Conclusion: The effect size is still useful, as long as the limitations are understood and binary outcomes are also reported.
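Two of the translations of the standardised mean difference d mentioned above can be sketched directly (formulas are the standard textbook ones, not code from the article): the binomial effect size display via r = d/sqrt(d² + 4), and the common-language effect size, the probability that a randomly chosen treated individual outscores a randomly chosen control, Phi(d/√2):

```python
from math import sqrt
from statistics import NormalDist

def besd(d):
    # Binomial effect size display: convert d to r, then to two "success" rates.
    r = d / sqrt(d * d + 4)
    return 0.5 - r / 2, 0.5 + r / 2

def common_language(d):
    # P(random treated score > random control score) under normality.
    return NormalDist().cdf(d / sqrt(2))

d = 0.5  # a "medium" effect by Cohen's convention
low, high = besd(d)
print(f"BESD: {low:.2f} vs {high:.2f}")
print(f"P(treated > control) = {common_language(d):.2f}")
```

Both restatements are easier to explain to non-researchers than "d = 0.5" on its own, which is exactly the limitation the abstract highlights.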


Author(s): Luke J. Zachmann, Erin M. Borgman, Dana L. Witwicki, Megan C. Swan, Cheryl McIntyre, ...

Abstract: We describe the application of Bayesian hierarchical models to the analysis of data from long-term environmental monitoring programs. The goal of these ongoing programs is to understand status and trend in natural resources. Data are usually collected using complex sampling designs including stratification, revisit schedules, finite populations, unequal probabilities of inclusion of sample units, and censored observations. Complex designs intentionally create data that are missing from the complete data that could theoretically be obtained. This "missingness" cannot be ignored in analysis. Data collected by monitoring programs have traditionally been analyzed using the design-based Horvitz–Thompson estimator to obtain point estimates of means and variances over time. However, Horvitz–Thompson point estimates cannot support inference on temporal trend or on the predictor variables that might explain trend; that requires model-based inference. The key to applying model-based inference to data arising from complex designs is to include information about the sampling design in the analysis. The statistical concept of ignorability provides a theoretical foundation for meeting this requirement. We show how Bayesian hierarchical models provide a general framework supporting inference on status and trend, using data from the National Park Service Inventory and Monitoring Program as examples. Supplemental Materials: Code and data for implementing the analyses described here can be accessed at https://doi.org/10.36967/code-2287025.
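The design-based baseline the abstract contrasts against is simple to state; a minimal sketch (toy numbers, not the paper's data) of the Horvitz–Thompson estimator, which weights each sampled value by the inverse of its inclusion probability:

```python
import numpy as np

def horvitz_thompson(y, pi, N):
    # y: sampled values; pi: inclusion probability of each sampled unit;
    # N: number of units in the finite population.
    y, pi = np.asarray(y, float), np.asarray(pi, float)
    total = np.sum(y / pi)        # estimated population total
    return total, total / N       # (total, population mean)

# Toy example: 4 sampled units from a population of N = 100,
# drawn with unequal inclusion probabilities.
y = [10.0, 12.0, 8.0, 15.0]
pi = [0.05, 0.05, 0.10, 0.10]
total_hat, mean_hat = horvitz_thompson(y, pi, N=100)
print(total_hat, mean_hat)
```

The estimator is unbiased for status (the current mean or total) but, as the abstract notes, it provides no model of temporal trend or covariates; that is the gap the Bayesian hierarchical approach fills.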


2021 · Vol 8 (9) · pp. 116-132
Author(s): Carl H. D. Steinmetz

This article answers the question: is the use of the words inclusion and diversity an expression of institutional racism? In almost all Western countries, immigrants and refugees barely penetrate the upper levels of organizations; they are mostly found in the lower echelons. To put it irreverently: the dirtier and heavier the work, the more immigrants and refugees are found there. In governments and parliaments, too, immigrants (not even the second, third, and fourth generations) are hardly to be found, so the good example is lacking. The article starts with an etymological examination of inclusion and diversity. The outcome is briefly summarized: "we want YOU, immigrant and/or refugee, to come and work for us because we are not allowed to hire only natives". That human rights are violated in this way does not seem to be an issue. Furthermore, we argue that it is precisely the words inclusion and diversity that prevent the recruitment of immigrants and refugees, as well as expats who have lost their jobs, from being given high priority. This article proposes two new terms as just and equitable alternatives to inclusion and diversity: first, the statistical concept of representativeness, and second, in support of it, the concept of wanting to mirror the population of the neighbourhood, city, or country that the leadership of the company or institution believes should be part of the work organization, from low to high. To further support the argument for replacing the concepts of inclusion and diversity, Moscovici's (2001) concept of social representation is used. The article also looks at existing toolboxes and toolkits that Western countries have developed to ensure that government organizations, institutions independent of government, and businesses are representative of neighbourhoods, cities, and countries.
Companies in particular are committed to this because they understand better than anyone that their paying customers are also immigrants, refugees, and expats. This knowledge of toolboxes and toolkits was helpful in developing a guideline for government organizations, institutions independent of government, and companies, and therefore also parliaments and governments. The guideline also addresses violations; to handle them, it proposes a self-learning model for teams in organizations that is consistent with enforcing the Working Conditions Act in Western countries.


Author(s): Mudit M. Saxena

Six Sigma is a methodology for process improvement as well as a statistical concept that seeks to quantify the variation inherent in any process. A Six Sigma process produces 3.4 defects per million opportunities; i.e., 99.99966% of its output is defect-free. Firms can improve their sigma level by incorporating core principles of the Six Sigma methodology into leadership styles, process management, and improvement activities. The main principle of the technique is a focus on the customer. There are many challenges in implementing Six Sigma. A well-run manufacturing team can make the entire firm more successful through cost-saving measures, increased quality, and a larger inventory of products that the company can market. The Six Sigma objective is to ensure the process has minimal defects (3.4 defects per million opportunities). Every aspect of the process must be carefully planned and documented in detail for manufacturing to run efficiently. The main aim of Six Sigma in the manufacturing industry is to maximize financial returns.
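The "3.4 defects per million opportunities" figure follows from the conventional DPMO-to-sigma conversion with the customary 1.5-sigma long-term shift; a small sketch (with made-up defect counts) of that calculation:

```python
from statistics import NormalDist

def sigma_level(defects, units, opportunities_per_unit):
    # Defects per million opportunities.
    dpmo = defects / (units * opportunities_per_unit) * 1_000_000
    # Sigma level = z-score of the yield plus the conventional 1.5-sigma shift.
    return NormalDist().inv_cdf(1 - dpmo / 1_000_000) + 1.5

# Hypothetical counts giving 3.4 DPMO, the canonical "six sigma" level.
print(round(sigma_level(defects=34, units=100_000, opportunities_per_unit=100), 2))
```

The same function shows how far a typical process is from the goal: around 66,807 DPMO corresponds to roughly three sigma.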


Sensors · 2021 · Vol 21 (15) · pp. 5036
Author(s): Marius Minea, Cătălin Marian Dumitrescu, Viviana Laetitia Minea

The article presents research in the field of complex sensing, detection, and recovery of communications network applications and hardware in case of failures, malfunctions, or unauthorized intrusions. A case study comparing Davis AI engine operation with human maintenance operation examines the efficiency of artificial intelligence agents in detecting faulty operation, in the context of the growing complexity of communications networks and the future development of the Internet of Things, big data, smart cities, and connected vehicles. In the second part of the article, a new solution is proposed for detecting application faults or unauthorized intrusions in communications network traffic. The first objective of the proposed method is to predict time series, using a multi-resolution decomposition of the signals based on the undecimated wavelet transform (UWT). The second approach for assessing traffic flow is based on the analysis of long-range dependence (LRD). The degree of long-range dependence is estimated via the Hurst parameter of the analyzed time series. This is a relatively new statistical concept in communications traffic analysis and can be implemented using the UWT. The property has important implications for network performance, design, and sizing: the presence of long-range dependence in network traffic is assumed to have a significant impact on performance, and the occurrence of LRD can be the result of faults that occur during certain periods. The strategy chosen for this purpose is based on the long-term dependence of traffic, and for the prediction of fault occurrence, a model predictive control (MPC) scheme combined with a radial basis function (RBF) neural network is proposed.
It is demonstrated via simulations that, in the case of communications traffic, time localization is the most important feature of the proposed algorithm.
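The Hurst parameter the article estimates with wavelets can also be approximated with the classic rescaled-range (R/S) method; the sketch below is that simpler alternative on synthetic data, not the authors' UWT estimator. H near 0.5 indicates no long-range dependence, H approaching 1 indicates strong persistence:

```python
import numpy as np

def hurst_rs(series, min_chunk=8):
    # Estimate H from the slope of log(R/S) versus log(window size).
    series = np.asarray(series, float)
    n = len(series)
    sizes, rs_vals = [], []
    size = min_chunk
    while size <= n // 2:
        rs = []
        for start in range(0, n - size + 1, size):
            chunk = series[start:start + size]
            dev = np.cumsum(chunk - chunk.mean())   # cumulative deviations
            r = dev.max() - dev.min()               # range
            s = chunk.std(ddof=0)                   # scale
            if s > 0:
                rs.append(r / s)
        sizes.append(size)
        rs_vals.append(np.mean(rs))
        size *= 2
    h, _ = np.polyfit(np.log(sizes), np.log(rs_vals), 1)
    return h

rng = np.random.default_rng(1)
white_noise = rng.normal(size=4096)          # no long-range dependence
random_walk = np.cumsum(white_noise)         # strongly persistent
print(f"H (white noise)  ~ {hurst_rs(white_noise):.2f}")
print(f"H (random walk)  ~ {hurst_rs(random_walk):.2f}")
```

A sustained shift of the estimated H for live traffic would be the kind of anomaly signal the paper's fault-detection strategy looks for.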


2021 · Vol 9
Author(s): Alexandre Dunant

This paper presents a generalization of the bias-variance tradeoff applied to the recent trend toward natural multi-hazard risk assessment. The bias-variance dilemma, a well-known concept in machine learning theory, is presented in the context of natural hazard modeling. It is then argued that the bias-variance concept can provide an analytical framework motivating efforts toward systemic risk assessment using multi-hazard catastrophe modeling, and can inform future mitigation practices.
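The dilemma itself is easy to demonstrate numerically; the toy simulation below (an illustration of the statistical concept, not of the paper's hazard models) fits polynomials of increasing flexibility to noisy samples of a smooth function and decomposes the error at a test point into squared bias and variance over repeated training sets:

```python
import numpy as np

rng = np.random.default_rng(0)
true_f = lambda x: np.sin(2 * np.pi * x)
x_train = np.linspace(0, 1, 15)
x_test = 0.3

results = {}
for degree in (1, 3, 9):
    preds = []
    for _ in range(300):  # 300 independent noisy training sets
        y = true_f(x_train) + rng.normal(0, 0.3, x_train.size)
        preds.append(np.polyval(np.polyfit(x_train, y, degree), x_test))
    preds = np.array(preds)
    bias2 = (preds.mean() - true_f(x_test)) ** 2
    results[degree] = (bias2, preds.var())

for degree, (bias2, var) in results.items():
    print(f"degree {degree}: bias^2 = {bias2:.4f}, variance = {var:.4f}")
```

The rigid model (degree 1) shows high bias and low variance; the flexible model (degree 9) the reverse, which is the tradeoff the paper maps onto single-hazard versus systemic multi-hazard modeling.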


2021
Author(s): Thuan Vo Van

Abstract: The recent state-of-the-art double-slit experiments with single electrons and single photons seem to pose a contradictory dilemma concerning ontological physical reality in quantum physics. Because of the importance of this problem, we propose and perform another modified laser-beam asymmetrical double-slit experiment. In the results, a Feynman condition with a closing mask allows a qualitative assessment of the interference contributions of photons passing through one slit or the other. Moreover, a definite "which-way" phenomenon has been identified with high experimental confidence. This would be the simplest way, without any disturbance of the photon beam, to observe simultaneously both path and momentum in consistency with the quantum statistical concept.


2021 · Vol 12
Author(s): Jonathan P. R. Scott, Andreas Kramer, Nora Petersen, David A. Green

Exposure to the spaceflight environment results in profound multi-system physiological adaptations in which there appears to be substantial inter-individual variability (IV) between crewmembers. However, performance of countermeasure exercise renders it impossible to separate the effects of the spaceflight environment alone from those associated with exercise, whilst differences in exercise programs, spaceflight operations constraints, and environmental factors further complicate the interpretation of IV. In contrast, long-term head-down bed rest (HDBR) studies isolate (by means of a control group) the effects of mechanical unloading from those associated with countermeasures and control many of the factors that may contribute to IV. In this perspective, we review the available evidence of IV in response to the spaceflight environment and discuss factors that complicate its interpretation. We present individual data from two 60-d HDBR studies that demonstrate that, despite the highly standardized experimental conditions, marked quantitative differences still exist in the response of the cardiorespiratory and musculoskeletal systems between individuals. We also discuss the statistical concept of “true” and “false” individual differences and its potential application to HDBR data. We contend that it is currently not possible to evaluate IV in response to the spaceflight environment and countermeasure exercise. However, with highly standardized experimental conditions and the presence of a control group, HDBR is suitable for the investigation of IV in the physiological responses to gravitational unloading and countermeasures. Such investigations may provide valuable insights into the potential role of IV in adaptations to the spaceflight environment and the effectiveness of current and future countermeasures.
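The "true" versus "false" individual differences concept the authors invoke has a simple published estimator (often attributed to Atkinson and Batterham): the SD of true individual responses is obtained by contrasting the SD of change scores in the intervention arm with that in the control arm. A sketch with hypothetical numbers:

```python
from math import sqrt

def sd_individual_response(sd_change_exp, sd_change_con):
    # SD of true individual responses; a non-positive difference implies
    # no evidence of individual differences beyond measurement/within-subject noise.
    diff = sd_change_exp ** 2 - sd_change_con ** 2
    return sqrt(diff) if diff > 0 else 0.0

# Hypothetical change-score SDs (e.g., VO2max change after 60-d HDBR,
# intervention vs. control), in the outcome's own units.
print(sd_individual_response(4.0, 2.5))
```

This is exactly why the control group in HDBR studies matters: without it, apparent response variability cannot be separated from random within-subject variation.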

