Principal Methods of Experimental Design and Statistical Analysis Used in Experimental Studies of Machining Processes

2011 ◽  
Vol 264-265 ◽  
pp. 919-924
Author(s):  
M. Villeta ◽  
B. de Agustina ◽  
J.M. Sáenz de Pipaón ◽  
E.M. Rubio

In practice, many factors can affect the final outcome of machining processes, which is why a great number of experimental studies aimed at improving such processes have been carried out over recent decades. Because tests that account for factors such as machines, materials, tools, and cutting conditions require resources of various kinds and can cause environmental damage at different levels, diverse methods of experimental design and statistical analysis have been developed in the search for efficient experimentation. This paper focuses on the study of these statistical methods. It collects the principal types of experimental designs and statistical procedures used in experimental studies for the control and improvement of machining processes. Additionally, it reviews and analyzes the main works in the machining area for each of these statistical tools.
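As a concrete illustration of the simplest design of the kind surveyed here, the Python sketch below enumerates the runs of a 2³ full factorial experiment. The factor names and levels are made up for the example; the paper does not prescribe them.

```python
# Minimal sketch: enumerating a 2^3 full factorial design for a
# machining study. Factor names and levels are illustrative only.
from itertools import product

factors = {
    "cutting_speed_m_min": (100, 200),   # low / high level
    "feed_mm_rev":         (0.1, 0.3),
    "depth_of_cut_mm":     (0.5, 1.5),
}

# Each run is one combination of factor levels: 8 runs for 3 two-level factors.
runs = [dict(zip(factors, levels)) for levels in product(*factors.values())]
for i, run in enumerate(runs, start=1):
    print(f"run {i}: {run}")
```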

Author(s):  
Peter McCormick

Given the visibility and obvious importance of judicial power in the age of the Charter, it is important to develop the conceptual vocabulary for describing and assessing this power. One such concept, which has been applied to the study of appeal courts in the United States and Great Britain, is "party capability": a theory suggesting that different types of litigant will enjoy different levels of success as both appellant and respondent. Using a database derived from the reported decisions of the provincial courts of appeal for the second and seventh year of each decade since the 1920s, this article applies party capability theory to the performance of the highest courts of the ten provinces; comparisons are attempted across regions and across time periods, as well as with the findings of similar studies of American and British courts.


1980 ◽  
Vol 43 (6) ◽  
pp. 1771-1792 ◽  
Author(s):  
K. O. Johnson

1. This paper and a following paper deal with problems, such as the following, that arise in experimental studies of the neural mechanisms underlying sensory discrimination: What measures of neural activity are relevant in such a study? How can sample data from the responses of single neurons be combined to represent the information relayed by a population of neurons? How can neural data be compared with results from psychophysical studies? What assumptions are implicit in any such comparison? What are the implications of assuming that neurons respond independently or that they have homogeneous response properties? How can neural codes be assessed in a systematic way? Can psychophysical and neurophysiological observations be combined to infer mechanisms or relationships in the processes underlying discrimination? All of these questions require a theoretical framework before they can be answered. These papers set out such a framework, deal with most of those questions, and provide practicable formulas for relating sample data from neurophysiological experiments to behavioral measures derived from psychophysical experiments.

2. The processes that intervene between a relatively peripheral array of neural activity and a subject's decision in a discrimination task are split into two sections: a) the ascending sensory processes that provide the final patterns of neural activity on which discrimination is based, and b) a process that yields decisions of the type required by the experimental design used in the psychophysical study. The approach is to develop a theory of the decision process in this paper and then expand it to incorporate the ascending processes in the following paper.

3. The decision theory deals with a class of experimental designs in which a subject is required to make a decision about two stimuli S1 and S2 (e.g., S1 is larger than S2, S2 is the same as S1, S2 was the modified stimulus, and so on). A mathematical representation for experimental designs of this type is developed.

4. The decision process is analyzed in two forms: a) a multivariate form in which the discrimination decision results directly from the multidimensional neural representation of the two stimuli, and b) a bivariate form in which the final representation of each stimulus is a unidimensional variable. Conditions required for equivalence of these formulations are examined.

5. The theory includes as explicit variables a) the experimental design, b) the subject's discrimination strategy, c) bias, d) memory variance, e) bias variance, f) variance in the final neural representations of the stimuli, and g) their functional dependence on the stimuli that they represent.

6. Formulas are developed for the expected values of commonly used psychophysical measures such as the classical psychometric function, receiver operating characteristic (ROC) functions, the discriminatory separation index (d'), and the difference limen.

7. Optimum discrimination behavior is analyzed.
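For readers unfamiliar with the psychophysical measures listed in point 6, the following minimal Python sketch shows how the separation index d' is computed under the standard equal-variance Gaussian model; the hit and false-alarm rates are invented for the example.

```python
# Illustrative computation of the discriminability index d' from a
# yes/no discrimination experiment. The rates below are made up.
from scipy.stats import norm

hit_rate = 0.84          # P("S2 larger" | S2 actually larger)
false_alarm_rate = 0.16  # P("S2 larger" | S2 not larger)

# d' is the separation of the two internal distributions in SD units:
# d' = z(H) - z(F), where z is the inverse standard-normal CDF.
d_prime = norm.ppf(hit_rate) - norm.ppf(false_alarm_rate)
print(f"d' = {d_prime:.2f}")  # ~1.99 for these rates
```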


Author(s):  
Brianna N Gaskill ◽  
Joseph P Garner

The practical application of statistical power is becoming an increasingly important part of experimental design, data analysis, and reporting. Power is essential to estimating sample size as part of planning studies and obtaining ethical approval for them. Furthermore, power is essential for publishing and interpreting negative results. In this manuscript, we review what power is, how it can be calculated, and reporting recommendations if a null result is found. Power can be thought of as reflecting the signal-to-noise ratio of an experiment. The conventional wisdom that statistical power is driven by sample size (which increases the signal in the data), while true, is a misleading oversimplification. Relatively little discussion covers the use of experimental designs that control and reduce noise. Even small improvements in experimental design can achieve high power at much lower sample sizes than (for instance) a simple t test. Failure to report the experimental design or the proposed statistical test on animal care and use protocols creates a dilemma for IACUCs, because it is unknown whether sample size has been correctly calculated. Traditional power calculations, which are primarily provided for animal-number justifications, are only available for simple, yet low-powered, experimental designs, such as paired t tests. Thus, in most controlled experimental studies, the only analyses for which power can be calculated are those that inherently have low statistical power; these analyses should not be used because they require more animals than necessary. We provide suggestions for more powerful experimental designs (such as randomized block and factorial designs), and we describe methods to easily calculate sample size for these designs that are suitable for IACUC number justifications. Finally, we provide recommendations for reporting negative results, so that readers and reviewers can determine whether an experiment had sufficient power. The use of more sophisticated designs in animal experiments will improve power and reproducibility and reduce animal use.
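As an illustration of the kind of sample-size justification discussed above, the sketch below uses statsmodels' power module for a two-sample t test. The effect size, alpha, and power targets are assumptions for the example, not values from the paper.

```python
# Minimal sketch of a sample-size calculation for a two-sample t test.
# All targets below are illustrative assumptions.
from statsmodels.stats.power import TTestIndPower

analysis = TTestIndPower()
n_per_group = analysis.solve_power(
    effect_size=1.0,  # Cohen's d: the signal-to-noise ratio of the comparison
    alpha=0.05,       # type I error rate
    power=0.8,        # desired probability of detecting the effect
)
print(f"animals per group: {n_per_group:.1f}")  # ~17 per group

# Halving the noise (doubling d to 2.0) cuts the required n sharply,
# which is the point made above about design-driven noise reduction.
print(analysis.solve_power(effect_size=2.0, alpha=0.05, power=0.8))  # ~5 per group
```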


2022 ◽  
Author(s):  
Eline Van Geert ◽  
Christophe Bossens ◽  
Johan Wagemans

Do individuals prefer stimuli that are ordered or chaotic, simple or complex, or that strike the right balance of order and complexity? Earlier research mainly focused on the separate influences of order and complexity on aesthetic appreciation. When order and complexity were studied in combination, stimulus manipulations were often not parametrically controlled, only rather specific types of order (i.e., balance or symmetry) were studied, and/or the multidimensionality of order and complexity was ignored. Progress has also been limited by the lack of an easy way to create reproducible and expandable stimulus sets that include both order and complexity manipulations. The Order & Complexity Toolbox for Aesthetics (OCTA), a Python toolbox that is also available as a point-and-click Shiny application, aims to fill this gap. OCTA provides researchers with a free and easy way to create multi-element displays varying qualitatively (i.e., in type) and quantitatively (i.e., in level) in order and complexity, based on regularity and variety along multiple element features (e.g., shape, size, color, orientation). The standard vector-based output is ideal for experiments on the web and for the creation of dynamic interfaces and stimuli. OCTA will not only facilitate reproducible stimulus construction and experimental design in research on order, complexity, and aesthetics; it can also be a very useful tool in any type of research using visual stimuli, or even for creating digital art. To illustrate OCTA's potential, we propose several possible applications and diverse questions that can be addressed using OCTA.
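Since OCTA's own API is not reproduced in this abstract, the toy sketch below (plain Python, explicitly not OCTA code) merely illustrates the kind of multi-element vector display the toolbox generates: one knob for order (positional jitter) and one for complexity (color variety), with SVG output like OCTA's vector-based format.

```python
# Toy illustration (NOT OCTA's API): a grid of elements whose order and
# complexity are varied via positional jitter and color variety.
import random

def grid_svg(rows=5, cols=5, colors=("teal",), jitter=0.0, seed=1):
    """Render a rows x cols grid of circles as an SVG string.
    `jitter` adds positional disorder; len(colors) sets color complexity."""
    rng = random.Random(seed)
    cell = 40
    parts = [f'<svg xmlns="http://www.w3.org/2000/svg" '
             f'width="{cols * cell}" height="{rows * cell}">']
    for r in range(rows):
        for c in range(cols):
            x = c * cell + cell / 2 + rng.uniform(-jitter, jitter)
            y = r * cell + cell / 2 + rng.uniform(-jitter, jitter)
            parts.append(f'<circle cx="{x:.1f}" cy="{y:.1f}" r="12" '
                         f'fill="{rng.choice(colors)}"/>')
    parts.append("</svg>")
    return "\n".join(parts)

# An ordered, simple display vs. a disordered, more complex one:
open("ordered.svg", "w").write(grid_svg())
open("chaotic.svg", "w").write(grid_svg(jitter=10.0, colors=("teal", "orange", "purple")))
```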


2018 ◽  
Vol 16 (1) ◽  
pp. 112-119
Author(s):  
VLADIMIR GLEB NAYDONOV

The article considers students' tolerance as a spectrum of personal manifestations of respect for, acceptance of, and correct understanding of the rich diversity of the world's cultures and of the value of other personalities. The purpose of the study is to investigate education and the formation of tolerance among students. We compiled a training program to improve the level of tolerance for interethnic differences. Based on statistical analysis of the data obtained, the most important values significant for different levels of tolerance were identified.


Author(s):  
Б.И. Гельцер ◽  
Э.В. Слабенко ◽  
Ю.В. Заяц ◽  
В.Н. Котельников

Development of experimental models of acute cerebrovascular disease is essential for implementing methods for its prevention and treatment. One of the principal requirements for such models is maximum approximation to actual clinical practice. This review systematizes the major models of acute cerebral ischemia (ACI), presents their classification, and summarizes their advantages and shortcomings. It also presents results of experimental studies on the pathophysiological mechanisms of the different types of modeled ACI (complete and incomplete global, local, and multifocal ischemia) and the methods used to create these models (arterial ligation, clipping, coagulation, embolization, etc.). Particular attention is paid to the "stability" of the consequences of acutely impaired cerebral circulation: irreversible ischemic brain injury, or reversible injury with reperfusion of a given duration. The authors emphasize that such studies should give special significance to intravital imaging of foci of acute ischemic damage using modern methods, which allow the dynamics of the pathological process to be assessed and meet the requirements of humane treatment of animals. The choice of a relevant ACI model is determined by the objectives of the planned study and the technological resources available to the research laboratory.


2019 ◽  
Author(s):  
Yasin Orooji ◽  
Fatemeh Noorisafa ◽  
Nahid Imami ◽  
Amir R. Chaharmahali

Using experimental design and statistical analysis (a ½ fractional factorial design), this study investigates the effect of different membrane-fabrication parameters on the performance of a nanocomposite PES/TiO2 membrane.
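As a hedged sketch of how such a half-fraction design can be constructed (the factor names below are illustrative, not the study's actual fabrication parameters), a 2^(4-1) design generates the fourth factor from the three-way interaction of the first three:

```python
# Sketch: a 2^(4-1) half-fraction design with defining relation D = ABC.
# Factor names are invented for the example.
from itertools import product

runs = []
for a, b, c in product((-1, 1), repeat=3):
    d = a * b * c  # generator D = ABC yields the half fraction
    runs.append({"A_TiO2_load": a, "B_polymer_conc": b,
                 "C_bath_temp": c, "D_evap_time": d})

for run in runs:  # 8 runs instead of the 16 needed for a full 2^4 design
    print(run)
```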


Entropy ◽  
2021 ◽  
Vol 23 (4) ◽  
pp. 421
Author(s):  
Dariusz Puchala ◽  
Kamil Stokfiszewski ◽  
Mykhaylo Yatsymirskyy

In this paper, the authors analyze in more detail an image encryption scheme, proposed in their earlier work, which preserves input image statistics and can be used in connection with the JPEG compression standard. The image encryption process takes advantage of fast linear transforms parametrized with private keys and is carried out prior to the compression stage in a way that does not alter those statistical characteristics of the input image that are crucial from the point of view of the subsequent compression. This feature makes the encryption process transparent to the compression stage and enables the JPEG algorithm to maintain its full compression capabilities even though it operates on encrypted image data. The main advantage of the considered approach is that the JPEG algorithm can be used without any modifications as part of an encrypt-then-compress image processing framework. The paper includes a detailed mathematical model of the examined scheme, allowing for theoretical analysis of the impact of the image encryption step on the effectiveness of the compression process. A combinatorial and statistical analysis of the encryption process is also included, allowing its cryptographic strength to be evaluated. In addition, the paper considers several practical use-case scenarios with different characteristics of the compression and encryption stages. The final part of the paper contains additional results of experimental studies on the general effectiveness of the presented scheme. The results show that, for a wide range of compression ratios, the considered scheme performs comparably to the JPEG algorithm alone (that is, without the encryption stage) in terms of the quality measures of reconstructed images. Moreover, the results of the statistical analysis, as well as those obtained with generally approved quality measures of image cryptographic systems, demonstrate the high strength and efficiency of the scheme's encryption stage.
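The toy Python sketch below illustrates only the pipeline structure of the encrypt-then-compress idea: a key-parametrized orthogonal block transform followed by an unmodified JPEG stage. It is not the authors' transform and makes no claim to their statistics-preserving or cryptographic properties.

```python
# Toy encrypt-then-compress pipeline. The rotation-based block cipher is
# purely illustrative; it is NOT the scheme analyzed in the paper.
import numpy as np
from PIL import Image

def key_transform(key, n=8):
    """Derive an n x n orthogonal matrix from a key-seeded generator."""
    rng = np.random.default_rng(key)  # the private key seeds the parameters
    q, _ = np.linalg.qr(rng.standard_normal((n, n)))
    return q

def encrypt_blocks(img, key, n=8):
    h, w = (dim - dim % n for dim in img.shape)  # crop to whole blocks
    out = img[:h, :w].astype(np.float64).copy()
    q = key_transform(key, n)
    for r in range(0, h, n):
        for c in range(0, w, n):
            out[r:r+n, c:c+n] = q @ out[r:r+n, c:c+n] @ q.T
    return np.clip(out, 0, 255).astype(np.uint8)

# "input.png" is an assumed local grayscale test image.
gray = np.asarray(Image.open("input.png").convert("L"))
cipher = encrypt_blocks(gray, key=0xC0FFEE)
Image.fromarray(cipher).save("encrypted.jpg", quality=85)  # plain JPEG stage
```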


Molecules ◽  
2021 ◽  
Vol 26 (9) ◽  
pp. 2506
Author(s):  
Wamidh H. Talib ◽  
Ahmad Riyad Alsayed ◽  
Alaa Abuawad ◽  
Safa Daoud ◽  
Asma Ismail Mahmod

Melatonin is a pleiotropic molecule with numerous biological activities. Epidemiological and experimental studies have documented that melatonin can inhibit different types of cancer in vitro and in vivo. Results show the involvement of melatonin in different anticancer mechanisms, including apoptosis induction, inhibition of cell proliferation, reduction in tumor growth and metastases, reduction in the side effects associated with chemotherapy and radiotherapy, decreased drug resistance in cancer therapy, and augmentation of the therapeutic effects of conventional anticancer therapies. Clinical trials have revealed that melatonin is an effective adjuvant drug to all conventional therapies. This review summarizes melatonin biosynthesis, availability from natural sources, metabolism, bioavailability, anticancer mechanisms, use in clinical trials, and pharmaceutical formulation. The studies discussed in this review provide a solid foundation for researchers and physicians to design and develop new therapies to treat and prevent cancer using melatonin.

