congruence test
Recently Published Documents


TOTAL DOCUMENTS: 14 (FIVE YEARS: 2)
H-INDEX: 4 (FIVE YEARS: 0)

2021 ◽  
Vol 2114 (1) ◽  
pp. 012034
Author(s):  
M. W. Alhamd ◽  
Aqeel Maryoosh Jary ◽  
Sadeq Naeem Atiyah ◽  
Nazar Ali Abbood

Abstract In this research, the entrance surface doses received by patients from conventional radiography in the most widespread examinations (chest, skull, abdomen, limbs, …) were measured for selected instruments at the Specialized Surgeries Hospital and at one private clinic in Baghdad; the instruments are of various origins and manufacture dates. A group of 10 patients was measured for each examination, and the resulting doses were averaged. The patients' doses were compared with reliable international standards, and the radiological doses were found to exceed the reference doses by factors ranging from 1.79 to 132 for most of the instruments; the ratio approaches one whenever the instrument is new and the radiographer is experienced. This increase is related to several factors discussed in detail in the research. Given the importance of quality assurance for x-ray instruments, three tests were performed on three of the instruments only: Beam alignment test: the beam alignment was measured, and the x-ray field was found to be symmetrical about the two axes of instruments A and B but deviated by about 3° from the vertical axis of instrument C. Optical and radiation field congruence test: the light field coincides with the radiation field for A and B but is misaligned for C. Focal spot size test: the focal spot area was measured with a star test tool, which showed that the focal spot of x-ray instrument C is smaller than specified by the international standards, in contrast to the focal spot area of x-ray instrument E, which conforms to the international standards. From these results it is concluded that instruments A and B have passed the quality assurance tests and are currently suitable for use, while instrument C has failed most of the quality assurance tests and should not be used for medical examinations. This research is considered the first to evaluate the doses of these instruments or to measure their quality assurance.
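
The dose comparison described above reduces to a ratio of the group-averaged entrance surface dose (ESD) to the corresponding reference level. Below is a minimal sketch of that calculation in Python; all numbers are illustrative placeholders, not the paper's measurements.

```python
# Hypothetical sketch (not the paper's data): average the entrance surface
# doses (ESD) of a patient group and compare with a reference level.

# Measured ESD values (mGy) for a group of 10 patients on one instrument/test.
measured_esd_mGy = [0.35, 0.41, 0.38, 0.44, 0.29, 0.52, 0.37, 0.40, 0.33, 0.46]

# Illustrative diagnostic reference level (mGy) for the same examination.
reference_esd_mGy = 0.30

mean_esd = sum(measured_esd_mGy) / len(measured_esd_mGy)
ratio = mean_esd / reference_esd_mGy  # > 1 means the dose exceeds the reference

print(f"mean ESD = {mean_esd:.3f} mGy, ratio to reference = {ratio:.2f}")
```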



2021 ◽  
pp. 002234332098265
Author(s):  
Valerie Sticher ◽  
Siniša Vuković

Research shows that conflict parties engage in ceasefires in pursuit of a variety of objectives, some of which reduce while others fuel violent conflict. This article provides a framework that links these objectives to a larger process. Building on bargaining theory, three distinct bargaining contexts are specified for intrastate conflicts. In the Diminishing Opponent context, leaders believe that a military solution yields a better outcome than a political settlement. In the Forcing Concessions context, they recognize the benefit of conflict settlement, but expectations about a mutually acceptable agreement still widely diverge. In the Enabling Agreement context, expectations converge, and leaders seek to pursue settlement without incurring further costs. In line with these readings, conflict party leaders adapt their strategic goal, from seeking to set up a military advantage, to boosting their bargaining power, to increasing the chances of a negotiated settlement. They may use ceasefires in the pursuit of any of these three goals, shifting the function of a ceasefire as they gain a better understanding of bargaining dynamics. A comparison of violence and ceasefire patterns in six contemporary peace processes and a congruence test conducted on the 2012–16 peace negotiations between the Colombian government and the guerilla organization FARC offer support for the theoretical framework. The findings highlight the important, and shifting, role ceasefires play in the transition from war to negotiated peace.



2020 ◽  
Vol 94 (12) ◽  
Author(s):  
Krzysztof Nowel

Abstract Deformation congruence models form the basis for conventional deformation analysis (CDA). In a geometrical sense, these models connect the epochal states of an object (represented by its characteristic points) at stable/congruent points in order to disclose possible deformations. To this day, deformation congruence models are usually specified using the global congruence test (GCT) procedure, which, however, has a weakness in the case of multiple displacements. More precisely, the GCT procedure is based on consecutive point-by-point specification, which may suffer from so-called displacement smearing. To overcome this weakness, a concept (two methods) involving combinatorial possibilities, revolutionary in the context of GCT, was suggested in recent years. Admittedly, this concept avoids the problem of consecutive point-by-point specification. Nevertheless, it introduces another weakness, namely the problem of comparing models of different dimensions. This paper takes a step forward in this new combinatorial field and discusses a more sophisticated combinatorial procedure, denoted as CIDIA. It is shown that, thanks to an appropriate use of combinatorics and of generalized likelihood ratio tests performed in the iterative detection–identification–adaptation (DIA) steps, both weaknesses can be overcome. In the context of GCT, the suggested procedure has an evolutionary rather than revolutionary character, and the general concepts of both procedures have a similar heuristic substantiation. To demonstrate the efficacy of CIDIA against GCT and the two existing combinatorial methods, various deformation scenarios were independently randomized many times in comprehensive computer simulations and then processed. In general, the results confirmed that the suggested CIDIA procedure, unlike the existing combinatorial methods, can be substantially more resistant to displacement smearing than the GCT procedure, at no significant cost. The efficacy of CIDIA, unlike that of the two existing combinatorial methods, was always higher (on average by several percentage points) than that of GCT for all considered deformation scenarios. At the same time, the CIDIA procedure proved substantially less time-consuming than the other combinatorial methods.
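
The combinatorial idea can be sketched compactly: instead of excluding points one by one (as in GCT), candidate subsets of stable points are tested directly with a global congruence statistic, and the largest subset that passes is kept. The following Python sketch illustrates that idea under simplifying assumptions (a known cofactor matrix, a chi-square critical value, exhaustive subset search); it is not the CIDIA procedure itself, whose iterative DIA steps and generalized likelihood ratio tests are considerably more involved.

```python
# Simplified stand-in for combinatorial congruence-model specification:
# test candidate subsets of stable points with a global congruence
# statistic; keep the largest subset that passes. NOT the CIDIA procedure.

import itertools
import numpy as np
from scipy import stats

def congruence_statistic(d, Q):
    """Quadratic form of stacked displacements d with cofactor matrix Q."""
    h = d.size
    return d @ np.linalg.solve(Q, d) / h, h

def largest_congruent_subset(diffs, Q_full, alpha=0.05):
    """Search point subsets (largest first) for a congruent/stable set.

    diffs  : (n_points, dim) inter-epoch coordinate differences
    Q_full : cofactor matrix of the stacked differences
    """
    n, dim = diffs.shape
    for k in range(n, 1, -1):                        # prefer larger stable sets
        for subset in itertools.combinations(range(n), k):
            idx = np.concatenate([np.arange(i * dim, (i + 1) * dim)
                                  for i in subset])
            d = diffs[list(subset)].ravel()
            T, h = congruence_statistic(d, Q_full[np.ix_(idx, idx)])
            crit = stats.chi2.ppf(1 - alpha, h) / h  # assumes known variance
            if T < crit:
                return subset, T, crit
    return (), None, None

# Toy example: 4 points in 2D, point 3 artificially displaced.
rng = np.random.default_rng(0)
diffs = rng.normal(0.0, 1e-3, (4, 2))
diffs[3] += 0.02                                     # simulated displacement
Q = np.eye(8) * 1e-6                                 # toy cofactor matrix
stable, T, crit = largest_congruent_subset(diffs, Q)
print("stable points:", stable)                      # expected: (0, 1, 2)
```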



2020 ◽  
Vol 55 (4) ◽  
pp. 717-717
Author(s):  
Kermarrec Gaël ◽  
Kargoll Boris ◽  
Alkhatib Hamza


2020 ◽  
Vol 55 (3) ◽  
pp. 495-513 ◽  
Author(s):  
Kermarrec Gaël ◽  
Kargoll Boris ◽  
Alkhatib Hamza

Abstract The detection of deformation is one of the major tasks in surveying engineering. It is meaningful only if the statistical significance of the distortions is correctly investigated, which often rests on a parametric modelization of the object under consideration. So-called regression B-spline approximation can be performed for point clouds from terrestrial laser scanners, allowing a specific congruence test to be set up based on the B-spline surfaces. Such tests are known to be strongly influenced by the stochastic model chosen for the observation errors. The latter has to be correctly specified, which includes accounting for heteroscedasticity and correlations. In this contribution, we justify and make use of a parametric correlation model called the Matérn model to approximate the variance-covariance matrix (VCM) of the residuals by performing their empirical mode decomposition. The VCM obtained is integrated into the computation of the congruence test statistics for a more trustworthy test decision. Using a real case study, we estimate the distribution of the test statistics with a bootstrap approach, where no parametric assumptions are made about the underlying population that generated the random sample. This procedure allows us to assess the impact of neglecting correlations on the critical value of the congruence test, highlighting their importance.
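
As a rough illustration of the two ingredients, a Matérn variance-covariance matrix and a bootstrapped critical value for a quadratic-form congruence statistic, consider the Python sketch below. The Matérn parameters, the plain quadratic form, and the parametric simulation of the null distribution are all illustrative assumptions; the paper itself estimates the correlation structure from the residuals and bootstraps without parametric assumptions about the population.

```python
# Sketch: (1) build a Matérn variance-covariance matrix (VCM) for correlated
# residuals, (2) bootstrap the null distribution of a quadratic-form
# congruence statistic under that VCM. All parameter values are illustrative.

import numpy as np
from scipy.special import gamma, kv   # kv = modified Bessel function K_nu

def matern_vcm(t, sigma2=1.0, rho=1.0, nu=1.5):
    """Matérn covariance matrix for observations at 1D positions t."""
    d = np.abs(t[:, None] - t[None, :])
    s = np.sqrt(2.0 * nu) * d / rho
    with np.errstate(invalid="ignore"):
        C = sigma2 * (2.0 ** (1.0 - nu) / gamma(nu)) * s ** nu * kv(nu, s)
    np.fill_diagonal(C, sigma2)       # kv is singular at 0; the limit is sigma2
    return C

def congruence_stat(x1, x2, Sigma):
    """Quadratic form in the difference of two estimates, weighted by Sigma."""
    d = x1 - x2
    return d @ np.linalg.solve(Sigma, d)

rng = np.random.default_rng(1)
t = np.linspace(0.0, 10.0, 30)
Sigma = matern_vcm(t, sigma2=0.01, rho=2.0, nu=1.5)
L = np.linalg.cholesky(Sigma + 1e-12 * np.eye(len(t)))

# Null case: two epochs of correlated noise around the same surface.
boot = []
for _ in range(2000):
    e1 = L @ rng.standard_normal(len(t))
    e2 = L @ rng.standard_normal(len(t))
    boot.append(congruence_stat(e1, e2, 2.0 * Sigma))  # Var(e1 - e2) = 2*Sigma
print(f"bootstrap 95% critical value: {np.quantile(boot, 0.95):.2f}")
# Computing the statistic with a diagonal VCM while the data are actually
# correlated would shift this critical value; that shift is the effect the
# paper quantifies.
```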



2020 ◽  
Vol 12 (3) ◽  
pp. 163-176
Author(s):  
Larisa F. Bayanova ◽  
Oleg G. Minyaev


2019 ◽  
Author(s):  
Larisa F. Bayanova ◽  
Oleg G. Minyaev


2016 ◽  
Vol 9 (4) ◽  
pp. 94-105
Author(s):  
Larisa F. Bayanova ◽  
Ekaterina A. Tsivilskaya ◽  
Roksana M. Bayramyan ◽  
Kirill S. Chulyukin


2014 ◽  
Vol 5 (3) ◽  
pp. 226-242 ◽  
Author(s):  
Carmelo Andújar ◽  
Víctor Soria-Carrasco ◽  
José Serrano ◽  
Jesús Gómez-Zurita


2014 ◽  
Vol 27 (2) ◽  
pp. 85 ◽  
Author(s):  
Lars Vogt

Popper’s falsificationism is frequently referred to as a general normative reference system in phylogenetics. Referring to falsificationism, phylogeneticists have made four central claims, including that frequency probabilities (1) cannot be used for inferring degrees of corroboration and (2) cannot be used in phylogenetics because phylogeny is a unique process, (3) likelihood methods represent verificationist approaches, and (4) the congruence test is a Popperian test. However, these claims are inconsistent with Popper’s theory. Moreover, phylogeneticists have proposed four strategies for dealing with the unfalsifiability of cladograms, including (1) treating the re-interpretation of putative synapomorphies as homoplasy as a Popperian ad hoc manoeuvre, (2) decoupling corroboration from falsification, (3) interpreting the tree with the highest likelihood as the most corroborated tree, and (4) interpreting tree hypotheses as Popperian probabilistic hypotheses that do not have to be falsifiable. These strategies are also inconsistent with Popper’s theory. Four fundamental problems, together with a problem in Popper’s formula for measuring degree of corroboration, demonstrate that Popper’s theory does not live up to its own claims. Moreover, neither the historical nor the experimental sciences can be conducted in a way that is consistent with the principles of falsificationism. Phylogeneticists should therefore stop referring to falsificationism when defending a specific methodological position.
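
For reference, the corroboration measure at issue is commonly cited in the form Popper gives in the appendices of The Logic of Scientific Discovery (the exact normalization varies slightly across his writings):

$$C(h, e) = \frac{p(e \mid h) - p(e)}{p(e \mid h) - p(e \wedge h) + p(e)}$$

where $h$ is the hypothesis and $e$ the evidence; $C(h, e)$ is high when $e$ is improbable in itself but probable given $h$.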


