VERIFICATION OF DEPENDENCES APPROXIMATING THE DIAGRAMS OF DEFORMATION OF CEMENT AND POLYMER CONCRETE BY THE METHOD OF NORMALIZED INDICATORS

2021 ◽  
Vol 93 (1) ◽  
pp. 125-133
Author(s):  
V.P. SELYAEV ◽  
P.V. SELYAEV ◽  
S.YU. GRYAZNOV ◽  
D.R. BABUSHKINA ◽  
...  

The article verifies several approximating power-law and hyperbolic dependences between stresses σ and deformations ε against experimental deformation diagrams of cement concrete and polymer concrete. When analyzing the state and residual life of reinforced concrete structures, one has to determine the relationship between stresses and deformations in various design sections of the structure. The traditional approach, based on fitting the approximating function "σ – ε" to the numerical values of a deformation diagram obtained by testing samples (cubes, prisms, cylinders), is practically impossible in this setting. Therefore, an alternative approach is proposed, based on selecting an approximating function from standardized indicators: ultimate strength (σ_bu), modulus of elasticity (E_b0), and ultimate deformation (ε_bu). The numerical values of the normalized indicators can be determined at a given point by analyzing the results of indenter tests on the material of the structure. Power functions, which are the most suitable for materials with a fractal structure, are considered as the approximating functions. Various boundary conditions for determining the constant coefficients α and β from the system of normalized indicators are considered, and the graphs of changes in the tangent moduli are analyzed.
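As an illustration of how normalized indicators can fix the constants of such an approximation, the sketch below (not the authors' formulation) assumes one plausible family, σ(ε) = E_b0·ε − a·ε^β, with the hypothetical boundary conditions σ(ε_bu) = σ_bu and dσ/dε = 0 at ε_bu; both constants then follow in closed form from the three normalized indicators:

```python
def power_law_sigma(eps, sigma_bu, E_b0, eps_bu):
    """Stress at strain eps for the assumed family
    sigma(e) = E_b0*e - a*e**beta, calibrated so that
    sigma(eps_bu) = sigma_bu and sigma'(eps_bu) = 0
    (peak of the diagram at ultimate deformation)."""
    # sigma'(eps_bu) = 0  =>  a = E_b0 / (beta * eps_bu**(beta-1))
    # sigma(eps_bu) = sigma_bu  =>  beta = 1 / (1 - sigma_bu/(E_b0*eps_bu))
    beta = 1.0 / (1.0 - sigma_bu / (E_b0 * eps_bu))
    a = E_b0 / (beta * eps_bu ** (beta - 1.0))
    return E_b0 * eps - a * eps ** beta

# Example with typical concrete-like values (assumed, not from the article):
# sigma_bu = 30 MPa, E_b0 = 30000 MPa, eps_bu = 0.002
peak = power_law_sigma(0.002, 30.0, 30000.0, 0.002)
```

The initial tangent modulus of this family is E_b0 by construction, so all three normalized indicators are reproduced exactly.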

2016 ◽  
Vol 62 (7) ◽  
pp. 959-965 ◽  
Author(s):  
Martín Yago ◽  
Silvia Alcover

Abstract BACKGROUND According to the traditional approach to statistical QC planning, the performance of QC procedures is assessed in terms of their probability of rejecting an analytical run that contains critical-size errors (PEDC). Recently, the maximum expected increase in the number of unacceptable patient results reported during the presence of an undetected out-of-control error condition [Max E(NUF)] has been proposed as an alternative QC performance measure because it is better aligned with the ongoing introduction of risk management concepts for QC planning in the clinical laboratory. METHODS We used a statistical model to investigate the relationship between PEDC and Max E(NUF) for simple QC procedures widely used in clinical laboratories and to construct charts relating Max E(NUF) to the capability of the analytical process, which allow for QC planning based on the risk of harm to a patient due to the report of erroneous results. RESULTS A QC procedure shows nearly the same Max E(NUF) value when used for controlling analytical processes with the same capability, and there is a close relationship between PEDC and Max E(NUF) for simple QC procedures; therefore, the value of PEDC can be estimated from the value of Max E(NUF) and vice versa. QC procedures selected for their high PEDC value are also characterized by a low value of Max E(NUF). CONCLUSIONS The PEDC value can be used for estimating the probability of patient harm, allowing for the selection of appropriate QC procedures in QC planning based on risk management.
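The probability-of-rejection core underlying PEDC can be written down analytically for a simple rule. The snippet below is an illustration, not the authors' model: it assumes a 1_3s-style rule (reject if any of n control results falls outside ±3 SD) and computes the rejection probability under a systematic shift of delta SDs, plus the geometric expected number of runs until detection:

```python
from math import erf, sqrt

def phi(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def p_reject(delta, limit=3.0, n=2):
    """Probability that a 1_ks rule with n control measurements
    rejects a run when a systematic shift of delta SDs is present."""
    p_inside = phi(limit - delta) - phi(-limit - delta)
    return 1.0 - p_inside ** n

def expected_runs_to_detect(delta, **kw):
    """Mean number of runs until the error condition is detected,
    assuming independent runs (geometric distribution)."""
    return 1.0 / p_reject(delta, **kw)
```

The expected number of unacceptable results reported before detection would additionally depend on the run size and the probability that an individual patient result exceeds its allowable error, which this sketch deliberately leaves out.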


1979 ◽  
Vol 25 (6) ◽  
pp. 863-869 ◽  
Author(s):  
J O Westgard ◽  
T Groth

Abstract We have studied power functions for several control rules by use of a computer simulation program. These power functions show the relationship between the probability for rejection and the size of the analytical errors that are to be detected. They allow some assessment of the quality available from present statistical control systems and provide some guidance in the selection of control rules and numbers of control observations when new control systems are designed.
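A power function of this kind can also be estimated by straightforward Monte Carlo simulation, in the spirit of the simulation program described (the rule, limits, and trial counts below are arbitrary illustrative choices, not the authors' parameters):

```python
import random

def simulate_power(rule_limit, n_controls, shift, trials=20000, seed=1):
    """Monte Carlo estimate of the probability of run rejection for a
    simple 1_ks control rule: reject when any of n_controls observations
    falls outside +/- rule_limit SDs, under a systematic shift (in SDs)."""
    rng = random.Random(seed)
    rejected = 0
    for _ in range(trials):
        if any(abs(rng.gauss(shift, 1.0)) > rule_limit
               for _ in range(n_controls)):
            rejected += 1
    return rejected / trials

# Sweeping `shift` from 0 upward traces out the power function:
curve = [simulate_power(3.0, 2, s) for s in (0.0, 1.0, 2.0, 3.0, 4.0)]
```

Plotting rejection probability against the error size reproduces the characteristic S-shaped power function used to compare control rules and numbers of control observations.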


1996 ◽  
Vol 25 (3) ◽  
pp. 267-276 ◽  
Author(s):  
Gary B. Brumback

The article begins with a brief review of the traditional approach to screening people on the ethical dimension. Next, the traditional approach to selection in general is critiqued, and a model of an alternative approach is presented. The article closes with a checklist of suggestions on how to recruit and select people in an ethical manner. A few years ago in this journal, the elements of a strategy for putting ethics back to work in government were proposed [1]. One of them is the Maginot line of defense against wrongdoing: the selection of the right people in the first place. They are quality people who are more consistently competent, committed, and ethical. Without them, an organization cannot hope to achieve total quality performance, or the right results achieved in the right ways. Traditional pre-employment screening on the ethical dimension is briefly reviewed and evaluated first. A non-traditional model is then presented for screening people holistically on all quality dimensions. The article concludes with suggestions for recruiting and selecting in an ethical manner.


1994 ◽  
Vol 50 ◽  
pp. 111-121 ◽  
Author(s):  
Peter Groot

This article deals with an alternative approach to the problem of the selection of words in foreign language teaching at higher levels. Since frequency as a selection criterion is inadequate beyond the first 2,500 words, it is argued that the question of how many and which words can only be answered on the basis of quantitative data on the relationship between lexical coverage and text comprehension. Only after establishing what coverage goes with what level of comprehension will it be possible to come forward with valid suggestions concerning the number of words required for reading authentic L2 texts. As to which words, it is argued that these should be selected from a corpus of 10,000 words (beyond the first 2,500) constructed on the basis of a combination of selection criteria such as frequency, valency, etc. Knowledge of any 5,000 words from this corpus (combined with the first 2,500) will yield such dense coverage (at least 95%) of general L2 texts that the meaning of any remaining unknown words can be deduced through the use of contextual clues. To substantiate this claim, the results of some experimental investigations of the relationship between coverage and text comprehension are reported.
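Lexical coverage as used here is simply the fraction of running words in a text that belong to the learner's known vocabulary. A minimal sketch of that computation (illustrative; real studies use lemmatization and word-family lists rather than raw tokens):

```python
def lexical_coverage(text_tokens, known_words):
    """Fraction of running words (tokens) in a text that are covered
    by a given vocabulary. Tokens are matched case-insensitively."""
    if not text_tokens:
        return 0.0
    covered = sum(1 for t in text_tokens if t.lower() in known_words)
    return covered / len(text_tokens)

# Toy example: 5 of 6 running words are known -> coverage ~0.83.
tokens = "the cat sat on the mat".split()
cov = lexical_coverage(tokens, {"the", "cat", "sat", "on"})
```

The article's claim can then be read as: a vocabulary reaching roughly 0.95 coverage on general L2 texts leaves few enough unknown tokens for contextual guessing to work.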


Author(s):  
А. I. Grabovets ◽  
V. P. Kadushkina ◽  
S. А. Kovalenko

With the growing aridity of the climate on the Don, it became necessary to improve the methodology for breeding spring durum wheat. The main method of obtaining source material remains intraspecific step hybridization. Crosses were performed between genetically distant forms differing in origin and in the required traits and properties. The use of chemical mutagenesis was a productive way to change the heredity of genotypes with respect to drought tolerance. When breeding for productivity, both in dry and in favorable years of research, the most objective markers were the size of the aboveground mass, the mass of grain per plant and per spike, and the harvest index. The magnitudes of the correlation coefficients between the yield per unit area and the elements of its structure were established: yield was most closely associated with them in dry years, while in wet years the association decreased. The correlation between grain yield per square meter and aboveground biomass averaged r = 0.73, and in dry years it was higher (0.91) than in favorable ones (0.61–0.70); between yield and harvest index, r = 0.81 on average, rising to 0.92 in dry years. The research data confirm the greatest importance of the mass of grain per ear and per plant in the formation of grain yield per unit area in both dry and wet years. In dry years, the correlation coefficient between yield and grain mass per plant averaged r = 0.80; in favorable years, r = 0.69. The relationship between yield and grain mass per ear was stronger: r = 0.84 and r = 0.82, respectively. Consequently, the breeding significance of the aboveground mass and the productivity of the ear as selection criteria increases especially in dry years; they were basic in selection.
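The correlation coefficients reported above are ordinary Pearson r values between yield and each structural trait. A minimal sketch of the computation (illustrative; the data below are made up, not the study's measurements):

```python
def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length
    sequences of observations."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / (sxx * syy) ** 0.5

# Hypothetical plot-level data: yield vs. grain mass per plant.
yield_g_m2 = [310, 280, 350, 400, 260]
grain_per_plant = [1.9, 1.7, 2.2, 2.6, 1.6]
r = pearson_r(yield_g_m2, grain_per_plant)
```

Values near 0.8–0.9, as in the dry years reported, indicate that selecting on the trait moves yield almost proportionally.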


2013 ◽  
Vol 5 (2) ◽  
pp. 48-53
Author(s):  
William Aprilius ◽  
Lorentzo Augustino ◽  
Ong Yeremia M. H.

The University Course Timetabling Problem is faced by every university, including Universitas Multimedia Nusantara. Timetabling is done by allocating time and space so that every associated class and course can be scheduled. In this paper, the problem is solved using the MAX-MIN Ant System algorithm, a variant of ant colony optimization. The algorithm uses two pheromone tables as stigmergy: a timeslot pheromone table and a room pheromone table. In addition, the selection of timeslot and room is done using the standard deviation of the pheromone values. Testing was carried out with 105 events, 45 timeslots, and 3 categories based on the number of rooms provided: large, medium, and small. In each category, testing was performed 5 times, and for each test the number of unplaced events and the soft-constraint penalty were recorded. In general, the greater the number of rooms, the fewer the unplaced events. Index Terms—ant colony optimization, max-min ant system, timetabling
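The defining feature of MAX-MIN Ant System is that only the best solution deposits pheromone and every trail is clamped into [tau_min, tau_max], preventing premature convergence. A minimal sketch of one such update on a single pheromone table (parameter values are illustrative assumptions, not the paper's settings):

```python
def mmas_update(pheromone, best_components, rho=0.1,
                tau_min=0.01, tau_max=1.0, reward=1.0):
    """One MAX-MIN Ant System pheromone update: evaporate every trail
    by factor (1 - rho), deposit `reward` only on the components of the
    iteration-best solution, then clamp all trails into [tau_min, tau_max]."""
    for key in pheromone:
        tau = (1.0 - rho) * pheromone[key]
        if key in best_components:
            tau += reward
        pheromone[key] = min(tau_max, max(tau_min, tau))
    return pheromone

# Toy timeslot table: event 'e1' was assigned slot 't3' in the best tour.
slots = {("e1", "t3"): 0.5, ("e1", "t4"): 0.5}
mmas_update(slots, {("e1", "t3")})
```

In the paper's setting one such table is kept per decision (timeslot and room), and the spread of pheromone values (their standard deviation) additionally steers the selection step.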


2018 ◽  
Vol 2 (2) ◽  
pp. 137
Author(s):  
Muhammad Abi Berkah Nadi

Radin Inten II Airport serves national flights in Lampung Province. This study uses stated-preference analysis, an approach that presents choice statements in the form of hypotheses for respondents to assess; with this technique the researcher can fully control the hypothesized factors. Regression analysis with the SPSS program was used to determine the utility function for forecasting traveler demand. The analysis shows that airport passengers are dominated by the cost attribute in mode selection more than by the other attributes. From the regression results, the influence of the independent variables on the dependent variable is highest when the five attributes are used together, with an R-square value of 8.8%. The relationship between cost, time, headway, access time, and service and the selection of modes was tested for significance at α = 0.05 with the chi-square statistic. The average Cramér's V value of 0.298 lies around the middle of the range, so the relationship is moderately strong.
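Cramér's V, the association measure reported here, is derived from the chi-square statistic of a contingency table. A minimal sketch (the table below is hypothetical, not the study's data):

```python
def cramers_v(table):
    """Cramér's V for a 2D contingency table given as a list of rows.
    V = sqrt(chi2 / (n * (min(rows, cols) - 1))), in [0, 1]."""
    n = sum(sum(row) for row in table)
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    chi2 = 0.0
    for i, row in enumerate(table):
        for j, observed in enumerate(row):
            expected = row_totals[i] * col_totals[j] / n
            chi2 += (observed - expected) ** 2 / expected
    k = min(len(table), len(table[0])) - 1
    return (chi2 / (n * k)) ** 0.5

# Hypothetical cross-tab: mode choice (rows) vs. low/high cost (columns).
v = cramers_v([[30, 10], [15, 25]])
```

Values near 0 indicate independence and values near 1 a perfect association, which is why 0.298 is read as a moderate relationship.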


Communicology ◽  
2020 ◽  
Vol 8 (3) ◽  
pp. 138-148
Author(s):  
NATALIA MALSHINA ◽  

This study examines ontological problems in terms of the relationship of different cognitive practices and their mutual conditionality in the context of communication and their socio-cultural prerequisites, which is possible only if the traditional approach to the distinction between epistemology and faith is revised. Based on the idea of the identity of the common grounds of cognitive practices, "belief" is included in the understanding of interpretation in the communicative situation for true knowledge in each of the modes of being. Belief in the philosophical tradition reveals the ontological foundations of hermeneutics. Three reflections are synthesised: the hermeneutic concept of understanding, the structuralist concept of language, and the psychoanalytic concept of personality. It is necessary to apply the method of phenomenological reduction to the ontological substantiation of hermeneutics in the Christian Orthodox tradition. Hence, the meeting of semantics, linguistics, and onomatodoxy with Heidegger's ontology of language, whose origins reside in Husserl's phenomenology, seems natural. Fundamental ontology and linguistics, and the philosophy of cult, each in their own way open the horizons of the substantiation of hermeneutics. The beginning of this substantiation is the hermeneutic problem in Christianity, which arose from the question of the relationship between the two Covenants, or two Unions. In the paper, the author attempts to identify the stages of constructing the philosophical concept of Pavel Florensky. As a result, the substantiation of the birth of the world in consciousness through the cult is revealed. The ontological nature of words can be seen in Florensky through symbols. The symbol makes the transition from a small energy to a larger one, from a small information saturation to a greater one, acting as a lumen of being: by the name we hear the reality.
The word comes into contact with the world that is on the other side of our own psychological state. The word and the symbol shift all the time from the subjective to the objective. The communicative model acts as a common point uniting these traditions. The religious approach, as part of the semiotic approach, reveals the horizons of the ontological conditionality of language and words, and among words, of the name, since the name plays a central role in the accumulation and transmission of information and in understanding the commonality of this conditionality in the concepts of phenomenology and the Christian Orthodox tradition.


2018 ◽  
Vol 15 (5) ◽  
pp. 429-442 ◽  
Author(s):  
Nishant Verma ◽  
S. Natasha Beretvas ◽  
Belen Pascual ◽  
Joseph C. Masdeu ◽  
Mia K. Markey ◽  
...  

Background: Combining optimized cognitive (Alzheimer's Disease Assessment Scale - Cognitive subscale, ADAS-Cog) and atrophy markers of Alzheimer's disease for tracking progression in clinical trials may provide greater sensitivity than currently used methods, which have yielded negative results in multiple recent trials. Furthermore, it is critical to clarify the relationship among the subcomponents yielded by cognitive and imaging testing, to address the symptomatic and anatomical variability of Alzheimer's disease. Method: Using latent variable analysis, we thoroughly investigated the relationship between cognitive impairment, as assessed on the ADAS-Cog, and cerebral atrophy. A biomarker was developed for Alzheimer's clinical trials that combines cognitive and atrophy markers. Results: Atrophy within specific brain regions was found to be closely related to impairment in the cognitive domains of memory, language, and praxis. The proposed biomarker showed significantly better sensitivity in tracking progression of cognitive impairment than the ADAS-Cog in simulated trials and a real-world problem. The biomarker also improved the selection of MCI patients (78.8±4.9% specificity at 80% sensitivity) who will evolve to Alzheimer's disease for clinical trials. Conclusion: The proposed biomarker boosts the efficacy of clinical trials focused on the mild cognitive impairment (MCI) stage by significantly improving the sensitivity to detect treatment effects and improving the selection of MCI patients who will evolve to Alzheimer's disease.
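The general idea of combining cognitive and atrophy markers into a single progression measure can be sketched with a simple z-scored composite (an illustration only; the authors use latent variable analysis, and the data and weights below are made up):

```python
def zscore_composite(markers, weights=None):
    """Combine several markers into one composite score per subject:
    z-score each marker across subjects, then take a (weighted) mean.
    `markers` is a list of marker vectors, one value per subject."""
    n_subjects = len(markers[0])
    if weights is None:
        weights = [1.0] * len(markers)
    z_rows = []
    for m in markers:
        mu = sum(m) / n_subjects
        sd = (sum((v - mu) ** 2 for v in m) / n_subjects) ** 0.5
        z_rows.append([(v - mu) / sd for v in m])
    total_w = sum(weights)
    return [sum(w * z[i] for w, z in zip(weights, z_rows)) / total_w
            for i in range(n_subjects)]

# Hypothetical inputs: ADAS-Cog subscores and a regional atrophy measure.
composite = zscore_composite([[12.0, 25.0], [1.1, 1.9]])
```

A latent variable model goes further by estimating the weights from the covariance structure of the subcomponents rather than fixing them by hand.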


Author(s):  
A. A. Sheptulin ◽  
O. A. Storonova

Aim of review. Interpretation of published evidence on the relationship between the excessive belching syndrome and functional dyspepsia (FD) and their management in patient care. Key points. According to the Rome IV criteria of functional gastrointestinal disorders (FGID), excessive belching in the absence of other dyspeptic symptoms is to be considered a manifestation of the excessive belching syndrome, which can be of gastric or supragastric nature. The combination of high-resolution manometry and impedance monitoring allows an accurate diagnosis of belching and selection of the optimal treatment strategy. Belching accompanied by other symptoms of dyspepsia is to be considered yet another FD symptom according to the Rome IV criteria of FGID. Prokinetics are recommended to relieve belching in such cases. Conclusion. Understanding the relationship between the excessive belching syndrome and FD requires further research.

