qualitative estimate
Recently Published Documents


TOTAL DOCUMENTS: 28 (FIVE YEARS: 6)

H-INDEX: 9 (FIVE YEARS: 1)

Materials ◽  
2021 ◽  
Vol 14 (24) ◽  
pp. 7892
Author(s):  
Servando Chinchón-Payá ◽  
Julio E. Torres Martín ◽  
Antonio Silva Toledo ◽  
Javier Sánchez Montero

A correct assessment of the pathologies that can affect a reinforced concrete structure is required in order to define the repair procedure. This work addresses the challenge of quantifying chlorides and sulphates directly on the surface of concrete. The quantification was carried out by X-ray fluorescence analysis on the surface of concrete specimens at different points with portable equipment. Concrete prisms were made with different amounts of NaCl and Na2SO4. To account for the influence of coarse aggregate, a qualitative estimate of the amount of coarse aggregate analyzed was made, although the results show that there is no significant influence. Monte Carlo simulations were carried out to establish the number of random analyses needed for the mean value to fall within an acceptable error range. Quantifying sulphates requires six random analyses on the surface, and quantifying chlorides requires eight measurements; this ensures that errors are below 10% in 95% of the cases. The results of the study highlight that a portable XRF device can be used in situ to obtain concentrations of chlorides and sulphates on a concrete surface with good accuracy. There is no need to take samples and bring them to a laboratory, which lowers the overall cost of inspection and repair works.
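The sampling question the abstract answers by Monte Carlo can be sketched in a few lines. The per-spot measurement scatter below is an assumed value chosen for illustration, not a figure from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical per-spot XRF scatter (relative standard deviation) --
# an assumption for illustration, not a value from the paper.
true_conc = 1.0          # true surface concentration (arbitrary units)
rel_sigma = 0.15         # relative spread of single-spot readings
n_trials = 20_000        # Monte Carlo repetitions per sample size

def error_rate(n_spots, tol=0.10):
    """Fraction of trials in which the mean of n_spots readings
    deviates from the true value by more than `tol` (10%)."""
    readings = rng.normal(true_conc, rel_sigma * true_conc,
                          size=(n_trials, n_spots))
    rel_err = np.abs(readings.mean(axis=1) - true_conc) / true_conc
    return (rel_err > tol).mean()

# Smallest number of spots whose mean is within 10% of the true
# value in at least 95% of the simulated trials.
n_required = next(n for n in range(1, 30) if error_rate(n) <= 0.05)
```

The required number of spots grows with the assumed per-spot scatter, which is why the paper reports different counts for sulphates (six) and chlorides (eight).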


Physics ◽  
2021 ◽  
Vol 3 (3) ◽  
pp. 563-568
Author(s):  
Boris S. Murygin ◽  
Alexander A. Kirillov ◽  
Valery V. Nikulin

Production of domain walls and string-like solitons is considered in a model with two real scalar fields and a potential having at least one saddle point and a local maximum. The model is regarded as describing two-dimensional spatial slices of full three-dimensional structures. It is shown that, in the early Universe, both types of solitons may appear. In addition, a qualitative estimate of the formation probabilities of domain walls and strings is presented. It is found that the probability of forming string-like solitons is suppressed compared to that of domain walls.


2020 ◽  
pp. 47-53
Author(s):  
И.И. Лебедев ◽  
В.Н. Невский

The subject of the research is dangerous geomorphic processes on the shores of the Russkii and Shkota islands (Peter the Great Bay, Sea of Japan). The purpose of the work is large-scale mapping of the morphogenetic types of the coasts and a qualitative estimate of the potential hazard of these processes. As a result of large-scale mapping, four morphogenetic types and nine subtypes of coasts were identified on Russkii Island, and four types and eight subtypes on Shkota Island. The cartographic legend reflects the degree of coast transformation by wave processes. The main guiding criteria for identifying categories are the morphology of the coastal scarp (cliff), the morphology of the beach, and the grain size of its material. The most dangerous shores for the population and engineering structures are abrasion shores with a steep coastal cliff and a narrow, mainly boulder beach (for example, the eastern coasts of both islands). Danger is understood here as a high probability of three geomorphic events: rockfalls, landslides, and rapid flooding of beaches and adjacent territories as a result of storm surge and tsunami. The degree of danger of geomorphic processes in the coastal zone should be taken into account in the functional zoning of these islands for urban-planning purposes.


GigaScience ◽  
2020 ◽  
Vol 9 (11) ◽  
Author(s):  
Sergey E Golovenkin ◽  
Jonathan Bac ◽  
Alexander Chervov ◽  
Evgeny M Mirkes ◽  
Yuliya V Orlova ◽  
...  

Background: Large observational clinical datasets are becoming increasingly available for mining associations between various disease traits and administered therapy. These datasets can be considered as representations of the landscape of all possible disease conditions, in which a concrete disease state develops through stereotypical routes, characterized by “points of no return” and “final states” (such as lethal or recovery states). Extracting this information directly from the data remains challenging, especially in the case of synchronic (with a short-term follow-up) observations.

Results: Here we suggest a semi-supervised methodology for the analysis of large clinical datasets, characterized by mixed data types and missing values, through modeling the geometrical data structure as a bouquet of bifurcating clinical trajectories. The methodology is based on application of elastic principal graphs, which can address simultaneously the tasks of dimensionality reduction, data visualization, clustering, feature selection, and quantifying the geodesic distances (pseudo-time) in partially ordered sequences of observations. The methodology allows a patient to be positioned on a particular clinical trajectory (pathological scenario) and the degree of progression along it to be characterized with a qualitative estimate of the uncertainty of the prognosis. We developed a tool ClinTrajan for clinical trajectory analysis implemented in the Python programming language. We test the methodology in 2 large publicly available datasets: myocardial infarction complications and readmission of diabetic patients data.

Conclusions: Our pseudo-time quantification-based approach makes it possible to apply the methods developed for dynamical disease phenotyping and illness trajectory analysis (diachronic data analysis) to synchronic observational data.
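As a rough sketch of the trajectory idea, pseudo-time can be illustrated with a minimum-spanning-tree skeleton over toy branching data. This is a simplified stand-in, not ClinTrajan's actual elastic-principal-graph fitting, which additionally regularizes the skeleton's elasticity and branching:

```python
import numpy as np
from scipy.sparse.csgraph import minimum_spanning_tree, dijkstra
from scipy.spatial.distance import pdist, squareform

# Toy data: synthetic "patients" along two branching trajectories.
rng = np.random.default_rng(1)
t = rng.uniform(0, 1, 200)               # hidden disease progression
branch = rng.integers(0, 2, 200)         # which trajectory each patient follows
X = np.c_[t, np.where(branch, t**2, -t**2)] + rng.normal(0, 0.02, (200, 2))

# Build a graph skeleton: here a minimum spanning tree over the points.
D = squareform(pdist(X))
mst = minimum_spanning_tree(D)

# Pseudo-time = geodesic distance along the skeleton from a chosen root
# (we assume the earliest, healthiest state is known).
root = int(np.argmin(t))
pseudo_time = dijkstra(mst, directed=False, indices=root)
```

The geodesic distances recover the hidden ordering along each branch, which is the property the paper exploits to place synchronic observations on partially ordered clinical trajectories.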


2019 ◽  
Vol 2019 (10) ◽  
pp. 4-10 ◽  
Author(s):  
Алексей Родичев ◽  
Aleksey Rodichev ◽  
Андрей Горин ◽  
Andrey Gorin ◽  
Мария Токмакова ◽  
...  

The paper is dedicated to the investigation of adhesion strength in film coatings. The goal of the work is a qualitative estimate of the strength of antifriction film coatings applied to the friction units of machines. The topic is relevant because reducing friction on the surfaces of such units increases the life, power effectiveness, and reliability of machinery. Most existing works study the manufacturing of films, their phase composition, structure, and stress-strain properties, whereas the present work emphasizes the adhesion properties of antifriction film coatings, which constitutes substantial novelty. The adhesion strength of the antifriction film coatings was investigated by the normal (pull-off) detachment method. For comparative tests, four groups of different antifriction coatings were prepared. For each coating under test, five samples were prepared and loaded with a smoothly increasing breaking tension. As a result, experimental dependences of normal stress (σ) on relative elongation (ε) were obtained, which allow the measured values to be applied to surfaces coated with antifriction films of different areas. Upon completion of the tests, both parts of each tested sample were examined visually to determine the character of the failure. The theoretical and experimental investigations lead to conclusions giving a qualitative and quantitative estimate of the adhesion strength of antifriction film coatings.
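Reducing a pull-off test record to an adhesion-strength figure can be sketched as follows. All numbers and the assumed specimen geometry are illustrative, not data from the paper:

```python
# Minimal sketch of reducing a normal-detachment (pull-off) test record
# to stress-strain points and an adhesion strength value.
force_N = [0.0, 120.0, 260.0, 390.0, 410.0, 180.0]   # smoothly increased load
elong_mm = [0.00, 0.02, 0.05, 0.09, 0.11, 0.12]      # measured elongation
area_mm2 = 78.5    # bonded spot area (assumed, ~10 mm diameter stud)
gauge_mm = 10.0    # assumed gauge length for relative elongation

sigma_MPa = [f / area_mm2 for f in force_N]          # N/mm^2 == MPa
epsilon = [d / gauge_mm for d in elong_mm]           # relative elongation

# Adhesion strength = peak normal stress reached before detachment.
adhesion_strength = max(sigma_MPa)
```

Normalizing force by bonded area is what lets the σ(ε) curves transfer to coated surfaces of different areas, as the abstract notes.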


2015 ◽  
Vol 773 ◽  
pp. 75-102 ◽  
Author(s):  
E. S. Benilov ◽  
M. S. Benilov

We examine two- and three-dimensional drops steadily sliding down an inclined plate. The contact line of the drop is governed by a model based on the Navier-slip boundary condition and a prescribed value for the contact angle. The drop is thin, so the lubrication approximation can be used. In the three-dimensional case, we also assume that the drop is sufficiently small (its size is smaller than the capillary scale). These assumptions enable us to determine the shape of the drop and derive an asymptotic expression for its velocity. For three-dimensional drops, this expression is matched to a qualitative estimate of Kim et al. (J. Colloid Interface Sci., vol. 247, 2002, pp. 372–380) obtained for arbitrary drops, i.e. not necessarily thin and small. The matching fixes an undetermined coefficient in Kim, Lee and Kang’s estimate, turning it into a quantitative result.


F1000Research ◽  
2015 ◽  
Vol 3 ◽  
pp. 187
Author(s):  
Eric G. Smith

Background: Nonrandomized studies typically cannot account for confounding from unmeasured factors.

Method: A method is presented that exploits the recently identified phenomenon of “confounding amplification” to produce, in principle, a quantitative estimate of total residual confounding resulting from both measured and unmeasured factors. Two nested propensity score models are constructed that differ only in the deliberate introduction of an additional variable(s) that substantially predicts treatment exposure. Residual confounding is then estimated by dividing the change in treatment effect estimate between models by the degree of confounding amplification estimated to occur, adjusting for any association between the additional variable(s) and outcome.

Results: Several hypothetical examples are provided to illustrate how the method produces a quantitative estimate of residual confounding if the method’s requirements and assumptions are met. Previously published data is used to illustrate that, whether or not the method routinely provides precise quantitative estimates of residual confounding, the method appears to produce a valuable qualitative estimate of the likely direction and general size of residual confounding.

Limitations: Uncertainties exist, including identifying the best approaches for: 1) predicting the amount of confounding amplification; 2) minimizing changes between the nested models unrelated to confounding amplification; 3) adjusting for the association of the introduced variable(s) with outcome; and 4) deriving confidence intervals for the method’s estimates (although bootstrapping is one plausible approach).

Conclusions: To this author’s knowledge, it has not been previously suggested that the phenomenon of confounding amplification, if such amplification is as predictable as suggested by a recent simulation, provides a logical basis for estimating total residual confounding. The method's basic approach is straightforward. The method's routine usefulness, however, has not yet been established, nor has the method been fully validated. Rapid further investigation of this novel method is clearly indicated, given the potential value of its quantitative or qualitative output.
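One plausible reading of the arithmetic described in the Method section can be sketched numerically. Every value and variable name below is hypothetical, chosen only to make the division step concrete:

```python
# Sketch of the core arithmetic: residual confounding inflates the base
# estimate by C and the amplified estimate by A*C, so the between-model
# change (after removing the added variable's own outcome association)
# equals (A - 1) * C.  All numbers are hypothetical.

effect_base = 1.30          # treatment effect estimate, base PS model
effect_amplified = 1.45     # estimate after adding a strong predictor
                            # of exposure to the PS model
amplification = 1.50        # predicted amplification factor A
outcome_assoc_shift = 0.02  # change attributable to the added
                            # variable's own association with outcome

# Change in the estimate not explained by the added variable itself:
delta = (effect_amplified - effect_base) - outcome_assoc_shift

# Residual confounding C and a correspondingly corrected estimate:
residual_confounding = delta / (amplification - 1)
corrected_effect = effect_base - residual_confounding
```

Whether this simple subtraction-and-division is valid hinges on the amplification factor being predictable, which is exactly the assumption the Limitations section flags.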


F1000Research ◽  
2014 ◽  
Vol 3 ◽  
pp. 187 ◽  
Author(s):  
Eric G. Smith

Background: Nonrandomized studies typically cannot account for confounding from unmeasured factors.

Method: A method is presented that exploits the recently identified phenomenon of “confounding amplification” to produce, in principle, a quantitative estimate of total residual confounding resulting from both measured and unmeasured factors. Two nested propensity score models are constructed that differ only in the deliberate introduction of an additional variable(s) that substantially predicts treatment exposure. Residual confounding is then estimated by dividing the change in treatment effect estimate between models by the degree of confounding amplification estimated to occur, adjusting for any association between the additional variable(s) and outcome.

Results: A hypothetical example is provided to illustrate how the method produces a quantitative estimate of residual confounding if the method’s requirements and assumptions are met. Previously published data is used to illustrate that, whether or not the method routinely provides precise quantitative estimates of residual confounding, the method appears to produce a valuable qualitative estimate of the likely direction and general size of residual confounding.

Limitations: Uncertainties exist, including identifying the best approaches for: 1) predicting the amount of confounding amplification; 2) minimizing changes between the nested models unrelated to confounding amplification; 3) assessing the association of the introduced variable(s) with outcome; and 4) deriving confidence intervals for the method’s estimates (although bootstrapping is one plausible approach).

Conclusions: To this author’s knowledge, it has not been previously suggested that the phenomenon of confounding amplification, if such amplification is as predictable as suggested by a recent simulation, provides a logical basis for estimating total residual confounding. The method's basic approach is straightforward. The method's routine usefulness, however, has not yet been established, nor has the method been fully validated. Rapid further investigation of this novel method is clearly indicated, given the potential value of its quantitative or qualitative output.


2012 ◽  
Vol 189 ◽  
pp. 453-456
Author(s):  
Su Ping You

Computer simulation methods have become an increasingly widely used technique in materials science research. This paper discusses the importance of applying computer simulation in the field of materials science. Application areas include heat treatment, material microstructure, corrosion and protection, casting, and materials design. The results of these applications show that computer simulation is an efficient and realistic way to capture how a sample varies across a range of research processes, replacing the rough qualitative estimates characteristic of traditional, less advanced production practice.
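As a minimal illustration of the kind of simulation the paper surveys (here, the heat-treatment area), a one-dimensional heat-conduction model with generic, assumed material constants:

```python
import numpy as np

# Explicit finite-difference solution of 1-D heat conduction in a bar
# during quenching.  Material constants are generic (steel-like), not
# values from the paper.
alpha = 1e-5               # thermal diffusivity, m^2/s
L, nx = 0.1, 51            # bar length (m) and number of grid points
dx = L / (nx - 1)
dt = 0.4 * dx**2 / alpha   # time step below the explicit stability limit

T = np.full(nx, 800.0)     # initial temperature, deg C
T[0] = T[-1] = 20.0        # quenched surfaces held at bath temperature

for _ in range(2000):      # march forward in time
    T[1:-1] += alpha * dt / dx**2 * (T[2:] - 2 * T[1:-1] + T[:-2])
    T[0] = T[-1] = 20.0    # re-impose the boundary condition

centre_temp = T[nx // 2]   # cooling of the core over the simulated period
```

Even this toy model yields a core cooling curve that would otherwise require instrumented trial quenches, which is the substitution the paper argues for.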

