quantitative definition
Recently Published Documents


TOTAL DOCUMENTS: 182 (five years: 56)
H-INDEX: 28 (five years: 4)

InterConf ◽  
2021 ◽  
pp. 6-20
Author(s):  
Olena Ataieva

Scientific provisions on the social development of mankind in the third millennium, in Ukraine and worldwide, are revealed. This direction is shown to result from the emergence, in this period, of a continuous global ecological, natural, and socio-economic crisis affecting all of mankind. In particular, the socio-economic crisis manifests in the deterioration of socio-economic conditions for the vast majority of people and arises from the private-individual economic system of capitalism, based on private ownership of the means of production and the class division of economic spheres. It is in this environment that social contradictions mature between the two classes of owners of the means of production and of labor power, contradictions that can be reconciled in an evolutionary way under the influence of objective economic laws and the universe. Such laws include the universal law of equilibrium, the law of human evolution, and the development of production relations in accordance with the level and quality of the productive forces. The progress of social relations is therefore seen as a derivative of the development of the productive forces, as a historical inevitability. As a quantitative expression of the combined level of productive forces and production relations, the article considers the category of the labor potential of society, against which social development and change are measured, and a formula for its quantitative definition is presented.


2021 ◽  
Author(s):  
Enni Sanmark ◽  
Lotta-Maria Oksanen ◽  
Noora Rantanen ◽  
Mari Lahelma ◽  
Veli-Jukka Anttila ◽  
...  

ABSTRACT Aim: The purpose of the study was to determine the aerosol exposure generated by coughing in an operating room environment, in order to establish a quantitative limit value for high-risk aerosol-generating medical procedures. Background: Coughing is known to produce a significant amount of aerosols and is thus commonly used as the reference for high-risk aerosol generation. Accordingly, procedures during which aerosol generation exceeds that of coughing are considered high-risk aerosol-generating procedures. However, no reliable quantitative values are available for high-risk aerosol generation. Methods: Coughing was measured in 37 healthy volunteers in an operating room environment. Aerosol particles in the size range 0.3–10 μm generated during coughing were measured with an Optical Particle Sizer at distances of 40 cm, 70 cm, and 100 cm, reflecting the distances at which personnel stand during surgery. Results: A total of 306 coughs were measured. The average aerosol concentration during coughing was 1.580 ± 13.774 particles/cm³ (range 0.000–195.528). Discussion: The aerosol concentration measured in this study can be used as a limit for high-risk aerosol generation in the operating room environment when assessing aerosol-generating procedures and the risk of operating room staff's exposure to aerosol particles.
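As a minimal sketch (not the authors' code), the cough-derived limit could be applied as below: a procedure is flagged as high risk if its measured aerosol concentration exceeds the mean cough concentration reported in the abstract. All readings and names here are illustrative assumptions.

```python
# Mean cough aerosol concentration from the study, used as the reference limit.
COUGH_MEAN_CONC = 1.580  # particles/cm^3

def is_high_risk(procedure_conc: float, limit: float = COUGH_MEAN_CONC) -> bool:
    """Flag a procedure whose aerosol concentration exceeds the cough-derived limit."""
    return procedure_conc > limit

# Hypothetical readings from an optical particle sizer during one procedure:
readings = [0.42, 1.91, 2.35, 0.88]          # particles/cm^3
mean_reading = sum(readings) / len(readings)  # average over the measurement window
print(f"mean = {mean_reading:.3f} particles/cm^3, high risk: {is_high_risk(mean_reading)}")
```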


2021 ◽  
Vol 11 ◽  
Author(s):  
Yang Han ◽  
Yi Xuan ◽  
Xiaowen Liu ◽  
Hui Zhu ◽  
Meng Zhang ◽  
...  

Gastric linitis plastica (GLP) is a descriptive term that lacks a quantitative definition. Several relatively quantitative criteria have been proposed, such as tumor involvement of one-third or two-thirds of the gastric surface. However, these criteria require doctors to judge the tumor infiltration area subjectively, making the diagnosis difficult to render objective and reproducible. This study aimed to propose a quantitative diagnostic criterion for distinguishing GLP. We performed a retrospective cohort study of 2,907 patients with Borrmann III and IV gastric cancer (GC) who underwent gastrectomy between 2011 and 2018 in our center. Kaplan–Meier curves showed that patients with an observed tumor size of more than 8 cm had markedly lower overall survival (OS) and disease-free survival (DFS) rates than those with a size of less than 8 cm (p < 0.001; p < 0.001). However, there was no significant difference in prognosis between patients with tumor sizes of more than 8 cm and more than 10 cm (p = 0.248; p = 0.534). Moreover, patients with tumor sizes greater than 8 cm more often presented with advanced-stage disease, had extremely poor 3-year OS and DFS (31.4%; 29.3%), and showed a stronger propensity toward peritoneal metastasis. We therefore considered an observed tumor size of more than 8 cm a critical value for distinguishing the prognosis of Borrmann III and IV GC, and we propose it as a quantitative diagnostic criterion for GLP, on the premise of satisfying the original descriptive and pathological definition, regardless of Borrmann type.
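The kind of survival comparison the study describes can be sketched with the lifelines library: stratify patients by the proposed 8 cm cutoff and test the difference with a log-rank test. The DataFrame below is hypothetical toy data, not the authors' cohort.

```python
import pandas as pd
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

# Hypothetical follow-up data: time in months, event indicator, observed tumor size.
df = pd.DataFrame({
    "months":   [12, 34, 7, 60, 22, 48, 5, 40],
    "death":    [1, 0, 1, 0, 1, 0, 1, 1],   # 1 = death observed
    "tumor_cm": [9.5, 6.0, 11.0, 4.5, 8.5, 7.0, 12.0, 9.0],
})

large = df["tumor_cm"] > 8.0  # the proposed quantitative GLP cutoff

kmf = KaplanMeierFitter()
for label, group in [(">8 cm", df[large]), ("<=8 cm", df[~large])]:
    kmf.fit(group["months"], group["death"], label=label)
    print(label, "median OS:", kmf.median_survival_time_)

# Log-rank test between the two strata, analogous to the reported p-values.
result = logrank_test(df.loc[large, "months"], df.loc[~large, "months"],
                      df.loc[large, "death"], df.loc[~large, "death"])
print("log-rank p =", result.p_value)
```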


2021 ◽  
Vol 27 ◽  
Author(s):  
Niklas Frahm ◽  
Michael Hecker ◽  
Uwe Zettl

Polypharmacy is an important aspect of medication management and particularly affects elderly and chronically ill people. Patients with dementia, Parkinson's disease (PD), or multiple sclerosis (MS) are at high risk of multimedication due to their complex symptomatology. Our aim was to provide an overview of the different definitions of polypharmacy and to present the current state of research on polypharmacy in patients with dementia, PD, or MS. The most common definition of polypharmacy in the literature is the concomitant use of ≥5 medications (a quantitative definitional approach). Polypharmacy rates exceeding 50% have been reported for patients with dementia, PD, or MS, although MS patients are on average significantly younger than those with dementia or PD. The main predictor of polypharmacy is the complex symptom profile of these neurological disorders. Potentially inappropriate medication (PIM), drug-drug interactions, poor treatment adherence, severe disease course, cognitive impairment, hospitalisation, poor quality of life, frailty, and mortality have been associated with polypharmacy in patients with dementia, PD, or MS. For patients with polypharmacy, either the avoidance of PIM (selective deprescribing) or the substitution of PIM with more suitable drugs (appropriate polypharmacy) is recommended to achieve more effective therapeutic management.
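The quantitative definition cited above is simple enough to express directly; a minimal sketch follows, with a hypothetical medication list rather than patient data.

```python
# The most common quantitative definition in the literature: >= 5 concomitant drugs.
POLYPHARMACY_THRESHOLD = 5

def has_polypharmacy(current_medications: list[str]) -> bool:
    """Apply the quantitative definition: concomitant use of >= 5 medications."""
    return len(set(current_medications)) >= POLYPHARMACY_THRESHOLD

# Hypothetical regimen for a PD patient:
meds = ["levodopa", "pramipexole", "amantadine", "quetiapine", "macrogol"]
print(has_polypharmacy(meds))  # True: five concomitant medications
```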


2021 ◽  
Vol 13 (13) ◽  
pp. 7222
Author(s):  
Juan García-Díez ◽  
Carla Gonçalves ◽  
Luca Grispoldi ◽  
Beniamino Cenci-Goga ◽  
Cristina Saraiva

Food security, as part of public health protection, constitutes one of the main objectives for countries aiming to ensure the health of all their citizens. However, food security is compromised worldwide by conflict, political instability, and economic crises, in both developed and developing countries. Conversely, because of the importance of agriculture to the economies of rural areas in both developed and developing countries, this sector can contribute to improving food stability, and thereby food security. Thus, livestock and traditional meat products represent a key factor in ensuring food availability. Overall, biosecurity measures improve animal welfare by decreasing the occurrence of diseases that compromise stability by causing fluctuations in the availability of meat and animal-derived food products such as milk, eggs, or traditional fermented products. As a consequence, an absence of biosecurity measures affects food security (in its quantitative definition, as described above) as well as the productive, sanitary, and environmental sustainability of the rural environment. Products of animal origin support local trade and the regional economy while contributing to the availability of food without great external dependence. The manufacture of foods of animal origin aims to create products that are durable and that maintain food availability for long periods, even during seasons with scarce resources. Thus, dry-cured and fermented meat products play an important role in food availability. Food security also refers to food access under healthy economic conditions; therefore, knowledge of the main tools that guarantee the safety of these kinds of food products is essential to achieving food stability and, in turn, food security.


2021 ◽  
Vol 9 ◽  
Author(s):  
Ted Sichelman

Many scholars have employed the term “entropy” in the context of law and legal systems to roughly refer to the amount of “uncertainty” present in a given law, doctrine, or legal system. Just a few of these scholars have attempted to formulate a quantitative definition of legal entropy, and none have provided a precise formula usable across a variety of legal contexts. Here, relying upon Claude Shannon's definition of entropy in the context of information theory, I provide a quantitative formalization of entropy in delineating, interpreting, and applying the law. In addition to offering a precise quantification of uncertainty and the information content of the law, the approach offered here provides other benefits. For example, it offers a more comprehensive account of the uses and limits of “modularity” in the law—namely, using the terminology of Henry Smith, the use of legal “boundaries” (be they spatial or intangible) that “economize on information costs” by “hiding” classes of information “behind” those boundaries. In general, much of the “work” performed by the legal system is to reduce legal entropy by delineating, interpreting, and applying the law, a process that can in principle be quantified.
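The Shannon entropy underlying the article's formalization is H(X) = -Σᵢ pᵢ log₂ pᵢ. A minimal sketch follows; the probability distribution over candidate legal interpretations is a hypothetical example, not taken from the article.

```python
import math

def shannon_entropy(probabilities: list[float]) -> float:
    """Entropy in bits of a discrete distribution; zero-probability terms contribute nothing."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# Hypothetical uncertainty over four candidate interpretations of a statute:
before_ruling = [0.4, 0.3, 0.2, 0.1]     # before an authoritative interpretation
after_ruling  = [0.9, 0.05, 0.03, 0.02]  # after a court narrows the plausible readings

print(f"H before: {shannon_entropy(before_ruling):.3f} bits")
print(f"H after:  {shannon_entropy(after_ruling):.3f} bits")  # interpretation reduced legal entropy
```

In this toy example, the court's ruling lowers the entropy of the interpretive distribution, which is exactly the sense in which delineating, interpreting, and applying the law "reduces legal entropy."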


2021 ◽  
Author(s):  
Nitin Kumar ◽  
Mayank Kapoor ◽  
Prasan Kumar Panda ◽  
Yogesh Singh ◽  
Ajeet Singh Bhadoria

Background: The age-old definition of fever was derived from cross-sectional population surveys using old techniques, without considering symptomatology. However, the diagnosis of fever should be made only in the presence of associated symptoms that distinguish it from a mere asymptomatic physiologic rise in temperature. Associating temperature values with symptoms to define the cut-off for fever is the need of the hour. Methods: In a longitudinal study, a healthy population in northern India was followed up over one year. Participants were instructed to self-monitor oral temperature with a standard digital thermometer in either the left or right sublingual pocket and record it in a thermometry diary. Participation was considered complete if the participant passed through all three phases of the study (non-febrile, febrile, and post-febrile) or completed the study duration. Results: A per-protocol analysis was done for febrile participants (n = 144; 23,851 temperature recordings). The mean febrile-phase temperature was 100.25 ± 1.44°F. A temperature of 99.1°F had the maximum diagnostic accuracy for feeling feverish (98.2%), together with one (98.3%) or two (99%) associated symptoms. Summer and spring months showed higher temperatures (100.38 ± 1.44 vs 99.80 ± 1.49, p < 0.001), whereas no significant temperature difference was noted between the sexes. Conclusions: A revised temperature cut-off for fever is hereby proposed: 99.1°F along with one or two associated symptoms. This may completely redefine fever in the modern era.
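A minimal sketch of the proposed symptom-aware definition: temperature at or above 99.1°F together with at least one associated symptom. The function name and structure are illustrative assumptions, not the authors' code.

```python
FEVER_CUTOFF_F = 99.1  # proposed revised cut-off in degrees Fahrenheit

def is_fever(oral_temp_f: float, n_associated_symptoms: int) -> bool:
    """Proposed definition: cutoff temperature plus at least one associated symptom."""
    return oral_temp_f >= FEVER_CUTOFF_F and n_associated_symptoms >= 1

print(is_fever(99.3, 2))  # True: above cutoff with two associated symptoms
print(is_fever(99.3, 0))  # False: an asymptomatic physiologic rise in temperature
```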


2021 ◽  
Vol 22 (Supplement_3) ◽  
Author(s):  
WL Duvall ◽  
C Godoy Rivas ◽  
M Elsadany ◽  
M Hobocan ◽  
S Mcmahon

Abstract Funding Acknowledgements: Type of funding sources: None. Background/Introduction: Bone scintigraphy with 99m-Technetium-Pyrophosphate (99m-Tc-PYP) with planar and SPECT imaging is now commonplace for the non-invasive diagnosis of ATTR cardiac amyloidosis. However, the quantification of 99m-Tc-PYP uptake is based on a semi-quantitative visual score and a heart-to-contralateral-lung ratio, which suffer from poor reproducibility. A more robust method of quantifying uptake and reporting results would be beneficial and may be possible using volumetric assessment with fused SPECT/CT acquisition. Purpose: The aim of this study was to evaluate the performance of a novel semi-automated quantitative software to diagnose ATTR cardiac amyloidosis in patients with a clinical suspicion of cardiac amyloidosis who underwent 99m-Tc-PYP SPECT/CT imaging. Methods: This was a retrospective, single-center study of consecutive patients who underwent 99m-Tc-PYP SPECT/CT imaging from September to December 2020. Quantification software was used to obtain standardized uptake values (SUVs) of 99m-Tc-PYP activity in the whole heart using SPECT/CT data. The total SUVs, mean SUVs, and percentage of injected tracer dose in the heart were obtained, as well as two other sets of these measurements adjusted for residual blood-pool activity. Activity in the lung and bone was used to calculate heart-to-bone and heart-to-right-lung ratios. The results of the software quantification were compared with the results of planar imaging as well as with the final clinical diagnosis of amyloidosis. Results: A total of 59 patients were imaged during this time, with an average age of 74.1 ± 11.8 years; 32 (54.2%) were male. After excluding 8 patients for technical issues, 12 patients were found to be positive for amyloid and 39 were negative; the average imaging delay time was 75.0 ± 15.2 minutes. Thirteen methods of assessment were evaluated; the percentage of injected tracer dose found in the heart, adjusted for mean residual blood-pool activity, provided the best discrimination between abnormal and normal studies. The mean percentage of injected dose was 2.87% in positive patients vs 0.98% in patients without amyloidosis (p < 0.0001). Using a cutoff of 2% to ensure that no patients with amyloid would be missed by screening, there was 100% sensitivity, 94.9% specificity, and 96.1% accuracy. There was a significant difference in the percentage of injected dose across gradations of the planar heart-to-contralateral-lung ratio and the planar visual score. Conclusion: Volumetric software quantification may be a superior method of evaluating 99m-Tc-PYP cardiac amyloidosis studies. This methodology may allow a quantitative definition of a normal or abnormal 99m-Tc-PYP cardiac amyloid study and provide the potential to follow response to therapy.
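A minimal sketch, on hypothetical data, of the screening evaluation reported above: classifying scans by the 2% injected-dose cutoff and computing sensitivity, specificity, and accuracy against the final diagnosis. The numbers below are made up for illustration.

```python
def screen(pct_injected_dose: float, cutoff: float = 2.0) -> bool:
    """Flag a study as positive for ATTR amyloid if cardiac uptake exceeds the cutoff."""
    return pct_injected_dose > cutoff

# Hypothetical (percent injected dose, true amyloid diagnosis) pairs:
cases = [(2.9, True), (3.4, True), (0.9, False), (1.2, False), (2.3, False)]

tp = sum(screen(x) and y for x, y in cases)          # true positives
tn = sum(not screen(x) and not y for x, y in cases)  # true negatives
fp = sum(screen(x) and not y for x, y in cases)      # false positives
fn = sum(not screen(x) and y for x, y in cases)      # false negatives

print(f"sensitivity = {tp / (tp + fn):.1%}")
print(f"specificity = {tn / (tn + fp):.1%}")
print(f"accuracy    = {(tp + tn) / len(cases):.1%}")
```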


2021 ◽  
Author(s):  
Stephanie Tickle

The Hikurangi margin is one of the largest sources of seismic and tsunami hazards in New Zealand, but much remains unknown about previous ruptures on the subduction interface. Turbidite paleoseismology has the potential to increase the spatial density and temporal extent of paleoearthquake records. However, it is heavily reliant on temporal correlation of turbidites and thus requires them to be precisely dated. Typically, ages are obtained by radiocarbon dating of pelagic foraminifera from background sediments deposited between turbidites. This dating method requires background sedimentation to be accurately distinguished from the fine-grained tails of turbidites. Along the southern and central Hikurangi Margin, background sedimentation and turbidite tails have proven difficult to distinguish from one another. Here, a quantitative approach is developed to distinguish turbidite tails from background sediments using machine learning.

This study utilizes a natural experiment generated by the Mw 7.8 Kaikōura earthquake, which caused the deposition of co-seismic turbidites at locations both proximal and distal to active canyon systems. The 2016 turbidite could be recognised by its stratigraphic position at core tops. Turbidites and background sediments were independently identified using ²¹⁰Pb activity profiles to identify gradual accumulation. Additionally, foraminiferal assemblages were used to identify transported material. The physical and geochemical properties of the sediments were then analysed using non-destructive (computed tomography density, magnetic susceptibility, micro X-ray fluorescence derived geochemistry) and destructive (grain size, carbonate content, organic content) techniques to develop a quantitative definition of turbidite tails and background sediments. The destructive datasets were then compared with the non-destructive data, which act as proxies for these analyses because they are rapidly generated at high resolution down core and are now routinely acquired in most turbidite paleoseismology studies. A statistically significant correlation was found between the destructive data and the non-destructive proxies, such that the non-destructive data could be used as a viable alternative to the time-consuming destructive analyses.

The machine learning technique, Linear Discriminant Analysis (LDA), successfully distinguishes background sediment and turbidite tails in areas where they are visually indistinguishable. The LDA model shows that in cores distal from active canyon systems, background sediment and turbidite tails are more distinct than in cores proximal to active canyon systems. Differences between canyon-proximal and distal sites may be due to the impact of weak bottom currents that are inferred to be acting on the background sedimentation processes along this margin. This study shows that quantitative identification of background sediments and turbidite tails is possible and could allow more robust identification and dating of turbidites globally, which is of paramount importance for the effective application of turbidite paleoseismology.
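A minimal sketch, using scikit-learn on made-up feature vectors, of the Linear Discriminant Analysis classification described above: separating turbidite tails from background sediment using non-destructive core measurements. The feature choices and values are illustrative assumptions, not the thesis data.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

# Columns: CT density, magnetic susceptibility, XRF-derived ratio (hypothetical units).
X = np.array([
    [1.65, 12.0, 0.8],   # turbidite-tail samples
    [1.70, 14.5, 0.7],
    [1.68, 13.2, 0.9],
    [1.40,  6.1, 1.6],   # background-sediment samples
    [1.38,  5.4, 1.8],
    [1.42,  6.8, 1.5],
])
y = np.array(["tail", "tail", "tail", "background", "background", "background"])

# Fit the discriminant on labelled training samples (e.g. those identified
# independently via 210Pb profiles and foraminiferal assemblages).
lda = LinearDiscriminantAnalysis()
lda.fit(X, y)

# Classify a new down-core measurement where the two are visually indistinguishable:
print(lda.predict([[1.55, 9.0, 1.1]]))        # predicted class
print(lda.predict_proba([[1.55, 9.0, 1.1]]))  # class membership probabilities
```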


