quantitative indication
Recently Published Documents


TOTAL DOCUMENTS

26
(FIVE YEARS 4)

H-INDEX

7
(FIVE YEARS 1)

2021 ◽  
Vol 7 (12) ◽  
pp. 1008
Author(s):  
Marco Cartabia ◽  
Carolina Elena Girometta ◽  
Chiara Milanese ◽  
Rebecca Michela Baiguera ◽  
Simone Buratti ◽  
...  

Wood decay fungi (WDF) seem particularly suitable for developing myco-materials owing to their mycelial texture, ease of cultivation, and lack of sporification. This study focused on a collection of WDF strains that were later used to develop mycelium mats of leather-like materials. Twenty-one WDF strains were chosen based on the color, homogeneity, and consistency of their mycelia, and the growth rate of each strain was measured. To improve the consistency and thickness of the mats, an exclusive (newly patented) method was developed. The obtained materials and the corresponding pure mycelia grown in liquid culture were analyzed by both thermogravimetric analysis (TGA) and scanning electron microscopy (SEM) to evaluate their principal components and texture. TGA provided a semi-quantitative indication of the composition of the mycelia and mats, but it was hardly able to discriminate differences arising from the production process (liquid culture versus the patented method). SEM provided keen insight into the microstructure of both the mycelia and the mats, irrespective of composition, and allowed the dimensions of hyphae and pores to be determined. Although not exhaustive, TGA and SEM are complementary methods that can be used to characterize fungal strains based on features desirable for various applications in bio-based materials. Taking all of the results into account, the Fomitopsis iberica strain seems the most suitable for the development of leather-like materials.


Author(s):  
Dmitro Ganzha ◽  
Dmytro Ganzha ◽  
Borys Sploshnoy

The aim is to improve the beta-radiometric method for the quantitative indication of the 90Sr and 137Cs content in counting samples of plant material. Material and methods. In the Chernobyl exclusion zone (ChEZ) in 2017 and 2019, leaves of silver birch, black poplar, common reed, and sedge were collected, dried, and crushed for use as counting samples for beta radiometry and spectrometry. Measurements were made with a combined KRK-1 radiometer and an SEB 01-150 beta-radiation energy spectrometer. Results. In plant samples from the ChEZ, natural 40K is widespread, but its concentration is usually less than 1 % relative to the concentration of the technogenic radionuclides 90Sr+90Y and 137Cs; the beta radiation of 40K can therefore be ignored when measuring 90Sr and 137Cs. The measurements were carried out both without a spectral filter and through a thin molybdenum filter. Measurements without the filter give the total count rate of 90Sr+90Y and 137Cs radiation. The filter transmits 2–3.5 % of the low-energy beta radiation of 90Sr and 137Cs and more than 95 % of the high-energy radiation of 90Y; the ratio of the 90Y count rate without the filter to that with the filter is 2.14. The 90Sr concentration in the samples was determined from the 90Y measurements, and the 137Cs concentration from the fraction of the count rate remaining after the 90Sr+90Y contribution is subtracted. A comparison of radionuclide concentrations measured by beta radiometry and by spectrometry showed no significant difference between the two methods. Conclusions. The beta radiometry method for 90Sr and 137Cs consists of measuring the beta count rate of counting samples both without a spectral filter and through a thin molybdenum filter. Based on these results, a procedure for calculating the concentrations of 90Sr and 137Cs in counting samples of plant leaves was developed.
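The two-measurement scheme described in the abstract can be sketched as follows. This is a hedged illustration only: the count rates and calibration coefficients are hypothetical placeholders, and the assumption that 90Sr and 90Y contribute equal count rates in secular equilibrium is a simplification, not the authors' exact procedure.

```python
# Hedged sketch of the filter-based separation of 90Sr and 137Cs;
# calibration coefficients and count rates are hypothetical.
FILTER_RATIO_Y90 = 2.14  # 90Y count-rate ratio, no filter vs. Mo filter (from the abstract)

def concentrations(rate_no_filter, rate_with_filter, k_sr, k_cs):
    """Estimate 90Sr and 137Cs activity from two count rates (counts/s).

    rate_with_filter is dominated by high-energy 90Y betas, since the
    thin Mo filter passes >95 % of 90Y but only 2-3.5 % of 90Sr/137Cs.
    """
    rate_y90 = FILTER_RATIO_Y90 * rate_with_filter  # 90Y rate in the unfiltered geometry
    rate_sr_y = 2.0 * rate_y90                      # assume 90Sr contributes like 90Y (equilibrium)
    rate_cs = rate_no_filter - rate_sr_y            # remainder attributed to 137Cs
    # convert to concentration via (hypothetical) calibration coefficients
    return k_sr * rate_y90, k_cs * rate_cs

sr, cs = concentrations(rate_no_filter=120.0, rate_with_filter=20.0, k_sr=1.0, k_cs=1.0)
```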


2020 ◽  
Vol 55 (9) ◽  
pp. 885-892
Author(s):  
Franco M. Impellizzeri ◽  
Paolo Menaspà ◽  
Aaron J. Coutts ◽  
Judd Kalkhoven ◽  
Miranda J. Menaspà

The purpose of this 2-part commentary series is to explain why we believe our ability to control injury risk by manipulating training load (TL) in its current state is an illusion and why the foundations of this illusion are weak and unreliable. In part 1, we introduce the training process framework and contextualize the role of TL monitoring in the injury-prevention paradigm. In part 2, we describe the conceptual and methodologic pitfalls of previous authors who associated TL and injury in ways that limited their suitability for the derivation of practical recommendations. The first important step in the training process is developing the training program: the practitioner develops a strategy based on available evidence, professional knowledge, and experience. For decades, exercise strategies have been based on the fundamental training principles of overload and progression. Training-load monitoring allows the practitioner to determine whether athletes have completed training as planned and how they have coped with the physical stress. Training load and its associated metrics cannot provide a quantitative indication of whether particular load progressions will increase or decrease the injury risk, given the nature of previous studies (descriptive and at best predictive) and their methodologic weaknesses. The overreliance on TL has moved the attention away from the multifactorial nature of injury and the roles of other important contextual factors. We argue that no evidence supports the quantitative use of TL data to manipulate future training with the purpose of preventing injury. Therefore, determining “how much is too much” and how to properly manipulate and progress TL are currently subjective decisions based on generic training principles and our experience of adjusting training according to an individual athlete's response.
Our message to practitioners is to stop seeking overly simplistic solutions to complex problems and instead embrace the risks and uncertainty inherent in the training process and injury prevention.


Water ◽  
2019 ◽  
Vol 11 (3) ◽  
pp. 517 ◽  
Author(s):  
Karin M. de Bruijn ◽  
Carolina Maran ◽  
Mike Zygnerski ◽  
Jennifer Jurado ◽  
Andreas Burzel ◽  
...  

In order to increase the flood resilience of cities (i.e., their ability to cope with flood hazards), it is also crucial to make critical infrastructure functions resilient, since these are essential for urban society. Cities are complex systems with many actors from different disciplines and many interdependent critical infrastructure networks and functions. Common flood risk analysis techniques provide useful information but are not sufficient to obtain a complete overview of the effects of flooding and of potential measures to increase flood resilience related to critical infrastructure networks. Therefore, a more comprehensive approach is needed that helps access the knowledge of actors in a structured way. Fort Lauderdale, Florida (United States), has suffered from flood impacts, especially from disruptions to critical infrastructure. This paper shows how shared insight among different sectors and stakeholders into critical infrastructure resilience and potential resilience-enhancing measures was obtained using input from these actors. It also provides a first quantitative indication of resilience, expressed as the potential disruption due to floods and the effect of measures on resilience. The paper contributes to the existing literature on resilience specifically by considering the duration of disruption, by including critical infrastructure disruption in flood impact analysis, and by taking the step from resilience quantification to measures.


2011 ◽  
Vol 354-355 ◽  
pp. 1149-1156 ◽  
Author(s):  
Li Yun Ma ◽  
Quan Sheng Shi

Since there are many qualitative indexes in the safety assessment of an electric power supply company, an urgent issue is how to transform qualitative descriptions into quantitative indications so that the safety assessment becomes more scientific and accurate. Cloud theory, which builds on traditional fuzzy set theory and probability theory, accounts for both fuzziness and randomness, providing a powerful method for combining qualitative and quantitative information. This paper applies the cloud gravity-center evaluation method from cloud theory to the safety assessment of the electric power supply company, enabling a reasonable transformation from qualitative concepts to quantitative descriptions. A case analysis shows that the method is valid and scientific.
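As a rough illustration of how a gravity-center style evaluation turns weighted qualitative indexes into a single quantitative score, consider the sketch below. The index expectations, ideal values, and weights are invented for illustration; they are not the paper's cloud models or data.

```python
# Hedged illustration of a cloud gravity-center style evaluation.
# Each qualitative safety index is summarized by the expectation (Ex)
# of its cloud model; values and weights here are hypothetical.
def weighted_deviation(ex, ideal, weights):
    """Weighted deviation of the evaluated gravity center from the ideal state.

    ex[i]    - expectation of the i-th index's cloud model (evaluated state)
    ideal[i] - expectation of the same index in the ideal (fully safe) state
    """
    devs = [(e - i) / i for e, i in zip(ex, ideal)]        # normalized per-index deviation
    return sum(w * abs(d) for w, d in zip(weights, devs))  # 0 = ideal; larger = less safe

score = weighted_deviation(ex=[0.8, 0.9, 0.7], ideal=[1.0, 1.0, 1.0],
                           weights=[0.5, 0.3, 0.2])
```

The single score can then be mapped back onto a qualitative scale (e.g. "safe" / "marginal" / "unsafe") to close the qualitative-quantitative loop.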


2010 ◽  
Vol 29 (2) ◽  
pp. 139
Author(s):  
Brij Bhushan Tewari

A quantitative indication of the complex-formation process comes from the evaluation of stability constants, which characterize the equilibria corresponding to the successive addition of ligands. A method involving the paper electrophoretic technique is described for the study of binary complex systems in solution. The present method is based upon the migration of a spot of metal ions on a paper strip at different pH values of the background electrolyte. A graph of mobility against pH gives information about the binary complexes and permits calculation of their stability constants. The first and second stability constants of the [Be(II)-homoserine] and [Co(II)-homoserine] complexes were found to be 7.13 ± 0.02 and 6.11 ± 0.09 for Be(II), and 4.27 ± 0.07 and 3.47 ± 0.11 for Co(II) (logarithms of the stability constants), at an ionic strength of 0.1 mol/L and a temperature of 35 °C.
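The mobility-versus-pH analysis rests on the observed spot mobility being a mole-fraction-weighted average over the free metal ion and its successive complexes. A minimal sketch of that model, using the Be(II) constants reported above but invented species mobilities (the ligand concentration and mobility values are illustrative assumptions):

```python
# Hedged sketch of the mobility model behind the electrophoretic method.
# Species mobilities u and the ligand concentration are hypothetical;
# the cumulative constants use the Be(II) values from the abstract.
def observed_mobility(u, log_betas, ligand_conc):
    """Mole-fraction-weighted mobility of the M, ML, ML2, ... species.

    u         - mobilities of the free metal ion and successive complexes
    log_betas - cumulative formation constants log10(beta_n) = log10(K1*...*Kn)
    """
    # relative populations: 1 for M, beta_n * [L]^n for ML_n
    terms = [1.0] + [10**lb * ligand_conc**(n + 1)
                     for n, lb in enumerate(log_betas)]
    return sum(ui * t for ui, t in zip(u, terms)) / sum(terms)

# log beta1 = log K1 = 7.13; log beta2 = log K1 + log K2 = 7.13 + 6.11 = 13.24
um = observed_mobility(u=[6.0, 3.0, 1.0], log_betas=[7.13, 13.24],
                       ligand_conc=10**-7.13)
```

Near [L] = 1/K1 the spot mobility sits between that of the free ion and the 1:1 complex, which is why the inflection of the mobility-pH curve locates log K1.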


Author(s):  
H Lobato ◽  
C Ferri ◽  
J Faraway ◽  
N Orchard

No measurement method of a physical quantity can provide an exact, unequivocal result, owing to the infinite amount of information necessary to fully characterize both the physical quantity to be measured and the measuring process. A quantitative indication of the quality of a measurement result therefore needs to be given to enable its reliable use; uncertainty is one such indication. Provision of incorrect uncertainty statements for measurements performed by a co-ordinate measuring machine (CMM) may have very serious economic implications. In this study, the uncertainty of CMM measurements is estimated by a single parameter accounting for both systematic and random errors. The effects that environmental conditions (temperature), discretionary set-up parameters (probe extension, stylus length), and measuring-plan decisions (number of points) have on measurement uncertainty are then investigated. Interactions between such factors are also shown to be significant.
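One common way to fold systematic and random errors into a single figure is to add the absolute mean bias to an expanded standard deviation of repeat measurements. The sketch below assumes that generic estimator for illustration; it is not the authors' specific model, and the measurement values are invented.

```python
import statistics

# Hedged sketch: a single-parameter uncertainty combining systematic
# and random error components. Estimator and data are illustrative.
def single_parameter_uncertainty(measured, reference, k=2.0):
    """Combine the mean bias (systematic) and scatter (random) of repeats.

    measured  - repeat CMM indications of the same feature
    reference - calibrated reference value of that feature
    k         - coverage factor expanding the random component
    """
    errors = [m - reference for m in measured]
    bias = statistics.mean(errors)     # systematic component
    spread = statistics.stdev(errors)  # random component (sample std dev)
    return abs(bias) + k * spread      # single-figure expanded uncertainty

u = single_parameter_uncertainty([10.02, 10.01, 10.03, 10.02], reference=10.00)
```

Because both components feed one number, a factor that shifts the bias (e.g. temperature) and a factor that inflates the scatter (e.g. a long stylus) both enlarge the reported uncertainty.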


2006 ◽  
Vol 12 (3) ◽  
pp. 435-599 ◽  
Author(s):  
A. R. Jones ◽  
P. J. Copeman ◽  
E. R. Gibson ◽  
N. J. S. Line ◽  
J. A. Lowe ◽  
...  

ABSTRACT Reserving is important to our profession as it is a core activity for actuaries. The members of the General Insurance Reserving Issues Taskforce (GRIT) have been considering how actuaries can improve the way in which we do reserving in general insurance. We gathered our thoughts and recommendations together in a Consultation Paper which has been discussed widely in the profession. We are very grateful to everyone who shared their views and comments with us, particularly those who gave us written feedback. We have considered carefully all the feedback which we received and adapted our final report in response to this. Given the scope and importance of our remit, it is perhaps not surprising that this is not a short paper. We hope that Section 1 provides a reasonable summary. Generally, our view is that there are many things on which our profession should focus. However, it is also important to remind ourselves of the positive items of feedback which we heard from our stakeholders. In addition to many suggestions for things to do better, we consistently heard the message that actuaries play an extremely valuable role in general insurance. This is a major testimony to the progress which the actuarial profession has made in recent years in its ability to contribute to the general insurance industry. Perhaps it is because of this progress that now is an appropriate time for us, as a profession, to take a hard look at what we do in reserving, and ask ourselves whether there are any things which we could do differently. We hope that GRIT's report will facilitate this debate. GRIT's recommendations fall under the following key themes:
— Providing more transparency to our reserving methods and helping our stakeholders have more insight into the key reserving assumptions and decisions.
— Providing more information on uncertainty in our reserve estimates. In particular, we recommend that actuaries provide a quantitative indication of the range of outcomes for future claim payments, and that our profession defines a common vocabulary for communicating uncertainty.
— Understanding better the business we are reserving. We suggest a range of analyses and activities for doing this.
— Applying our standard actuarial reserving methods more consistently. We identify a list of specific areas where we believe that there is scope for improvement. Also, we believe that the actuarial training syllabus should be extended, and this leads to consideration of whether a more specialised general insurance actuarial qualification is needed.
— Understanding the implications of the underwriting cycle, which, we believe, influences the behaviour of claims development in a way that our reserving models do not currently capture. We suggest what we believe may be the foundations of a potentially more cycle-proof methodology, but this is an area which we believe will require much more research.
— Helping actuaries understand how behaviour can affect the reserve estimation process, particularly in the face of uncertainty. We make various suggestions in this area, including helping actuaries manage pressure from third parties.
We are convinced that, for our profession to implement these suggestions, it will require a concerted change management strategy and set of actions to embed changes into the way in which actuaries work. We believe that this will include:
— increasing the level of debate and research in the profession on claims reserving;
— a broader communication programme with the general insurance industry, covering, amongst other things, uncertainty and data quality;
— a sub-group of the GI Board with a specific focus on reserving, responsible for implementing GRIT's recommendations and dealing with new issues as they arise; and
— our profession resolving the conflicting pressures which will arise out of the extra work required for reserving by the GRIT recommendations.
There is one specific item where we have not made a recommendation. It has been suggested to us that many of the standard reserving methods in common use, such as the chain ladder, are not sophisticated, and that more sophisticated mathematical and statistical methods should be a priority. We do not agree with this. Whereas, in the longer term, this might be an important issue for our profession, we believe that the current focus for actuaries should be in the areas set out in this paper, such as understanding the business better. GRIT believes that the issues which we have identified are important for the future of our profession and the contribution which we can make to the general insurance industry.


Author(s):  
HENRY SELVARAJ ◽  
S. THAMARAI SELVI ◽  
D. SELVATHI ◽  
R. RAMKUMAR

This paper proposes an intelligent classification technique to identify two categories of MRI volume data: normal and abnormal. The manual interpretation of MRI slices based on visual examination by a radiologist or physician may lead to incorrect diagnosis when a large number of MRIs are analyzed. In this work, textural features are extracted from the MR data of patients and used to classify each patient as normal (healthy brain) or abnormal (tumor brain). The categorization is obtained using various classifiers, such as the support vector machine (SVM), radial basis function network, multilayer perceptron, and k-nearest neighbor. The performance of these classifiers is analyzed, and a quantitative indication of how much better the SVM performs compared with the other classifiers is presented. In an intelligent computer-aided health care system, the proposed classification system using the SVM classifier can be used to assist the physician in accurate diagnosis.
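A comparison of this kind can be sketched with scikit-learn. The synthetic features below stand in for the paper's MRI texture features, and the classifier hyperparameters are illustrative defaults, not the authors' settings.

```python
# Hedged sketch of a multi-classifier comparison; the synthetic data
# stand in for MRI texture features (normal vs. abnormal).
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC
from sklearn.neighbors import KNeighborsClassifier
from sklearn.neural_network import MLPClassifier

# two-class feature matrix mimicking extracted texture descriptors
X, y = make_classification(n_samples=200, n_features=10, random_state=0)

classifiers = {
    "SVM (RBF kernel)": SVC(kernel="rbf"),
    "k-NN": KNeighborsClassifier(n_neighbors=5),
    "MLP": MLPClassifier(max_iter=2000, random_state=0),
}

# mean 5-fold cross-validated accuracy gives the quantitative comparison
scores = {name: cross_val_score(clf, X, y, cv=5).mean()
          for name, clf in classifiers.items()}
```

Cross-validated accuracy (or sensitivity/specificity for a clinical setting) is what makes the "how much better is the SVM" question quantitative rather than anecdotal.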

