Multiscale unified prediction of size/scale and Hall-Petch effects in the mechanics of polycrystalline materials

2013 ◽  
Vol 22 (1-2) ◽  
pp. 67-71 ◽  
Author(s):  
George N. Frantziskonis

Abstract. Materials show size effects in their strength, i.e., strength improves as size decreases. Size effects have been studied extensively over a wide range of scales, from atomistic to continuum. They depend on the scale of reference, since the physics change as the scale increases or decreases. The work reported here concentrates on scales near the average grain size in polycrystalline solids, where size effects are examined in conjunction with Hall-Petch effects. It presents a process for isolating physical information on a problem at specific spatial or temporal scales and applies it to Hall-Petch and size effects in one spatial dimension, extendable to higher dimensions. Importantly, the scale-isolated information captures the interactions among scales. Because material failure and Hall-Petch effects are highly stochastic, a probabilistic approach to the present work is more appropriate than a deterministic one.
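The grain-size strengthening discussed here is usually summarized by the classical Hall-Petch relation, σ_y = σ_0 + k·d^(-1/2). A minimal sketch of that relation (the constants below are illustrative, not values from the paper):

```python
import math

def hall_petch_yield(sigma0_mpa, k_mpa_sqrt_m, grain_size_m):
    """Classical Hall-Petch relation: yield strength rises as 1/sqrt(d)."""
    return sigma0_mpa + k_mpa_sqrt_m / math.sqrt(grain_size_m)

# Illustrative constants (not from the paper): sigma0 = 25 MPa, k = 0.15 MPa*sqrt(m)
for d_um in (100.0, 10.0, 1.0):
    d = d_um * 1e-6  # micrometres -> metres
    print(f"d = {d_um:6.1f} um -> sigma_y = {hall_petch_yield(25.0, 0.15, d):7.1f} MPa")
```

Refining the grain size from 100 µm to 1 µm raises the predicted yield strength several-fold, which is the deterministic baseline that the paper's probabilistic, scale-coupled treatment generalizes.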

Computers ◽  
2021 ◽  
Vol 10 (6) ◽  
pp. 82
Author(s):  
Ahmad O. Aseeri

Deep learning-based methods have emerged as one of the most effective and practical solutions to a wide range of medical problems, including the diagnosis of cardiac arrhythmias. A critical step toward early diagnosis of many heart dysfunctions is the accurate detection and classification of cardiac arrhythmias, which can be achieved via electrocardiograms (ECGs). Motivated by the desire to enhance conventional clinical methods for diagnosing cardiac arrhythmias, we introduce an uncertainty-aware deep learning-based predictive model for accurate large-scale classification of cardiac arrhythmias, successfully trained and evaluated on three benchmark medical datasets. In addition, because the quantification of uncertainty estimates is vital for clinical decision-making, our method incorporates a probabilistic approach that captures the model's uncertainty using a Bayesian approximation method without introducing additional parameters or significant changes to the network's architecture. Although many arrhythmia classification solutions with various ECG feature engineering techniques have been reported in the literature, the AI-based probabilistic method introduced in this paper outperforms existing methods, yielding multiclass F1 scores of 98.62% and 96.73% on the MIT-BIH dataset (20 annotations), 99.23% and 96.94% on the INCART dataset (eight annotations), and 97.25% and 96.73% on the BIDMC dataset (six annotations), for the deep ensemble and probabilistic modes, respectively. We also demonstrate the method's high performance and statistical reliability in numerical experiments on language modeling using the gating mechanism of recurrent neural networks.
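A common way to obtain Bayesian-style uncertainty without extra parameters, as the abstract describes, is Monte Carlo dropout: keep dropout active at inference and treat the spread of repeated stochastic forward passes as the model's uncertainty. A toy numpy sketch of the idea (the single random "layer" and parameters are illustrative, not the paper's network):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-layer classifier over 8 ECG-derived features and 3 arrhythmia classes.
# Weights are random placeholders; a real model would be trained.
W = rng.normal(size=(8, 3))

def predict_proba(x, drop_p=0.5):
    """One stochastic forward pass with dropout kept active (MC dropout)."""
    mask = rng.random(x.shape) >= drop_p          # random dropout mask
    logits = (x * mask / (1.0 - drop_p)) @ W      # inverted-dropout scaling
    e = np.exp(logits - logits.max())
    return e / e.sum()                            # softmax class probabilities

x = rng.normal(size=8)
samples = np.array([predict_proba(x) for _ in range(200)])
mean_prob = samples.mean(axis=0)   # predictive class probabilities
uncertainty = samples.std(axis=0)  # per-class spread = model uncertainty
print(mean_prob, uncertainty)
```

In a clinical setting, a prediction whose per-class spread is large would be flagged for human review rather than acted on automatically.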


Author(s):  
S.I. Spiridonov ◽  
◽  
V.V. Ivanov ◽  
I.E. Titov ◽  
V.E. Nushtaeva ◽  
...  

This paper presents a radioecological assessment of forage agricultural land in the southwestern districts of the Bryansk region based on data characterizing the variability of radionuclide content in the soil. The concentration of 137Cs in forage was calculated taking into account the probability distributions of the 137Cs soil contamination density and the soil-to-plant transfer factor. Processing of the radioecological survey data has shown that the 137Cs soil contamination density of agricultural lands in the southwestern areas of the Bryansk region obeys a lognormal law. The authors have used statistical models and software modules for the radioecological assessment of forage lands. Risks of exceeding the 137Cs content standards in forage obtained on soils with different textures have been calculated. The limiting levels of contamination of pastures and hayfields with 137Cs, ensuring compliance with the specified risks for forage, have been estimated. The lowest limiting soil contamination density is characteristic of organic soils, which can be considered "critical" from the point of view of 137Cs intake into forage. Based on a probabilistic approach, the authors have predicted the remediation time of forage lands in the southwestern districts of the Bryansk region in the absence of protective measures. The time period during which the risk of forage contamination for sandy, sandy loam and clay loam soils will decrease to 10% varies over a wide range for the areas under consideration, not exceeding 64 years. It is concluded that it is advisable to substantiate the value of the acceptable risk of forage contamination, taking into account radiological and socio-economic aspects.
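The risk calculation described here, the probability that forage 137Cs exceeds a standard when both the soil contamination density and the transfer factor are lognormal, can be sketched by Monte Carlo sampling. All distribution parameters and the permissible level below are hypothetical placeholders, not values from the paper:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000

# Hypothetical lognormal soil contamination density (kBq/m^2) and
# hypothetical lognormal soil-to-plant transfer factor (dimensionless).
density = rng.lognormal(mean=np.log(200.0), sigma=0.6, size=n)
transfer = rng.lognormal(mean=np.log(0.5), sigma=0.4, size=n)

forage_activity = density * transfer     # sampled 137Cs activity in forage
limit = 400.0                            # hypothetical permissible level
risk = (forage_activity > limit).mean()  # probability of exceeding the standard
print(f"risk of exceeding the standard: {risk:.3f}")
```

Because the product of two lognormal variables is itself lognormal, the same exceedance probability could also be written in closed form; the sampling version generalizes more easily when additional factors (soil texture classes, radioactive decay over time) are layered in.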


Author(s):  
Fonna Forman ◽  
Veerabhadran Ramanathan

With unchecked emissions of pollutants, global warming is projected to reach 1.5°C within 15 years, 2°C within 35 years and 4°C by 2100. These projections are central values, with a small (<5%) probability that warming by 2100 will exceed 6°C, with potentially catastrophic impacts on every human being, living and yet unborn. Climate is already changing in perceptible ways through floods, droughts, wildfires, heat waves and sea level rise, displacing communities and catalyzing migration. Climate migration describes the voluntary and forced movement of people within and across habitats due to changes in climate. While estimates vary from 25 million to as many as one billion climate change migrants by 2050, achieving reliable quantitative estimates of future climate migration faces forbidding obstacles due to: 1) a wide range of projected warming due to uncertainties in climate feedbacks; 2) the lack of a settled definition of climate migration; and 3) the causal complexity of migration due to variability in non-environmental factors such as bioregion, culture, economics, politics and individual circumstances. But waiting for reliable estimates creates unacceptable ethical risks. Therefore, we advocate a probabilistic approach to climate migration that accounts for both central and low-probability warming projections as the only ethical response to the unfolding crisis. We conclude that, in the absence of drastic mitigation actions, climate change-induced mass migration can become a major threat during the latter half of this century.


2019 ◽  
Author(s):  
Gabriela Aznar-Siguan ◽  
David N. Bresch

Abstract. The need to assess the risk of extreme weather events is ever increasing. In addition to quantifying risk today, aggravating factors such as high population growth and changing climate conditions matter, too. We present the open-source software CLIMADA, which integrates hazard, exposure and vulnerability to compute the metrics needed to assess risk and quantify socio-economic impact. The software design is modular and object-oriented, offering a simple collaborative framework and a parallelization strategy that allows for scalable computations on clusters. CLIMADA supports multi-hazard calculations and provides an event-based probabilistic approach that is globally consistent over a wide range of resolutions, suitable for whole-country to detailed local studies. This paper uses the platform to estimate and contextualize the damage of hurricane Irma in the Caribbean in 2017. Most of the affected islands are non-sovereign countries and also rely on overseas support when disaster strikes. The risk assessment performed for this region, based on data remotely available shortly before or hours after the landfall of Irma, proves to be close to the reported damage and hence demonstrates a method to provide readily available impact estimates and associated uncertainties in real time.
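The hazard-exposure-vulnerability combination at the core of such event-based risk models can be sketched in a few lines: each event's damage is the exposed asset value times a vulnerability (damage-fraction) function of the local hazard intensity, and the expected annual damage weights events by their annual frequency. This is an illustrative toy, not the CLIMADA API; all arrays and the vulnerability curve are made up:

```python
import numpy as np

# Hazard intensity per event and location (e.g. wind speed in m/s),
# annual event frequencies, and exposed asset values per location.
hazard = np.array([[20.0, 35.0],
                   [50.0, 60.0]])       # events x locations
freq = np.array([0.2, 0.01])            # events per year
exposure = np.array([1e6, 5e5])         # asset value per location

def vulnerability(intensity, midpoint=45.0, scale=20.0):
    """Smooth damage fraction in (0, 1) that increases with hazard intensity."""
    return 1.0 / (1.0 + np.exp(-(intensity - midpoint) / scale))

damage_per_event = (exposure * vulnerability(hazard)).sum(axis=1)
expected_annual_damage = float((freq * damage_per_event).sum())
print(f"expected annual damage: {expected_annual_damage:,.0f}")
```

Swapping in a synthetic probabilistic event set for the two events above is what turns a single historical storm like Irma into a full risk estimate with uncertainties.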


2007 ◽  
Vol 35 (4) ◽  
pp. 336-360 ◽  
Author(s):  
G. Dimitriadis ◽  
G. A. Vio

The identification of nonlinear dynamic systems is increasingly becoming a necessary part of vibration testing, and significant research effort is devoted to it. However, as current methodologies are still not suitable for the identification of general nonlinear systems, the subject is rarely introduced to undergraduate students. In this paper, recent progress in developing an expert approach to the identification of nonlinear systems is used to demonstrate the subject within the context of an undergraduate course, or as an introductory tool for postgraduate students. The demonstration is based around an expert-system software package designed to systematically apply a wide range of identification approaches to the system under investigation. It is shown that the software can be used to demonstrate the need for identification of nonlinear systems, the complexity of the procedure, the possibility of failure, and the good chances of success when enough physical information about the system is available.


Author(s):  
S. J. Lewis ◽  
C. E. Truman ◽  
D. J. Smith

So-called ‘local approach’ methods for fracture analyses, such as the commonly used Beremin model, are attractive as a means to predict component failure due to their flexibility and applicability to a wide range of geometries. However, in cases where cyclic loading occurs, resulting in the accumulation of plastic strain and accompanying residual stress, the validity of the Beremin approach is questionable. This work investigates the applicability of a range of alternative local approach methods to model material failure behaviour in such cases, as well as commenting on the calibration and physical basis of such methods.
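The Beremin model mentioned here combines the maximum principal stress over the plastically strained volume into a Weibull stress, σ_w = (Σ σ₁ᵢ^m ΔVᵢ / V₀)^(1/m), and maps it to a cleavage failure probability P_f = 1 − exp(−(σ_w/σ_u)^m). A minimal sketch with purely illustrative, uncalibrated parameters (the paper is precisely about when this calibration breaks down under cyclic loading):

```python
import numpy as np

# Illustrative Beremin parameters (not calibrated values from the paper).
m = 22.0          # Weibull modulus
sigma_u = 2500.0  # scaling stress (MPa)
V0 = 1e-3         # reference volume (mm^3)

# Maximum principal stress and volume of each plastically strained element.
sigma1 = np.array([1200.0, 1500.0, 1800.0])  # MPa
dV = np.array([2e-3, 1e-3, 5e-4])            # mm^3

sigma_w = np.sum(sigma1**m * dV / V0) ** (1.0 / m)  # Weibull stress
p_fail = 1.0 - np.exp(-((sigma_w / sigma_u) ** m))  # cleavage failure probability
print(f"sigma_w = {sigma_w:.0f} MPa, P_f = {p_fail:.3e}")
```

Because of the large exponent m, the hottest element dominates the sum, which is why prior plastic strain and residual stress, the focus of this work, can shift the predicted failure probability so strongly.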


2015 ◽  
Vol 8 (10) ◽  
pp. 4155-4170 ◽  
Author(s):  
L. Klüser ◽  
N. Killius ◽  
G. Gesell

Abstract. The cloud processing scheme APOLLO (AVHRR Processing scheme Over cLouds, Land and Ocean) has been in use for cloud detection and cloud property retrieval since the late 1980s. The physics of the APOLLO scheme still form the backbone of a range of cloud detection algorithms for AVHRR (Advanced Very High Resolution Radiometer) heritage instruments. The APOLLO_NG (APOLLO_NextGeneration) cloud processing scheme is a probabilistic interpretation of the original APOLLO method. It builds upon the physical principles that have served well in the original APOLLO scheme, although a couple of additional variables have been introduced in APOLLO_NG. Cloud detection is no longer performed as a binary yes/no decision based on these physical principles; it is instead expressed as a cloud probability for each satellite pixel. Consequently, the outcome of the algorithm can be tuned, depending on the purpose, from reliably identifying clear pixels to reliably identifying definitely cloudy pixels. The probabilistic approach allows retrieving not only the cloud properties (optical depth, effective radius, cloud top temperature and cloud water path) but also their uncertainties. APOLLO_NG is designed as a standalone cloud retrieval method robust enough for operational near-real-time use and for application to large amounts of historical satellite data. The radiative transfer solution is approximated by the same two-stream approach that was used for the original APOLLO, which allows the algorithm to be applied to a wide range of sensors without sensor-specific tuning. Moreover, it allows online calculation of the radiative transfer (i.e., within the retrieval algorithm), giving rise to a detailed probabilistic treatment of cloud variables. This study presents the algorithm for cloud detection and cloud property retrieval, together with the physical principles from the APOLLO legacy on which it is based. Furthermore, a couple of example results from NOAA-18 are presented.
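The shift from a binary cloud mask to a tunable per-pixel cloud probability can be illustrated with a single brightness-temperature test: pixels colder than the expected clear-sky value get a higher cloud probability, and the user picks thresholds on that probability depending on whether clear-conservative or cloud-conservative output is needed. This is a schematic toy, not the APOLLO_NG algorithm; the clear-sky temperature, width, and thresholds are invented:

```python
import numpy as np

# Hypothetical 11-um brightness temperatures (K) for four pixels.
bt = np.array([295.0, 281.0, 268.0, 252.0])

def cloud_probability(bt_k, clear_bt=290.0, width=8.0):
    """Pixels colder than the expected clear-sky BT get higher cloud probability."""
    return 1.0 / (1.0 + np.exp((bt_k - clear_bt) / width))

p = cloud_probability(bt)
confidently_clear = p < 0.1   # threshold for "reliably clear"  (tunable)
confidently_cloudy = p > 0.9  # threshold for "definitely cloudy" (tunable)
print(np.round(p, 2), confidently_clear, confidently_cloudy)
```

A cloud-clearing application would tighten the clear threshold; a cloud-climatology application would tighten the cloudy one, which is exactly the purpose-dependent tuning the abstract describes.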


2003 ◽  
Vol 36 (4) ◽  
pp. 1040-1049 ◽  
Author(s):  
H.-R. Wenk ◽  
S. Grigull

The wide availability of X-ray area detectors provides an opportunity to use synchrotron-radiation-based X-ray diffraction for the determination of preferred crystallite orientation in polycrystalline materials. These measurements are very fast compared with other techniques. Texture is immediately recognized as intensity variations along Debye rings in diffraction images, yet in many cases this information is not used because the quantitative treatment of texture information has not yet been developed into a standard technique. In special cases it is possible to interpret the texture information contained in these intensity variations intuitively. However, diffraction studies focused on the effects of texture on materials properties often require the full orientation distribution function (ODF), which can be obtained from spherical tomography analysis. In cases of high crystal symmetry (cubic and hexagonal), an approximation to the full ODF can be reconstructed from single diffraction images, as is demonstrated for textures in rolled copper and titanium sheets. Combined with area detectors, the reconstruction methods make the measurements fast enough to study orientation changes during phase transformations, recrystallization and deformation in situ, and even in real time, over a wide range of temperature and pressure conditions. The present work focuses on practical aspects of texture measurement and data processing procedures to make the latter available to the growing community of synchrotron users. It reviews previous applications and highlights some opportunities for synchrotron texture analysis based on case studies on different materials.


2014 ◽  
Vol 1004-1005 ◽  
pp. 158-162 ◽  
Author(s):  
Xiang Ting Hong ◽  
Fu Chen ◽  
Fei Chen ◽  
Wang Yu ◽  
Bo Rong Sang ◽  
...  

Microstructures of metal micro-parts after microforming at elevated temperatures must be evaluated, because mechanical properties depend on the average grain size. In this work, the effects of specimen diameter on the microstructure and microhardness of a hot-extruded AZ31B magnesium alloy were studied. An obvious size effect on the microstructure and microhardness of the alloy was observed. The size effects could be explained by differences in strain distribution and dislocation density between the two kinds of specimens.

