Probabilistic multiscale modeling of fracture in heterogeneous materials and structures

2020 ◽  
Vol 86 (7) ◽  
pp. 45-54
Author(s):  
A. M. Lepikhin ◽  
N. A. Makhutov ◽  
Yu. I. Shokin

The probabilistic aspects of multiscale modeling of the fracture of heterogeneous structures are considered. An approach combining homogenization methods with phenomenological and numerical models of fracture mechanics is proposed to assess the fracture probabilities of structurally heterogeneous materials. A model of a generalized heterogeneous structure is formulated, consisting of heterogeneous materials and regions of different scales containing cracks and crack-like defects. The scales are linked through kinematic conditions and a multiscale principle of virtual forces. The fracture probability is formulated as the conditional probability of successive nested fracture events at different scales. Cracks and crack-like defects are treated as the main sources of fracture. The distribution of defects is represented by Poisson ensembles, and the critical stresses at the crack tips are described by a Weibull model. Analytical expressions are obtained for the fracture probabilities of multiscale heterogeneous structures with multilevel limit states. To assess the fracture probabilities while accounting for the real morphology of heterogeneous structures, an approach based on a modified Monte Carlo method of statistical modeling is proposed. A feature of the proposed method is a three-level fracture scheme with numerical solution of the problems at the micro, meso and macro scales. The main variables are the generalized crack-driving forces and the crack growth resistance, and the crack sizes are treated as generalized coordinates. To reduce the dimensionality, the fracture mechanics problem is reformulated as a problem of stability of a heterogeneous structure under load, with variations of the generalized coordinates and analysis of the virtual work of the generalized forces. Expressions for estimating the fracture probabilities of multiscale heterogeneous structures with the modified Monte Carlo method are obtained. The prospects of using the developed approaches to assess fracture probabilities and to address risk analysis problems for heterogeneous structures are shown.
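As a rough illustration of the probabilistic ingredients named in this abstract (a Poisson ensemble of crack-like defects combined with Weibull-distributed critical stresses in a weakest-link Monte Carlo estimate), a minimal Python sketch might look as follows. The defect intensity, Weibull parameters and applied stress are hypothetical, and the sketch omits the authors' three-level micro/meso/macro scheme and generalized-force formulation.

```python
# Minimal sketch, not the authors' method: weakest-link Monte Carlo estimate of a
# fracture probability with a Poisson ensemble of defects and Weibull critical
# stresses. All numerical parameters below are assumed for illustration.
import numpy as np

rng = np.random.default_rng(0)

LAMBDA = 5.0            # mean number of crack-like defects per component (assumed)
WEIBULL_SHAPE = 4.0     # Weibull modulus of the critical stress (assumed)
WEIBULL_SCALE = 300.0   # characteristic critical stress, MPa (assumed)
APPLIED_STRESS = 180.0  # macroscale stress acting on each defect, MPa (assumed)

def component_fails(n_defects: int) -> bool:
    """Weakest link: the component fractures if any defect's critical stress
    falls below the applied stress."""
    if n_defects == 0:
        return False
    sigma_crit = WEIBULL_SCALE * rng.weibull(WEIBULL_SHAPE, size=n_defects)
    return bool(np.any(sigma_crit < APPLIED_STRESS))

N_SAMPLES = 100_000
defect_counts = rng.poisson(LAMBDA, size=N_SAMPLES)
n_failed = sum(component_fails(k) for k in defect_counts)
print(f"Monte Carlo fracture probability ~ {n_failed / N_SAMPLES:.4f}")
```

Under these weakest-link assumptions the estimate should reproduce the closed-form value 1 - exp(-LAMBDA * P1), where P1 is the Weibull probability that a single defect is critical at the applied stress, which gives a convenient check on the sampling.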

2012 ◽  
Vol 479-481 ◽  
pp. 2001-2004
Author(s):  
Zhi Yong Zhang ◽  
Tian Shu Song ◽  
Yang He

A new method is presented in this paper. The fatigue life reliability of a submarine cone-cylinder shell is investigated, based on a combination of the conventional Monte Carlo method and classical probabilistic fracture mechanics. First, the Monte Carlo method is employed to obtain the reliability at a given initial fatigue life. Second, the two factors M1 and M2 introduced in the paper are estimated from the initial fatigue life and its reliability. Third, based on these two factors, the reliability at other fatigue lives is obtained using the classical probabilistic fracture mechanics method. Finally, numerical cases show that, compared with the Monte Carlo method, the proposed method is more efficient for fatigue life reliability without loss of accuracy. The method can also be applied to predict the fatigue life reliability of similar structures.
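The paper's factors M1 and M2 are not defined in this abstract and are not reproduced here; the sketch below only illustrates the first step, a plain Monte Carlo estimate of the reliability R(N0) = P(N_f > N0) at a given initial fatigue life, with the life obtained from a Paris-law crack-growth model whose coefficient is treated as a lognormal random variable. All parameters are assumed.

```python
# Illustrative sketch only (the paper's factors M1 and M2 are not reproduced):
# Monte Carlo estimate of the fatigue life reliability R(N0) = P(N_f > N0).
# The fatigue life follows from integrating the Paris law da/dN = C*(dK)^m with
# dK = Y*dS*sqrt(pi*a); the coefficient C is lognormal. Parameters are assumed.
import numpy as np

rng = np.random.default_rng(1)

M_EXP = 3.0                    # Paris exponent (assumed)
C_MEDIAN, C_COV = 1e-12, 0.3   # median and coefficient of variation of C (assumed)
A0, AC = 1e-3, 1e-2            # initial and critical crack sizes, m (assumed)
DS = 100.0                     # stress range, MPa (assumed)
Y = 1.12                       # geometry factor (assumed)
N0 = 2e6                       # target initial fatigue life, cycles (assumed)

def cycles_to_failure(c):
    """Closed-form integral of the Paris law from A0 to AC (valid for m != 2)."""
    k = c * (Y * DS * np.sqrt(np.pi)) ** M_EXP
    p = 1.0 - M_EXP / 2.0
    return (AC ** p - A0 ** p) / (k * p)

sigma_ln = np.sqrt(np.log(1.0 + C_COV ** 2))
c_samples = C_MEDIAN * np.exp(rng.normal(0.0, sigma_ln, size=200_000))
lives = cycles_to_failure(c_samples)
print(f"R(N0 = {N0:.0e} cycles) ~ {np.mean(lives > N0):.4f}")
```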


Author(s):  
Zhuangzhuang He ◽  
Lijun Li ◽  
Taikun Wang ◽  
Yantao Wang ◽  
Xudong Yang ◽  
...  

It is reported that carbon nanotube (CNT)-based conductive polymer composites have potential applications in structural health monitoring and flexible sensors. However, the current price of CNTs is relatively high compared with other fillers. To reduce the material cost while preserving the sensing characteristics of this type of material, the minimum amount of CNTs needed should be found; this balance value is called the electrical percolation threshold (EPT) in this study. First, a large number of numerical models containing three-dimensionally randomly distributed CNTs in an epoxy resin matrix are established by the Monte Carlo method. Then, the construction of the conductive network is observed in these models, and the influence of electron tunneling between adjacent CNTs on the EPT is investigated. Furthermore, the influence of the length-to-diameter ratio (L/D), length variation, and angle distribution of the CNTs on the EPT is investigated. This research provides useful information on how to produce conductive composites more economically.
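Under assumed geometric parameters, a much simplified version of such a Monte Carlo percolation study can be sketched as follows: straight CNTs are dropped at random into a unit cube, two CNTs count as connected when points sampled along them come within the CNT diameter plus a tunneling cutoff, and the spanning probability is estimated as a function of the number of CNTs. The sampled-point contact test, the parameter values and the cluster bookkeeping are simplifications, not the authors' model.

```python
# Simplified sketch (not the paper's model): spanning probability of a 3D random
# CNT network versus filler count. CNTs are straight segments in a unit cube;
# two CNTs are "connected" when any pair of points sampled along them is closer
# than the CNT diameter plus an electron-tunneling cutoff. Parameters are assumed.
import numpy as np

rng = np.random.default_rng(2)

L_CNT = 0.4      # CNT length in box units (assumed)
D_CNT = 0.02     # CNT diameter (assumed), i.e. L/D = 20
TUNNEL = 0.01    # tunneling cutoff distance (assumed)
N_PTS = 8        # points sampled along each CNT for the contact test

def random_cnts(n):
    """n straight CNTs: random start point plus a random unit direction."""
    start = rng.random((n, 3))
    v = rng.normal(size=(n, 3))
    v /= np.linalg.norm(v, axis=1, keepdims=True)
    t = np.linspace(0.0, L_CNT, N_PTS)
    return start[:, None, :] + t[:, None] * v[:, None, :]   # shape (n, N_PTS, 3)

def find(parent, i):
    """Union-find root lookup with path halving."""
    while parent[i] != i:
        parent[i] = parent[parent[i]]
        i = parent[i]
    return i

def spans(pts):
    """True if a single connected cluster touches both the z=0 and z=1 faces."""
    n, cutoff = len(pts), D_CNT + TUNNEL
    parent = list(range(n))
    for i in range(n):
        for j in range(i + 1, n):
            d = np.linalg.norm(pts[i][:, None, :] - pts[j][None, :, :], axis=-1)
            if d.min() < cutoff:
                parent[find(parent, i)] = find(parent, j)
    bottom = pts[..., 2].min(axis=1) < cutoff
    top = pts[..., 2].max(axis=1) > 1.0 - cutoff
    return bool({find(parent, i) for i in range(n) if bottom[i]}
                & {find(parent, i) for i in range(n) if top[i]})

for n_cnt in (50, 100, 200, 400):
    hits = sum(spans(random_cnts(n_cnt)) for _ in range(20))
    print(f"{n_cnt:4d} CNTs: spanning probability ~ {hits / 20:.2f}")
```

The filler count at which the spanning probability rises steeply plays the role of the EPT in this sketch; the paper resolves it with far more realistic CNT geometry and tunneling physics.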


2006 ◽  
Vol 306-308 ◽  
pp. 917-922
Author(s):  
Akiyuki Takahashi ◽  
Naoki Soneda ◽  
Masanori Kikuchi

This paper describes a computer simulation of the thermal ageing process in an Fe-Cu alloy. To perform an accurate numerical simulation, we first build numerical models of the diffusion and dissociation of Cu and Cu-vacancy clusters. This modeling is performed with the kinetic lattice Monte Carlo method, which allows long-time simulation of vacancy diffusion in a dilute Fe-Cu alloy. The models are then used as input to kinetic Monte Carlo (KMC) simulations of thermal ageing in the Fe-Cu alloy. The results of the KMC simulations show that our new models describe the rate and kinetics of the diffusion and dissociation of Cu and Cu-vacancy clusters well and work well in kinetic Monte Carlo simulations. Finally, we discuss further applications of these numerical models.
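A toy version of the kinetic lattice Monte Carlo idea, far simpler than the models described here, is sketched below: a single vacancy on a simple cubic Fe lattice containing dilute Cu performs nearest-neighbour exchanges selected by the residence-time (BKL) algorithm, with different assumed migration barriers for Fe-vacancy and Cu-vacancy exchanges. The barriers, attempt frequency, temperature and composition are all assumptions.

```python
# Toy kinetic lattice Monte Carlo sketch (much simpler than the paper's models):
# one vacancy migrates on a periodic simple cubic Fe lattice with dilute Cu.
# Jump rates follow an Arrhenius law with assumed barriers and prefactor.
import numpy as np

rng = np.random.default_rng(3)

L = 20                             # lattice edge length in sites
X_CU = 0.015                       # Cu atomic fraction (assumed)
KB, T = 8.617e-5, 600.0            # Boltzmann constant (eV/K), temperature (K)
NU0 = 1e13                         # attempt frequency, 1/s (assumed)
E_MIG = {"Fe": 0.65, "Cu": 0.55}   # vacancy migration barriers, eV (assumed)

lattice = np.where(rng.random((L, L, L)) < X_CU, "Cu", "Fe")
vac = np.array([L // 2] * 3)
lattice[tuple(vac)] = "V"
NEIGH = np.array([[1, 0, 0], [-1, 0, 0], [0, 1, 0], [0, -1, 0], [0, 0, 1], [0, 0, -1]])

t = 0.0
for _ in range(50_000):
    sites = (vac + NEIGH) % L                      # periodic boundaries
    species = lattice[sites[:, 0], sites[:, 1], sites[:, 2]]
    rates = np.array([NU0 * np.exp(-E_MIG[s] / (KB * T)) for s in species])
    total = rates.sum()
    # Residence-time (BKL) algorithm: choose one event with probability
    # rate/total, then advance the clock by an exponential waiting time.
    k = rng.choice(len(rates), p=rates / total)
    t += rng.exponential(1.0 / total)
    target = tuple(sites[k])
    lattice[tuple(vac)], lattice[target] = lattice[target], "V"
    vac = sites[k]

print(f"simulated physical time after 50,000 vacancy jumps: {t:.3e} s")
```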


2005 ◽  
Vol 62 (11) ◽  
pp. 4010-4026 ◽  
Author(s):  
Vincent E. Larson ◽  
Jean-Christophe Golaz ◽  
Hongli Jiang ◽  
William R. Cotton

Abstract One problem in computing cloud microphysical processes in coarse-resolution numerical models is that many microphysical processes are nonlinear and small in scale. Consequently, there are inaccuracies if microphysics parameterizations are forced with grid box averages of model fields, such as liquid water content. Rather, the model needs to determine information about subgrid variability and input it into the microphysics parameterization. One possible solution is to assume the shape of the family of probability density functions (PDFs) associated with a grid box and sample it using the Monte Carlo method. In this method, the microphysics subroutine is called repeatedly, once with each sample point. In this way, the Monte Carlo method acts as an interface between the host model’s dynamics and the microphysical parameterization. This avoids the need to rewrite the microphysics subroutines. A difficulty with the Monte Carlo method is that it introduces into the simulation statistical noise or variance, associated with the finite sample size. If the family of PDFs is tractable, one can sample solely from cloud, thereby improving estimates of in-cloud processes. If one wishes to mitigate the noise further, one needs a method for reduction of variance. One such method is Latin hypercube sampling, which reduces noise by spreading out the sample points in a quasi-random fashion. This paper formulates a sampling interface based on the Latin hypercube method. The associated family of PDFs is assumed to be a joint normal/lognormal (i.e., Gaussian/lognormal) mixture. This method of variance reduction has a couple of advantages. First, the method is general: the same interface can be used with a wide variety of microphysical parameterizations for various processes. Second, the method is flexible: one can arbitrarily specify the number of hydrometeor categories and the number of calls to the microphysics parameterization per grid box per time step. This paper performs a preliminary test of Latin hypercube sampling. As a prototypical microphysical formula, this paper uses the Kessler autoconversion formula. The PDFs that are sampled are extracted diagnostically from large-eddy simulations (LES). Both stratocumulus and cumulus boundary layer cases are tested. In this diagnostic test, the Latin hypercube can produce somewhat less noisy time-averaged estimates of Kessler autoconversion than a traditional Monte Carlo estimate, with no additional calls to the microphysics parameterization. However, the instantaneous estimates are no less noisy. This paper leaves unanswered the question of whether the Latin hypercube method will work well in a prognostic, interactive cloud model, but this question will be addressed in a future manuscript.
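The variance-reduction argument can be illustrated in miniature, outside any host model: draw cloud water from an assumed lognormal in-cloud PDF, pass each sample point through the Kessler autoconversion formula A = k max(q_c - q_crit, 0), and compare the scatter of plain Monte Carlo and Latin hypercube estimates of the grid-box mean at the same number of microphysics calls. The PDF parameters and sample sizes below are illustrative, and the sketch is one-dimensional rather than the paper's joint normal/lognormal mixture.

```python
# Illustrative one-dimensional sketch (not the paper's sampling interface):
# plain Monte Carlo versus Latin hypercube estimates of the grid-box mean
# Kessler autoconversion rate, with q_c drawn from an assumed lognormal PDF.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(4)

K_AUTO = 1.0e-3                    # Kessler rate constant, 1/s
Q_CRIT = 0.5e-3                    # autoconversion threshold, kg/kg
MU, SIGMA = np.log(0.6e-3), 0.5    # lognormal parameters of q_c (assumed)

def kessler(qc):
    return K_AUTO * np.maximum(qc - Q_CRIT, 0.0)

def mc_estimate(n):
    qc = np.exp(MU + SIGMA * rng.standard_normal(n))
    return kessler(qc).mean()

def lhs_estimate(n):
    # One uniform draw per stratum of [0, 1), randomly permuted, then mapped
    # through the inverse CDF of the assumed lognormal.
    u = (rng.permutation(n) + rng.random(n)) / n
    qc = np.exp(MU + SIGMA * norm.ppf(u))
    return kessler(qc).mean()

N, REPS = 10, 2000   # 10 microphysics calls per grid box, repeated many times
mc = [mc_estimate(N) for _ in range(REPS)]
lhs = [lhs_estimate(N) for _ in range(REPS)]
print(f"std of MC estimate : {np.std(mc):.3e}")
print(f"std of LHS estimate: {np.std(lhs):.3e}")
```

Because the Latin hypercube spreads its sample points across the PDF, the run-to-run standard deviation of its estimate should come out noticeably smaller than the plain Monte Carlo value for the same number of calls, mirroring the reduced noise in the time-averaged estimates reported here.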


1974 ◽  
Vol 22 ◽  
pp. 307 ◽  
Author(s):  
Zdenek Sekanina

Abstract It is suggested that the outbursts of Periodic Comet Schwassmann-Wachmann 1 are triggered by impacts of interplanetary boulders on the surface of the comet’s nucleus. The existence of a cloud of such boulders in interplanetary space was predicted by Harwit (1967). We have used the hypothesis to calculate the characteristics of the outbursts – such as their mean rate, optically important dimensions of ejected debris, expansion velocity of the ejecta, maximum diameter of the expanding cloud before it fades out, and the magnitude of the accompanying orbital impulse – and found them reasonably consistent with observations, if the solid constituent of the comet is assumed in the form of a porous matrix of low-strength meteoric material. A Monte Carlo method was applied to simulate the distributions of impacts, their directions and impact velocities.
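None of the paper's numbers are given in this abstract, so the following Python fragment only gestures at the kind of calculation described: with an assumed boulder space density, relative speed and nucleus radius, it evaluates the mean impact rate and draws Monte Carlo samples of impact speeds and isotropic impact directions.

```python
# Heavily simplified sketch; all quantities are assumed, not taken from the paper.
import numpy as np

rng = np.random.default_rng(5)

R_NUCLEUS = 20e3        # nucleus radius, m (assumed)
N_BOULDER = 2e-20       # boulder number density, 1/m^3 (assumed)
V_REL_MEAN = 5e3        # mean relative impact speed, m/s (assumed)
SEC_PER_YEAR = 3.156e7

# Mean impact rate of a uniform boulder flux onto a sphere: rate = n * v * pi * R^2.
rate = N_BOULDER * V_REL_MEAN * np.pi * R_NUCLEUS ** 2
print(f"mean impact rate ~ {rate * SEC_PER_YEAR:.2f} per year")

# Monte Carlo samples of impact geometry and speed.
n = 10_000
cos_theta = rng.uniform(-1.0, 1.0, n)   # isotropic directions (cosine of polar angle)
speeds = rng.rayleigh(scale=V_REL_MEAN * np.sqrt(2.0 / np.pi), size=n)  # assumed speed law
print(f"median impact speed ~ {np.median(speeds) / 1e3:.1f} km/s")
print(f"fraction of impacts on the sunward hemisphere ~ {np.mean(cos_theta > 0):.2f}")
```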

