Overview of Enhanced Continuum Theories for Thermal and Mechanical Responses of the Microsystems in the Fast-Transient Process

Author(s):  
George Z. Voyiadjis ◽  
Danial Faghihi

The growing demand for the production and application of microscale devices and systems has motivated research on the behavior of small-volume materials. Computational models are of great interest as a means to advance the manufacturing of microdevices and to reduce the time needed to bring new products into application. Among the various numerical and computational techniques, approaches grounded in continuum theories remain preferable because of their minimal computational cost for simulations over realistic time scales and material structures. This paper reviews methods for addressing the thermal and mechanical responses of microsystems. The focus is on recent developments in enhanced continuum theories that address phenomena such as size and boundary effects as well as microscale heat transfer. The thermodynamic consistency of the theories is discussed, and microstructural mechanisms are taken into account as physical justification of the framework. The presented constitutive model is calibrated using an extensive set of microscale experimental measurements of thin metal films over a wide range of sample sizes and temperatures. An energy-based approach is presented to extract a first estimate of the interface model parameters from the results of nanoindentation tests.

2018 ◽  
Vol 25 (4) ◽  
pp. 1135-1143 ◽  
Author(s):  
Faisal Khan ◽  
Suresh Narayanan ◽  
Roger Sersted ◽  
Nicholas Schwarz ◽  
Alec Sandy

Multi-speckle X-ray photon correlation spectroscopy (XPCS) is a powerful technique for characterizing the dynamic nature of complex materials over a range of time scales, and it has been successfully applied to a wide range of systems. Recent developments in higher-frame-rate detectors, while aiding the study of faster dynamical processes, create large amounts of data that require parallel computational techniques to process in near real time. Here, an implementation of the multi-tau and two-time autocorrelation algorithms using the Hadoop MapReduce framework for distributed computing is presented. The system scales well with increasing data size and has been serving the users of beamline 8-ID-I at the Advanced Photon Source with near-real-time autocorrelations for the past five years.
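The production pipeline distributes the correlation over detector pixels with MapReduce; as a minimal single-pixel sketch (illustrative only, not the beamline code, and with parameter names of my choosing), the multi-tau scheme, linear lags within each level and pairwise binning between levels, can be written as:

```python
import numpy as np

def multi_tau_g2(intensity, levels=4, lags_per_level=8):
    """Multi-tau autocorrelation g2(tau) for a single-pixel intensity trace.

    Lags are spaced linearly within each level; the signal is binned by 2
    between levels, giving quasi-logarithmic lag coverage at low cost.
    """
    x = np.asarray(intensity, dtype=float)
    taus, g2 = [], []
    dt = 1  # lag step at the current level, in original frame units
    for level in range(levels):
        # level 0 uses all lags; later levels only the upper half,
        # since the lower half is already covered by finer levels
        start = 1 if level == 0 else lags_per_level // 2 + 1
        for lag in range(start, lags_per_level + 1):
            if lag >= len(x):
                break
            num = np.mean(x[:-lag] * x[lag:])
            den = np.mean(x[:-lag]) * np.mean(x[lag:])
            taus.append(lag * dt)
            g2.append(num / den)
        # bin pairs of frames before moving to the next (coarser) level
        n = len(x) // 2 * 2
        x = 0.5 * (x[0:n:2] + x[1:n:2])
        dt *= 2
    return np.array(taus), np.array(g2)
```

For a static (constant-intensity) pixel this returns g2 = 1 at every lag, the expected baseline.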


Author(s):  
Hiroki Yamashita ◽  
Guanchu Chen ◽  
Yeefeng Ruan ◽  
Paramsothy Jayakumar ◽  
Hiroyuki Sugiyama

A high-fidelity computational terrain dynamics model plays a crucial role in accurately predicting vehicle mobility performance under various maneuvering scenarios on deformable terrain. Although many computational models have been proposed using either finite element (FE) or discrete element (DE) approaches, the phenomenological constitutive assumptions in FE soil models make modeling complex granular terrain behavior very difficult, and DE soil models are computationally intensive, especially when a wide range of terrain must be considered. To address the limitations of existing deformable terrain models, this paper presents a hierarchical FE–DE multiscale tire–soil interaction simulation capability that can be integrated into a monolithic multibody dynamics solver for high-fidelity off-road mobility simulation using high-performance computing (HPC) techniques. It is demonstrated that the multiscale soil model substantially lowers computational cost compared to the corresponding pure DE model while maintaining solution accuracy. The multiscale tire–soil interaction model is validated against soil bin mobility test data under various wheel load and tire inflation pressure conditions, demonstrating the potential of the proposed method for resolving challenging vehicle–terrain interaction problems.


Author(s):  
F. Cosandey ◽  
S.-W. Chan ◽  
P. Stadelmann

Until recently, most of the information concerning the atomic structure of grain boundaries in metals had been obtained using molecular statics and molecular dynamics computational techniques. With the recent development of intermediate-voltage microscopes (300–400 kV), this situation has changed, and grain boundary atomic resolution is now possible for most metals. The purpose of this research is to examine the atomic structure of Σ=5 tilt boundaries in Au by high-resolution microscopy and to compare the results to computational models.

Thin-film Au bicrystals containing Σ=5 (θ=36.5°±0.5°) tilt grain boundaries were produced by epitaxial growth on NaCl bicrystalline substrates using a technique described in detail elsewhere. All high-resolution images were obtained with a Philips 430 ST microscope using axial illumination and no objective aperture. All image simulations were carried out using the multislice formalism with the EMS programs. All four {200} reflections from each crystal were used for the simulations, with the following instrumental parameters: accelerating voltage V = 300 kV, spherical aberration constant Cs = 1.1 mm, defocus spread Δ = 8 nm, and semi-angle of beam divergence α = 8 mrad.


2019 ◽  
Vol 42 (2) ◽  
pp. 167-183
Author(s):  
Haroon M. Barakat ◽  
Abdallh W. Aboutahoun ◽  
Naeema El-kadar

One of the most important properties of the normal mixture model is its flexibility to accommodate various types of distribution functions (dfs). We show that the mixture of the skew normal distribution and its reverse, obtained by adding a location parameter to the skew normal distribution and the same location parameter with opposite sign to its reverse, is a family of dfs that contains all possible types of dfs. Moreover, it covers a remarkably wide range of the indices of skewness and kurtosis. Computational techniques based on EM-type algorithms are employed to iteratively compute maximum likelihood estimates of the model parameters. An application to a real body mass index data set is also presented.
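As a simplified stand-in for the EM-type fitting described above (a plain two-component normal mixture rather than the skew normal family of the paper, with initialization choices that are purely illustrative), the E-step/M-step iteration looks like:

```python
import numpy as np

def em_two_normal_mixture(x, iters=200, tol=1e-8):
    """EM for a two-component normal mixture pi*N(mu1,s1^2) + (1-pi)*N(mu2,s2^2)."""
    x = np.asarray(x, dtype=float)
    # crude initialization from the sample quartiles
    pi, mu1, mu2 = 0.5, np.percentile(x, 25), np.percentile(x, 75)
    s1 = s2 = max(x.std(), 1e-6)
    ll_old = -np.inf
    for _ in range(iters):
        # E-step: posterior responsibility of component 1 for each point
        p1 = pi * np.exp(-0.5 * ((x - mu1) / s1) ** 2) / s1
        p2 = (1 - pi) * np.exp(-0.5 * ((x - mu2) / s2) ** 2) / s2
        r = p1 / (p1 + p2)
        # M-step: responsibility-weighted updates of the parameters
        pi = r.mean()
        mu1 = (r * x).sum() / r.sum()
        mu2 = ((1 - r) * x).sum() / (1 - r).sum()
        s1 = np.sqrt((r * (x - mu1) ** 2).sum() / r.sum())
        s2 = np.sqrt(((1 - r) * (x - mu2) ** 2).sum() / (1 - r).sum())
        ll = np.log(p1 + p2).sum()  # log-likelihood up to an additive constant
        if abs(ll - ll_old) < tol:
            break
        ll_old = ll
    return pi, (mu1, s1), (mu2, s2)
```

On well-separated simulated data the iteration recovers the component locations, scales, and mixing weight; the skew normal case in the paper replaces the normal densities and M-step updates accordingly.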


2020 ◽  
Author(s):  
Jingbai Li ◽  
Patrick Reiser ◽  
André Eberhard ◽  
Pascal Friederich ◽  
Steven Lopez

<p>Photochemical reactions are increasingly used to construct complex molecular architectures under mild and straightforward reaction conditions. Computational techniques are increasingly important for understanding the reactivities and chemoselectivities of photochemical isomerization reactions because they offer molecular bonding information along the excited state(s) during photodynamics. These photodynamics simulations are resource-intensive and are typically limited to 1–10 picoseconds and 1,000 trajectories due to their high computational cost. Most organic photochemical reactions have excited-state lifetimes exceeding 1 picosecond, which places them beyond the reach of such simulations. Westermayr <i>et al.</i> demonstrated that a machine learning approach could significantly lengthen photodynamics simulation times for a model system, the methylenimmonium cation (CH<sub>2</sub>NH<sub>2</sub><sup>+</sup>).</p><p>We have developed a Python-based code, Python Rapid Artificial Intelligence <i>Ab Initio</i> Molecular Dynamics (PyRAI<sup>2</sup>MD), to accomplish the unprecedented 10 ns <i>cis-trans</i> photodynamics of <i>trans</i>-hexafluoro-2-butene (CF<sub>3</sub>–CH=CH–CF<sub>3</sub>) in 3.5 days. The same simulation would take approximately 58 years with ground-truth multiconfigurational dynamics. We proposed an innovative scheme combining Wigner sampling, geometrical interpolations, and short-time quantum chemical trajectories to sample the initial data effectively, facilitating adaptive sampling to generate an informative and data-efficient training set of 6,232 data points. Our neural networks achieved chemical accuracy (mean absolute error of 0.032 eV). Our 4,814 trajectories reproduced the S<sub>1</sub> half-life (60.5 fs) and the photochemical product ratio (<i>trans</i>:<i>cis</i> = 2.3:1), and autonomously discovered a pathway towards a carbene. The neural networks also showed the capability to generalize the full potential energy surface from chemically incomplete data (<i>trans</i> → <i>cis</i> but not <i>cis</i> → <i>trans</i> pathways), which may enable future automated photochemical reaction discoveries.</p>
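As a toy illustration of the surrogate-model idea (not the PyRAI<sup>2</sup>MD code; the "potential", network size, and learning rate are all invented for this sketch), a small tanh network can be fitted by plain gradient descent to a one-dimensional double-well energy curve standing in for the multireference energies the paper's networks learn:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D "potential energy surface": a double well on [-1.5, 1.5]
x = np.linspace(-1.5, 1.5, 256)[:, None]
y = x ** 4 - x ** 2

# One-hidden-layer tanh network, trained by full-batch gradient descent
W1 = rng.normal(0, 1, (1, 32)); b1 = np.zeros(32)
W2 = rng.normal(0, 1, (32, 1)); b2 = np.zeros(1)
lr = 0.005

def forward(x):
    h = np.tanh(x @ W1 + b1)
    return h, h @ W2 + b2

losses = []
for _ in range(5000):
    h, pred = forward(x)
    err = pred - y
    losses.append(float((err ** 2).mean()))
    # backpropagation of the mean-squared-error gradient
    dpred = 2 * err / len(x)
    gW2 = h.T @ dpred; gb2 = dpred.sum(0)
    dh = (dpred @ W2.T) * (1 - h ** 2)   # tanh' = 1 - tanh^2
    gW1 = x.T @ dh;  gb1 = dh.sum(0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

mae = np.abs(forward(x)[1] - y).mean()
```

The real systems are of course high-dimensional and multi-state, and PyRAI<sup>2</sup>MD couples such surrogates to surface-hopping dynamics; this sketch only shows the regression core that makes nanosecond-scale propagation affordable.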


2020 ◽  
Vol 9 (3) ◽  
pp. 177-191
Author(s):  
Sridharan Priya ◽  
Radha K. Manavalan

Background: Diseases of the heart and blood vessels, such as heart attack, Coronary Artery Disease (CAD), Myocardial Infarction (MI), high blood pressure, and obesity, are generally referred to as Cardiovascular Diseases (CVD). The risk factors for CVD include gender, age, cholesterol/LDL, family history, hypertension, smoking, and genetic and environmental factors. Genome-Wide Association Studies (GWAS) focus on identifying the genetic interactions and genetic architectures of CVD. Objective: Genetic interaction, or epistasis, refers to interactions between two or more genes in which one gene masks the traits of another and increases susceptibility to CVD. Identifying epistatic relationships through biological or laboratory methods requires an enormous workforce and high cost. Hence, this paper reviews the various statistical and machine learning approaches proposed so far to detect genetic interaction effects in the identification of cardiovascular diseases such as CAD, MI, and hypertension, using HDL and lipid phenotype data and body mass index datasets. Conclusion: This study reveals that various computational models have identified candidate genes such as AGT, PAI-1, ACE, PTPN22, MTHR, FAM107B, ZNF107, PON1, PON2, GTF2E1, ADGRB3, and FTO that play a major role in genetic interactions underlying the causes of CVDs. The benefits, limitations, and issues of the various computational techniques for detecting epistasis responsible for cardiovascular diseases are discussed.
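A common statistical approach in this family is a regression-based interaction test; the sketch below (illustrative, not drawn from any specific study surveyed) uses a likelihood-ratio test for a g1 × g2 term in a logistic model of case/control status:

```python
import numpy as np

def fit_logistic(X, y, iters=50):
    """Newton-Raphson MLE for logistic regression; returns (beta, log-likelihood)."""
    beta = np.zeros(X.shape[1])
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-X @ beta))
        grad = X.T @ (y - p)
        H = (X * (p * (1 - p))[:, None]).T @ X
        beta += np.linalg.solve(H + 1e-9 * np.eye(X.shape[1]), grad)
    p = 1.0 / (1.0 + np.exp(-X @ beta))
    ll = np.sum(y * np.log(p + 1e-12) + (1 - y) * np.log(1 - p + 1e-12))
    return beta, ll

def interaction_lrt(g1, g2, y):
    """Likelihood-ratio test for a g1 x g2 epistatic term on a binary trait.

    Genotypes g1, g2 are coded 0/1/2 (minor-allele counts); y is case/control.
    Returns the LRT statistic; values above 3.84 are significant at the 5%
    level (chi-square, 1 df).
    """
    ones = np.ones(len(y))
    X0 = np.column_stack([ones, g1, g2])            # main effects only
    X1 = np.column_stack([ones, g1, g2, g1 * g2])   # plus interaction
    _, ll0 = fit_logistic(X0, np.asarray(y, float))
    _, ll1 = fit_logistic(X1, np.asarray(y, float))
    return 2.0 * (ll1 - ll0)
```

Exhaustively applying such a test to all marker pairs is what makes genome-wide epistasis detection computationally demanding, motivating the machine learning filters the review surveys.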


The recycling and reuse of materials and objects were extensive in the past, but have rarely been embedded into models of the economy; even more rarely has any attempt been made to assess the scale of these practices. Recent developments, including the use of large datasets, computational modelling, and high-resolution analytical chemistry, are increasingly offering the means to reconstruct recycling and reuse, and even to approach the thorny matter of quantification. Growing scholarly interest in the topic has also led to an increasing recognition of these practices from those employing more traditional methodological approaches, which are sometimes coupled with innovative archaeological theory. Thanks to these efforts, it has been possible for the first time in this volume to draw together archaeological case studies on the recycling and reuse of a wide range of materials, from papyri and textiles, to amphorae, metals and glass, building materials and statuary. Recycling and reuse occur at a range of site types, and often in contexts which cross-cut material categories, or move from one object category to another. The volume focuses principally on the Roman Imperial and late antique world, over a broad geographical span ranging from Britain to North Africa and the East Mediterranean. Last, but not least, the volume is unique in focusing upon these activities as a part of the status quo, and not just as a response to crisis.


Polymers ◽  
2020 ◽  
Vol 12 (10) ◽  
pp. 2237 ◽  
Author(s):  
P. R. Sarika ◽  
Paul Nancarrow ◽  
Abdulrahman Khansaheb ◽  
Taleb Ibrahim

Phenol–formaldehyde (PF) resin continues to dominate the resin industry more than 100 years after its first synthesis. Its versatile properties, such as thermal stability, chemical resistance, fire resistance, and dimensional stability, make it a suitable material for a wide range of applications. PF resins have been used in the wood industry as adhesives, in paints and coatings, and in the aerospace, construction, and building industries as composites and foams. Currently, petroleum is the key source of raw materials used in manufacturing PF resin. However, increasing environmental pollution and fossil fuel depletion have driven industries to seek sustainable alternatives to petroleum-based raw materials. Over the past decade, researchers have replaced phenol and formaldehyde with sustainable materials such as lignin, tannin, cardanol, hydroxymethylfurfural, and glyoxal to produce bio-based PF resin. Several synthesis modifications aimed at improving the properties of bio-based phenolic resin are currently under investigation. This review discusses recent developments in the synthesis of PF resins, particularly those created from sustainable raw material substitutes, and modifications applied to the synthetic route in order to improve the mechanical properties.


Genetics ◽  
2000 ◽  
Vol 156 (1) ◽  
pp. 457-467 ◽  
Author(s):  
Z W Luo ◽  
S H Tao ◽  
Z-B Zeng

Abstract Three approaches are proposed in this study for detecting or estimating linkage disequilibrium between a polymorphic marker locus and a locus affecting quantitative genetic variation, using samples from random-mating populations. It is shown that disequilibrium over a wide range of circumstances may be detected with a power of 80% by using phenotypic records and marker genotypes of a few hundred individuals. Comparison of the ANOVA and regression methods in this article with the transmission disequilibrium test (TDT) shows that, given the genetic variance explained by the trait locus, the power of the TDT depends on the trait allele frequency, whereas the power of the ANOVA and regression analyses is relatively independent of the allele frequency. The TDT method is more powerful when the trait allele frequency is low, but much less powerful when it is high. The likelihood analysis provides reliable estimation of the model parameters when the QTL variance is at least 10% of the phenotypic variance and a sample size of a few hundred is used. Potential use of these estimates in mapping the trait locus is also discussed.
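The regression approach compared with the TDT can be sketched minimally as a single-marker regression of phenotype on genotype (an illustration of the idea, not the authors' code; the 0/1/2 allele-count coding is an assumption):

```python
import numpy as np

def marker_regression(genotype, phenotype):
    """Single-marker regression of a quantitative phenotype on genotype.

    Genotypes are coded 0/1/2 (copies of one marker allele). Returns the
    least-squares slope and its t-statistic; for large samples, |t| above
    ~1.96 indicates marker-trait association at the 5% level, which is
    evidence of linkage disequilibrium with a trait locus.
    """
    g = np.asarray(genotype, dtype=float)
    y = np.asarray(phenotype, dtype=float)
    gc = g - g.mean()
    b = (gc * y).sum() / (gc ** 2).sum()      # least-squares slope
    a = y.mean() - b * g.mean()
    resid = y - a - b * g
    n = len(y)
    s2 = (resid ** 2).sum() / (n - 2)         # residual variance
    se = np.sqrt(s2 / (gc ** 2).sum())        # standard error of the slope
    return b, b / se
```

Unlike the TDT, which conditions on parental transmissions, this population-level test uses only unrelated individuals' records, which is why its power is comparatively insensitive to the trait allele frequency.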


Author(s):  
Hernâni Marques ◽  
Pedro Cruz-Vicente ◽  
Tiago Rosado ◽  
Mário Barroso ◽  
Luís A. Passarinha ◽  
...  

Exposure to environmental tobacco smoke (ETS) and smoking have been described as among the most prevalent factors in the development of certain diseases worldwide. According to the World Health Organization, more than 8 million people die every year due to tobacco, around 7 million from direct tobacco use and the remainder from exposure to second-hand smoke. Both active and second-hand exposure can be measured and controlled using specific biomarkers of tobacco and its derivatives, allowing the development of more efficient public health policies. Exposure to these compounds can be measured by different methods (involving, for instance, liquid- or gas-chromatographic procedures) in a wide range of biological specimens to estimate the type and degree of tobacco exposure. In recent years, considerable research has been carried out using different extraction methods and different analytical equipment; liquid–liquid extraction, solid-phase extraction, and even miniaturized procedures have been used, followed by chromatographic analysis coupled mainly to mass spectrometric detection. Through such methodologies, second-hand smokers can be distinguished from active smokers, and the same holds for e-cigarette users and vapers, among others, using their specific biomarkers. This review focuses on recent developments in the determination of tobacco smoke biomarkers, including nicotine and other tobacco alkaloids, specific nitrosamines, and polycyclic aromatic hydrocarbons. The methods for their detection are discussed in detail, as well as the potential use of threshold values to distinguish between types of exposure.

