3-D TCAD Monte Carlo Device Simulator: State-of-the-art FinFET Simulation

2021 ◽  
Vol 16 (2) ◽  
pp. 1-11
Author(s):  
Gabriela Firpo Furtado ◽  
Vinícius Valduga de Almeida Camargo ◽  
Dragica Vasileska ◽  
Gilson Inácio Wirth

This work presents a comprehensive description of an in-house 3D Monte Carlo device simulator for physical modeling of FinFETs. The simulator was developed to properly account for variability effects and to study deeply scaled devices operating in the ballistic and quasi-ballistic regimes. The impact of random dopants and of charges trapped in the dielectric is considered by treating electron-electron and electron-ion interactions in real space. Metal gate granularity is included through gate work-function variation. The capability to evaluate these effects in nanometer-scale 3D devices makes the presented simulator unique, thus advancing the state of the art. The phonon scattering mechanisms, used to model electron transport in the pure silicon material system, were validated by comparing simulated drift velocities with available experimental data. The proper behavior of the device simulator is demonstrated in a series of studies of the electric potential in the device, the electron density, the carriers' energy and velocity, and the Id-Vg and Id-Vd curves.
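The core of an ensemble Monte Carlo device simulator of this kind alternates ballistic free flights with randomly timed scattering events. The sketch below illustrates that loop in its simplest textbook form (the self-scattering technique with a constant total rate `gamma`); the function names and the trivial `drift`/`scatter` callbacks are illustrative assumptions, not the authors' code.

```python
import math
import random

def free_flight_time(gamma):
    """Sample a free-flight duration from an exponential distribution
    using the self-scattering (constant total rate gamma) technique."""
    return -math.log(random.random()) / gamma

def simulate_electron(n_steps, gamma, scatter, drift, state):
    """Advance one electron: alternate ballistic drift under the local
    field with randomly timed scattering events that update momentum."""
    for _ in range(n_steps):
        dt = free_flight_time(gamma)
        state = drift(state, dt)   # integrate the equations of motion
        state = scatter(state)     # pick a mechanism, update momentum
    return state
```

In a full simulator, `scatter` would select among the phonon mechanisms mentioned in the abstract according to their partial rates, and `drift` would integrate motion in the self-consistent electric field.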

2021 ◽  
Vol 2021 (7) ◽  
Author(s):  
Simone Caletti ◽  
Oleh Fedkevych ◽  
Simone Marzani ◽  
Daniel Reichelt ◽  
Steffen Schumann ◽  
...  

Abstract We present a phenomenological study of angularities measured on the highest transverse-momentum jet in LHC events that feature the associated production of a Z boson and one or more jets. In particular, we study angularity distributions measured on jets with and without the SoftDrop grooming procedure. We begin our analysis by exploiting state-of-the-art Monte Carlo parton shower simulations, and we quantitatively assess the impact of next-to-leading order (NLO) matching and merging procedures. We then move to analytic resummation and arrive at an all-order expression that features the resummation of large logarithms at next-to-leading logarithmic (NLL) accuracy and is matched to the exact NLO result. Our predictions include the effect of soft emissions at large angles, treated as a power expansion in the jet radius, and non-global logarithms. Furthermore, matching to fixed order is performed in such a way as to ensure what is usually referred to as NLL′ accuracy. Our results account for realistic experimental cuts and can be easily compared to upcoming measurements of jet angularities from the LHC collaborations.
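For orientation, jet angularities in this line of work are conventionally defined as a weighted sum over jet constituents (the notation below is the common convention, assumed here rather than quoted from the paper):

```latex
\lambda_\alpha \;=\; \sum_{i \in \mathrm{jet}} z_i \left(\frac{\Delta_i}{R}\right)^{\alpha},
\qquad
z_i \;=\; \frac{p_{T,i}}{\sum_{j \in \mathrm{jet}} p_{T,j}},
```

where $\Delta_i$ is the angular distance of constituent $i$ from the jet axis, $R$ is the jet radius, and the exponent $\alpha > 0$ controls the sensitivity to wide-angle radiation.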


1995 ◽  
Vol 395 ◽  
Author(s):  
J. Kolnik ◽  
I.H. Oguzman ◽  
K.F. Brennan ◽  
R. Wang ◽  
P.P. Ruden

Abstract In this paper, we present ensemble Monte Carlo based calculations of electron initiated impact ionization in bulk zincblende GaN using a wavevector dependent formulation of the interband impact ionization transition rate. These are the first reported estimates, either theoretical or experimental, of the impact ionization rates in GaN. The transition rate is determined from Fermi’s golden rule for a two-body screened Coulomb interaction using a numerically determined dielectric function as well as by numerically integrating over all of the possible final states. The Monte Carlo simulator includes the full details of the first four conduction bands derived from an empirical pseudopotential calculation as well as all of the relevant phonon scattering mechanisms. It is found that the ionization rate has a relatively "soft" threshold.
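As a reminder of the starting point (the standard textbook form, not taken from the paper), Fermi's golden rule gives the transition rate between initial and final two-electron states as:

```latex
\Gamma_{i \to f} \;=\; \frac{2\pi}{\hbar}\,
\bigl|\langle f \,|\, V_{\mathrm{scr}} \,|\, i \rangle\bigr|^{2}\,
\delta\!\left(E_f - E_i\right),
```

where $V_{\mathrm{scr}}$ is the screened Coulomb interaction whose Fourier transform involves the wavevector-dependent dielectric function $\varepsilon(q)$, and the total ionization rate follows by numerically integrating this expression over all allowed final states.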


Author(s):  
Vladimíra Osadská

Abstract In this paper, we review basic stochastic methods that can be used to extend state-of-the-art deterministic analytical methods for risk analysis. We conclude that the standard deterministic analytical methods depend strongly on the practical experience and knowledge of the evaluator, and that stochastic methods should therefore be introduced. New risk analysis methods should account for the uncertainties in input values. We demonstrate how large the impact on the results of the analysis can be by solving a practical FMECA example with uncertainties modelled using Monte Carlo sampling.
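The idea can be sketched in a few lines: instead of assigning a single severity, occurrence, and detection score, each uncertain input is given a range and the Risk Priority Number RPN = S × O × D is propagated by Monte Carlo sampling. The uniform distributions and the specific ranges below are illustrative assumptions, not the paper's data.

```python
import random

def rpn_samples(n, sev, occ, det):
    """Propagate uncertain FMECA inputs to the Risk Priority Number
    RPN = S * O * D by Monte Carlo sampling. Each input is given as a
    (low, high) range and sampled uniformly (an illustrative choice)."""
    out = []
    for _ in range(n):
        s = random.uniform(*sev)
        o = random.uniform(*occ)
        d = random.uniform(*det)
        out.append(s * o * d)
    return out

random.seed(1)
samples = rpn_samples(10000, sev=(6, 8), occ=(3, 5), det=(2, 4))
mean = sum(samples) / len(samples)
```

The spread of `samples` (not just `mean`) is the payoff: it shows how far the ranking of failure modes can shift once evaluator uncertainty is made explicit.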


Author(s):  
Ginestra Bianconi

Defining the centrality of nodes and layers in multilayer networks is of fundamental importance for a variety of applications, from sociology to biology and finance. This chapter presents state-of-the-art centrality measures able to characterize the centrality of nodes, the influence of layers, and the centrality of replica nodes in multilayer and multiplex networks. These centrality measures include modifications of the eigenvector centrality, Katz centrality, PageRank centrality and Communicability to the multilayer network scenario. The chapter provides a comprehensive description of the research in the field and discusses the main advantages and limitations of the different definitions, allowing readers who wish to apply these techniques to choose the most suitable definition for their case study.
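One common route to such measures is to build the supra-adjacency matrix of the multiplex (layer adjacencies on the block diagonal, inter-layer couplings between replica nodes off-diagonal) and compute eigenvector centrality on it. The sketch below, with an assumed coupling strength `omega`, is a minimal illustration of that construction, not a definition from the chapter.

```python
def power_iteration(A, iters=200):
    """Leading eigenvector of a non-negative square matrix A
    (eigenvector centrality), by power iteration with L1 normalization."""
    n = len(A)
    x = [1.0 / n] * n
    for _ in range(iters):
        y = [sum(A[i][j] * x[j] for j in range(n)) for i in range(n)]
        norm = sum(y) or 1.0
        x = [v / norm for v in y]
    return x

# Two-layer multiplex on 2 nodes: the supra-adjacency matrix stacks the
# layer adjacencies on the diagonal and couples replica nodes off-diagonal.
intra1 = [[0, 1], [1, 0]]
intra2 = [[0, 0], [0, 0]]   # layer 2 has no intra-layer edge
omega = 1.0                 # inter-layer coupling strength (assumed)
A = [[intra1[0][0], intra1[0][1], omega, 0],
     [intra1[1][0], intra1[1][1], 0, omega],
     [omega, 0, intra2[0][0], intra2[0][1]],
     [0, omega, intra2[1][0], intra2[1][1]]]
centrality = power_iteration(A)  # entries 0-1: layer-1 replicas, 2-3: layer-2
```

As expected, the replicas in the connected layer come out more central than their counterparts in the empty layer, which is the kind of layer-resolved information these measures are designed to expose.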


Metals ◽  
2021 ◽  
Vol 11 (2) ◽  
pp. 250
Author(s):  
Jiří Hájek ◽  
Zaneta Dlouha ◽  
Vojtěch Průcha

This article is a response to the state of the art in monitoring the cooling capacity of quenching oils in industrial practice. Very often, a hardening shop requires a report with data on the cooling process for a particular quenching oil. However, the interpretation of the data can be rather difficult. The main goal of our work was to compare the various criteria used for evaluating quenching oils; those which prove essential for operation in tempering plants would then be introduced into practice. Furthermore, the article describes monitoring the changes in the properties of a quenching oil used in a hardening shop, the effects of quenching oil temperature on its cooling capacity, and the impact of water content on certain cooling parameters of selected oils. Cooling curves were measured (including cooling rates and the time to reach relevant temperatures) according to ISO 9950. The hardening power of the oil and the area below the cooling rate curve as a function of temperature (the amount of heat removed in the nose region of the continuous cooling transformation (CCT) curve) were calculated. V-values based on the work of Tamura, reflecting the steel type and its CCT curve, were calculated as well. All the data were compared against the hardness and microstructure on a section through a cylinder made of EN C35 steel cooled in the particular oil. Based on the results, criteria are recommended for assessing the suitability of a quenching oil for a specific steel grade and product size. The quenching oils used in the experiment were Houghto Quench C120, Paramo TK 22, Paramo TK 46, CS Noro MO 46 and Durixol W72.
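Two of the criteria mentioned above, the cooling rate derived from the measured cooling curve and the area under the rate-versus-temperature curve in a chosen temperature window (e.g. the CCT nose region), are straightforward to compute from sampled data. The sketch below is an illustrative reconstruction under simple finite-difference and trapezoidal assumptions, not the article's evaluation procedure.

```python
def cooling_rate(t, T):
    """Finite-difference cooling rate dT/dt from a measured cooling
    curve: time t [s] and probe temperature T [degC], per ISO 9950-style
    sampled data. Returns len(T)-1 interval rates (negative on cooling)."""
    return [(T[i + 1] - T[i]) / (t[i + 1] - t[i]) for i in range(len(T) - 1)]

def heat_removed_proxy(T, rate, T_lo, T_hi):
    """Trapezoidal area under the |cooling rate| vs. temperature curve
    between T_lo and T_hi (e.g. the CCT nose region) -- a proxy for the
    amount of heat removed there, as described in the abstract."""
    pts = [(Ti, abs(r)) for Ti, r in zip(T, rate) if T_lo <= Ti <= T_hi]
    pts.sort()
    return sum((pts[i + 1][0] - pts[i][0]) * (pts[i][1] + pts[i + 1][1]) / 2
               for i in range(len(pts) - 1))
```

With real probe data sampled at high frequency, the same two functions give the rate curve and the nose-region area used to rank oils against one another.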


Author(s):  
Florian Kuisat ◽  
Fernando Lasagni ◽  
Andrés Fabián Lasagni

Abstract It is well known that the surface topography of a part can affect its mechanical performance, which is typical in additive manufacturing. In this context, we report on the surface modification of additively manufactured components made of Titanium 64 (Ti64) and Scalmalloy®, using a pulsed laser, with the aim of reducing their surface roughness. In our experiments, a nanosecond-pulsed infrared laser source with variable pulse durations between 8 and 200 ns was applied. The impact of varying a large number of parameters on the surface quality of the smoothed areas was investigated. The results demonstrated a reduction of surface roughness Sa by more than 80% for Titanium 64 and by 65% for Scalmalloy® samples. This extends the applicability of additively manufactured components beyond the current state of the art and breaks new ground for their use in various industrial fields such as aerospace.


Author(s):  
Sebastian Eisele ◽  
Fabian M. Draber ◽  
Steffen Grieshammer

First principles calculations and Monte Carlo simulations reveal the impact of defect interactions on the hydration of barium-zirconate.


Author(s):  
Stephan Schlupkothen ◽  
Gerd Ascheid

Abstract The localization of multiple wireless agents via, for example, distance and/or bearing measurements is challenging, particularly if relying on beacon-to-agent measurements alone is insufficient to guarantee accurate localization. In these cases, agent-to-agent measurements also need to be considered to improve the localization quality. In the context of particle filtering, the computational complexity of tracking many wireless agents is high when relying on conventional schemes. This is because in such schemes, all agents’ states are estimated simultaneously using a single filter. To overcome this problem, the concept of multiple particle filtering (MPF), in which an individual filter is used for each agent, has been proposed in the literature. However, due to the necessity of considering agent-to-agent measurements, additional effort is required to derive information on each individual filter from the available likelihoods. This is necessary because the distance and bearing measurements naturally depend on the states of two agents, which, in MPF, are estimated by two separate filters. Because the required likelihood cannot be analytically derived in general, an approximation is needed. To this end, this work extends current state-of-the-art likelihood approximation techniques based on Gaussian approximation under the assumption that the number of agents to be tracked is fixed and known. Moreover, a novel likelihood approximation method is proposed that enables efficient and accurate tracking. The simulations show that the proposed method achieves up to 22% higher accuracy with the same computational complexity as that of existing methods. Thus, efficient and accurate tracking of wireless agents is achieved.
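The key step described above, deriving a per-agent likelihood from a pairwise measurement when the neighbour's state lives in a different filter, can be illustrated with the simplest Gaussian collapse: summarize the neighbour's particle cloud by its moments and evaluate the range likelihood against the mean. This is a minimal sketch of the general idea (moment matching), with assumed names; the paper's proposed approximation is more refined.

```python
import math

def gaussian_summary(particles, weights):
    """Collapse a neighbour's weighted particle cloud into a Gaussian
    summary (per-dimension mean and variance) -- the approximation that
    makes per-agent filtering in MPF tractable."""
    dims = range(len(particles[0]))
    mean = [sum(w * p[d] for p, w in zip(particles, weights)) for d in dims]
    var = [sum(w * (p[d] - mean[d]) ** 2 for p, w in zip(particles, weights))
           for d in dims]
    return mean, var

def distance_likelihood(p_i, neighbour_mean, z, sigma):
    """Unnormalized Gaussian likelihood of a range measurement z between
    particle p_i and a neighbour summarized by its mean position
    (the crudest moment match; ignores the neighbour's spread)."""
    d = math.dist(p_i, neighbour_mean)
    return math.exp(-0.5 * ((z - d) / sigma) ** 2)
```

Each agent's filter weights its own particles with such likelihoods, so agent-to-agent measurements inform the estimate without ever forming the joint state of all agents.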

