Coupling general physical environmental process models with specific question-driven ecological simulation models

2019 ◽  
Vol 405 ◽  
pp. 102-105 ◽  
Author(s):  
Tomasz E. Koralewski ◽  
John K. Westbrook ◽  
William E. Grant ◽  
Hsiao-Hsuan Wang


2014 ◽  
Vol 6 ◽  
pp. 217584 ◽  
Author(s):  
J. Schilp ◽  
C. Seidel ◽  
H. Krauss ◽  
J. Weirather

Process monitoring and modelling can contribute to fostering the industrial relevance of additive manufacturing. Process-related temperature gradients and thermal inhomogeneities cause residual stresses and distortions and influence the microstructure. Variations in wall thickness can cause heat accumulation; this occurs predominantly in filigree part areas and can be detected by off-axis thermographic monitoring during the manufacturing process. In addition, numerical simulation models on the scale of whole parts enable analysis of temperature fields upstream of the build process. In a microscale domain, modelling of several exposed single hatches allows temperature investigations at high spatial and temporal resolution. In this paper, FEM-based micro- and macroscale modelling approaches as well as an experimental setup for thermographic monitoring are introduced. By discussing and comparing experimental data with simulation results in terms of temperature distributions, both the potential of numerical approaches and the complexity of determining suitable, computationally efficient process models are demonstrated. This paper contributes to the vision of adjusting the transient temperature field during manufacturing in order to improve the resulting part quality through simulation-based process design upstream of the build process and inline process monitoring.
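The transient temperature-field analysis described in this abstract can be illustrated, in a drastically simplified form, by a 1D explicit finite-difference solution of the heat equation. This is a toy stand-in, not the paper's FEM models; the geometry, boundary conditions, and all material values below are invented for illustration:

```python
import numpy as np

def heat_1d(n=50, steps=200, alpha=1e-5, dx=1e-4, dt=1e-4,
            t_laser=1500.0, t_ambient=300.0):
    """Explicit 1D finite-difference solution of the transient heat equation.

    One boundary is held at a hypothetical melt-pool temperature, the other
    at ambient; material properties are uniform and invented.
    """
    r = alpha * dt / dx**2           # explicit scheme is stable for r <= 0.5
    assert r <= 0.5, "explicit scheme unstable for this dt/dx"
    T = np.full(n, t_ambient)
    T[0] = t_laser                   # heat source at one end
    for _ in range(steps):
        T[1:-1] += r * (T[2:] - 2 * T[1:-1] + T[:-2])
        T[0], T[-1] = t_laser, t_ambient
    return T

profile = heat_1d()                  # steady heat flow toward the cold end
```

A thicker wall or a heat accumulation would correspond to changing `alpha` or the boundary conditions; the same marching loop then shows how the temperature gradient flattens or steepens.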


2020 ◽  
Author(s):  
Craig Stow

<p>The historical adoption of Bayesian approaches was limited by two main impediments: 1) the requirement for subjective prior information, and 2) the unavailability of analytical solutions for all but a few simple model forms. However, water quality modeling has always been subjective; selecting point values for model parameters, undertaking some “judicious diddling” to adjust them so that model output more closely matches observed data, and declaring the model to be “reasonable” is a long-standing practice. Water quality modeling in a Bayesian framework can actually reduce this subjectivity, as it provides a rigorous and transparent approach for model parameter estimation. The second impediment, the lack of analytical solutions, has for many applications been largely overcome by the increasing availability of fast, cheap computing and the concurrent evolution of efficient algorithms to sample the posterior distribution. In water quality modeling, however, increasing computational availability may be reinforcing the dichotomy between probabilistic and “process-based” models. When I was a graduate student we couldn’t do both process and probability because computers weren’t fast enough. However, current computers are unimaginably faster and we still rarely do both. It seems that our increasing computational capacity has been absorbed either in more complex and highly resolved, but still deterministic, process models, or in more structurally complex probabilistic models (like hierarchical models) that are still light on process. In principle, Bayes’ Theorem is quite general; any model could constitute the likelihood function, but practically, running Monte Carlo-based methods on simulation models that require hours, days, or even longer to run is not feasible. Developing models that capture the essential (and best understood) processes and still allow a meaningful uncertainty analysis is an area that invites renewed attention.</p>
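The Bayesian parameter estimation this abstract advocates can be sketched with a toy example: a hypothetical first-order decay "process model" whose rate constant is sampled with random-walk Metropolis rather than hand-tuned. All data, parameter names, and values below are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical observations following first-order decay,
# y = y0 * exp(-k t) + noise. The exponential is the "process model";
# the Bayesian layer estimates k transparently instead of by
# "judicious diddling".
t = np.linspace(0.0, 10.0, 20)
k_true, y0, sigma = 0.3, 8.0, 0.2
y_obs = y0 * np.exp(-k_true * t) + rng.normal(0.0, sigma, t.size)

def log_post(k):
    if k <= 0:
        return -np.inf                       # prior: k must be positive
    resid = y_obs - y0 * np.exp(-k * t)
    return -0.5 * np.sum((resid / sigma) ** 2)

# Random-walk Metropolis sampling of the posterior over k
k, lp, samples = 0.5, log_post(0.5), []
for _ in range(5000):
    k_new = k + rng.normal(0.0, 0.05)
    lp_new = log_post(k_new)
    if np.log(rng.uniform()) < lp_new - lp:  # accept with prob min(1, ratio)
        k, lp = k_new, lp_new
    samples.append(k)

k_hat = np.mean(samples[1000:])              # posterior mean after burn-in
```

The same loop becomes infeasible when each `log_post` call requires a simulation run of hours or days, which is exactly the tension the abstract describes.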


2006 ◽  
Vol 519-521 ◽  
pp. 15-24 ◽  
Author(s):  
Jürgen Hirsch

A new approach to improving existing simulation models, developing new ones, and applying them in sequence to simulate the complete production processes of aluminium semi-finished products is described. The development was a joint effort of academic and industrial partners within the framework of the VIR* European projects. It integrated advanced material models with industrial fabrication process models to predict microstructures and properties along the complete production chain of Al sheet and profiles, i.e. DC ingot casting, rolling, and extrusion, to analyse complex interactions of critical process parameters with the corresponding metallurgical mechanisms, and to predict the related material response and properties. The principles are discussed and examples are given of their successful application to simulating industrial fabrication processes.


2008 ◽  
Vol 130 (04) ◽  
pp. 24-26
Author(s):  
Jean Thilmany

This article discusses engineering initiatives to develop a metal-cutting machine that helps program itself and trims manufacturing costs. The smart machine, which was predicted to be ready for its debut by mid-2010, would save manufacturers time and expense by greatly reducing waste and by speeding the machine-cutting process. The smart machine, with the help of its onboard software, will know the best way to make a part. It will generate its own cutting-tool path based on that information and its own tool list. The goal of this smart machine is to use software and hardware to create a machine that constantly monitors itself and conveys vital information to the operator and process planner. Engineers are creating process and structural dynamics simulation models. A process planner would use the models to simulate various scenarios before manufacturing begins. The article also highlights that the overall smart machining challenge now is to develop sensors that will monitor a machine and give vital feedback to the process models that need that information.


2020 ◽  
Author(s):  
Alexander Fengler ◽  
Lakshmi N. Govindarajan ◽  
Tony Chen ◽  
Michael J. Frank

In cognitive neuroscience, computational modeling can formally adjudicate between theories and affords quantitative fits to behavioral/brain data. Pragmatically, however, the space of plausible generative models considered is dramatically limited by the set of models with known likelihood functions. For many models, the lack of a closed-form likelihood typically impedes Bayesian inference methods. As a result, standard models are evaluated for convenience, even when other models might be superior. Likelihood-free methods exist but are limited by their computational cost or their restriction to particular inference scenarios. Here, we propose neural networks that learn approximate likelihoods for arbitrary generative models, allowing fast posterior sampling with only a one-off cost for model simulations that is amortized for future inference. We show that these methods can accurately recover posterior parameter distributions for a variety of neurocognitive process models. We provide code allowing users to deploy these methods for arbitrary hierarchical model instantiations without further training.
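The amortization idea in this abstract can be illustrated in a much-simplified form by swapping the neural network for a histogram surrogate built once from pre-simulated data and then reused for every likelihood evaluation. This is a toy stand-in, not the authors' method; the simulator and all values below are invented:

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate(drift, n=2000):
    """Toy simulator with no closed-form likelihood used here: a random-walk
    accumulator whose first-passage time to a threshold depends on drift
    (loosely inspired by sequential-sampling process models)."""
    steps = rng.normal(drift, 1.0, (n, 500)).cumsum(axis=1)
    hit = (steps >= 10.0).argmax(axis=1).astype(float)
    hit[hit == 0] = 500.0                    # censor non-terminating walks
    return hit

# One-off simulation cost, amortized: pre-simulate per parameter value,
# build a cheap density estimate, then reuse it for posterior evaluation.
grid = np.linspace(0.05, 0.5, 20)
bins = np.arange(0, 501, 10)
surrogate = {}
for d in grid:
    counts, _ = np.histogram(simulate(d), bins=bins)
    surrogate[d] = (counts + 1) / (counts.sum() + len(counts))  # smoothed pmf

def log_like(d, data):
    idx = np.clip(np.digitize(data, bins) - 1, 0, len(bins) - 2)
    return np.log(surrogate[d][idx]).sum()

data = simulate(0.2, n=200)                  # "observed" first-passage times
post = np.array([log_like(d, data) for d in grid])
d_map = grid[post.argmax()]                  # grid MAP estimate of drift
```

A learned network replaces the histogram lookup with a smooth, interpolating function of both parameters and data, which is what makes posterior sampling over continuous parameter spaces fast.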


2021 ◽  
Vol 7 ◽  
pp. e577
Author(s):  
Manuel Camargo ◽  
Marlon Dumas ◽  
Oscar González-Rojas

A generative model is a statistical model capable of generating new data instances from previously observed ones. In the context of business processes, a generative model creates new execution traces from a set of historical traces, also known as an event log. Two types of generative business process models have been developed in previous work: data-driven simulation models and deep learning models. Until now, these two approaches have evolved independently, and their relative performance has not been studied. This paper fills this gap by empirically comparing a data-driven simulation approach with multiple deep learning approaches for building generative business process models. The study sheds light on the relative strengths of these two approaches and raises the prospect of developing hybrid approaches that combine these strengths.
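A data-driven generative process model can be sketched at its simplest as a first-order Markov chain fitted to an event log and sampled for new traces. This is a toy illustration with an invented log, not the paper's simulation or deep learning approaches:

```python
import random
from collections import defaultdict

# Hypothetical event log: each trace is a sequence of activity labels.
log = [
    ["register", "check", "approve", "notify"],
    ["register", "check", "reject", "notify"],
    ["register", "check", "approve", "notify"],
]

# Estimate first-order transition counts from the historical traces.
counts = defaultdict(lambda: defaultdict(int))
for trace in log:
    path = ["<start>"] + trace + ["<end>"]
    for a, b in zip(path, path[1:]):
        counts[a][b] += 1

def sample_trace(rng):
    """Generate a fresh execution trace from the fitted transition model."""
    state, trace = "<start>", []
    while True:
        nxt = counts[state]
        acts, weights = list(nxt), [nxt[a] for a in nxt]
        state = rng.choices(acts, weights=weights)[0]
        if state == "<end>":
            return trace
        trace.append(state)

new_trace = sample_trace(random.Random(7))
```

Real data-driven simulation models additionally fit activity durations, resources, and arrival rates, while deep learning models replace the first-order transition table with a learned sequence model; the generate-from-history contract is the same.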


2020 ◽  
Vol 110 (09) ◽  
pp. 591-596
Author(s):  
Daniella Brovkina ◽  
Oliver Riedel

Virtual commissioning represents an established phase in the life cycle of a modern production system, where the definition of simulation models plays a key role. In the case of assembly lines, layout planning is done in iterations with a low degree of automation, slowing down the engineering phase and the subsequent creation of the virtual commissioning model. In this paper, a concept for a model-based approach to fully automated planning of assembly lines is presented, enabling automated model generation for virtual commissioning by mapping assembly process models to compatible capabilities of the equipment.
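The mapping of assembly process models to compatible equipment capabilities can be sketched as a simple set-containment match: a resource is compatible with a process step if its capability set covers the step's requirements. All step, resource, and capability names below are hypothetical:

```python
# Hypothetical assembly process steps and their required capabilities.
steps = {
    "pick_screw":  {"grip"},
    "drive_screw": {"grip", "rotate"},
    "glue_seam":   {"dispense"},
}

# Hypothetical equipment and the capabilities each resource offers.
resources = {
    "robot_arm": {"grip", "rotate"},
    "dispenser": {"dispense"},
}

def assign(steps, resources):
    """Map each step to the first resource whose capabilities cover it."""
    plan = {}
    for step, needed in steps.items():
        matches = [r for r, caps in resources.items() if needed <= caps]
        plan[step] = matches[0] if matches else None
    return plan

plan = assign(steps, resources)
```

An automated planner would extend this matching with layout, timing, and cost constraints before generating the virtual commissioning model.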

