Simulation Based: Recently Published Documents

2022, Vol. 72, pp. 101117
Julius Meier, Luca Spliethoff, Peter Hesse, Stephan Abele, Alexander Renkl

2022, Vol. 145, pp. 104110
C. Binnersley, S.F. Ashley, P. Chard, A. Lansdell, G. O'Brien

2022, Vol. 18 (2), pp. 1-24
Sourabh Kulkarni, Mario Michael Krell, Seth Nabarro, Csaba Andras Moritz

Epidemiology models are central to understanding and controlling large-scale pandemics. Several epidemiology models require simulation-based inference, such as Approximate Bayesian Computation (ABC), to fit their parameters to observations. ABC inference is highly amenable to efficient hardware acceleration. In this work, we develop parallel ABC inference for a stochastic epidemiology model of COVID-19. The statistical inference framework is implemented and compared on Intel's Xeon CPU, NVIDIA's Tesla V100 GPU, Google's v2 Tensor Processing Unit (TPU), and Graphcore's Mk1 Intelligence Processing Unit (IPU), and the results are discussed in the context of their computational architectures. Results show that TPUs are 3×, GPUs are 4×, and IPUs are 30× faster than Xeon CPUs. Extensive performance analysis indicates that the IPU's advantage over the GPU can be attributed to its higher communication bandwidth, memory located closer to compute, and greater compute power. The proposed framework scales across 16 IPUs, with scaling overhead not exceeding 8% in the experiments performed. We present an example of our framework in practice, performing inference on the epidemiology model for three countries and giving a brief overview of the results.
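The core idea of ABC inference, and why it parallelizes so well, can be sketched in a few lines. The simulator below is a generic stochastic SIR model standing in for the paper's COVID-19 model, and the priors, distance metric, and acceptance quantile are illustrative assumptions, not the authors' choices; what matters is that every simulation draw is independent, so the loop can be farmed out across CPU, GPU, TPU, or IPU cores.

```python
import numpy as np

def simulate_sir(beta, gamma, n_days=60, population=1_000_000, i0=10, seed=None):
    """Stochastic SIR simulator with binomial transitions (illustrative
    stand-in for the paper's epidemiology model). Returns daily new cases."""
    rng = np.random.default_rng(seed)
    s, i, r = population - i0, i0, 0
    daily_cases = []
    for _ in range(n_days):
        p_inf = 1.0 - np.exp(-beta * i / population)   # per-susceptible infection prob.
        new_inf = rng.binomial(s, p_inf)
        new_rec = rng.binomial(i, 1.0 - np.exp(-gamma))
        s -= new_inf
        i += new_inf - new_rec
        r += new_rec
        daily_cases.append(new_inf)
    return np.array(daily_cases)

def abc_rejection(observed, n_sims=2000, eps=None, seed=0):
    """ABC rejection sampling: draw parameters from the prior, simulate,
    and keep draws whose simulated data lie close to the observations.
    Each iteration is independent, which makes ABC embarrassingly parallel."""
    rng = np.random.default_rng(seed)
    betas = rng.uniform(0.05, 1.0, n_sims)    # assumed prior on infection rate
    gammas = rng.uniform(0.02, 0.5, n_sims)   # assumed prior on recovery rate
    dists = np.empty(n_sims)
    for k in range(n_sims):
        sim = simulate_sir(betas[k], gammas[k], n_days=len(observed), seed=k)
        dists[k] = np.sqrt(np.mean((sim - observed) ** 2))  # RMSE distance
    if eps is None:
        eps = np.quantile(dists, 0.05)  # keep the closest 5% of draws
    accept = dists <= eps
    return betas[accept], gammas[accept]  # approximate posterior samples
```

The accepted `(beta, gamma)` pairs approximate the posterior; tightening `eps` or increasing `n_sims` trades accuracy against the simulation budget that the accelerators in the paper are used to supply.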

2022
Marie-Therese Valovska, Gricelda Gomez, Richard Fineman, William Woltmann, Leia Stirling

2022, Vol. 27 (1), pp. 7
Monika Stipsitz, Hèlios Sanchis-Alepuz

Thermal simulations are an important part of the design process in many engineering disciplines. In simulation-based design approaches, a considerable amount of time is spent on repeated simulations, so a fast alternative simulation tool would be a welcome addition to any automated, simulation-based optimisation workflow. In this work, we present a proof-of-concept study of applying convolutional neural networks to accelerate thermal simulations, focusing on the thermal behaviour of electronic systems. The goal of such a tool is to provide accurate approximations of the full solution, so that promising designs can be selected quickly for more detailed investigation. Based on a training set of randomly generated circuits with corresponding finite element solutions, the full 3D steady-state temperature field is estimated using a fully convolutional neural network. A custom network architecture is proposed which captures the long-range correlations present in heat conduction problems. We test the network on a separate dataset and find a mean relative error of around 2% and a typical evaluation time of 35 ms per sample (2 ms for evaluation, 33 ms for data transfer). The benefit of this neural-network-based approach is that, once training is completed, the network can be applied to any system within the design space spanned by the randomized training dataset (which includes different components, material properties, positions of components on a PCB, etc.).
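The kind of solve the network learns to approximate can be illustrated with a minimal sketch. This uses a 2D finite-difference Jacobi iteration rather than the paper's 3D finite element method, and the grid size, conductivity, and boundary condition are illustrative assumptions; the point is that each ground-truth sample is an iterative steady-state solve, which is exactly the cost the trained surrogate replaces with a single forward pass.

```python
import numpy as np

def solve_steady_heat(source, k=1.0, t_boundary=0.0, tol=1e-6, max_iter=20000):
    """Jacobi iteration for the 2D steady-state heat equation
    -k * laplacian(T) = q with fixed-temperature (Dirichlet) boundaries.
    Minimal finite-difference stand-in for a finite element solve; h = 1."""
    t = np.full(source.shape, t_boundary, dtype=float)
    for _ in range(max_iter):
        t_new = t.copy()
        # each interior node: average of four neighbours plus local source term
        t_new[1:-1, 1:-1] = 0.25 * (
            t[:-2, 1:-1] + t[2:, 1:-1] + t[1:-1, :-2] + t[1:-1, 2:]
            + source[1:-1, 1:-1] / k
        )
        if np.max(np.abs(t_new - t)) < tol:   # converged to steady state
            return t_new
        t = t_new
    return t

# A point heat source (e.g. a hot component on a board) in the domain centre:
q = np.zeros((32, 32))
q[16, 16] = 1.0
temperature = solve_steady_heat(q)
```

Even at this toy resolution the iterative solver needs thousands of sweeps, whereas a trained convolutional surrogate would map `q` to `temperature` in one evaluation.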

Tobias Rye Torben, Jon Arne Glomsrud, Tom Arne Pedersen, Ingrid B. Utne, Asgeir J. Sørensen

A methodology for automatic simulation-based testing of control systems for autonomous vessels is proposed. The work is motivated by the need for increased test coverage and formalism in verification efforts. It achieves this by formulating requirements in the formal logic Signal Temporal Logic (STL), which enables automatic evaluation of simulations against requirements using the STL robustness metric, yielding a robustness score for requirement satisfaction. Furthermore, the proposed method uses a Gaussian Process (GP) model to estimate robustness scores, including levels of uncertainty, for untested cases. The GP model is updated by running simulations and observing the resulting robustness, and its estimates are used to automatically guide test case selection toward cases with low robustness or high uncertainty. The main scientific contribution is an automatic testing method which incrementally runs new simulations until the entire parameter space of the case is covered to the desired confidence level, or until a case which falsifies a requirement is identified. The methodology is demonstrated through a case study in which the test object is a Collision Avoidance (CA) system for a small high-speed vessel. STL requirements for safety distance, mission compliance, and COLREG compliance are developed. The proposed method shows promise by both achieving verification in feasible time and identifying falsifying behaviors which would be difficult to detect manually or with brute-force methods. An additional contribution of this work is a formalization of COLREG using temporal logic, which appears to be an interesting direction for future work.
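The STL robustness metric that drives the test selection can be sketched for a safety-distance requirement like the one above. The function names, the simple quantitative semantics, and the example trace are illustrative assumptions, not the authors' toolchain; the key property is that the score is a signed margin, so a negative value directly identifies a falsifying simulation.

```python
import numpy as np

def rob_always_ge(signal, threshold):
    """Robustness of the STL formula G(signal >= threshold), e.g. a
    safety-distance requirement: the worst-case margin over the trace.
    Positive => satisfied with that margin; negative => falsified."""
    return float(np.min(np.asarray(signal, dtype=float) - threshold))

def rob_eventually_ge(signal, threshold):
    """Robustness of F(signal >= threshold), e.g. a mission-compliance
    requirement that a waypoint-proximity signal must eventually exceed
    a level: the best-case margin over the trace."""
    return float(np.max(np.asarray(signal, dtype=float) - threshold))

# Distance (in metres, illustrative) to the nearest obstacle over one
# simulated encounter, checked against a 50 m safety-distance requirement:
distances = [120.0, 80.0, 60.0, 75.0]
margin = rob_always_ge(distances, 50.0)   # worst-case margin: 10.0 m
```

In the proposed method, scores like `margin` (paired against the test-case parameters) are the observations fed to the GP model, which then steers new simulations toward parameter regions with low or uncertain robustness.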
