Tutorial: Parallel Computing of Simulation Models for Risk Analysis

Risk Analysis, 2016, Vol 36 (10), pp. 1844-1854
Author(s): Allison C. Reilly, Andrea Staid, Michael Gao, Seth D. Guikema





1981, Vol 24 (4), pp. 190-197
Author(s): Osman Balci, Robert G. Sargent


2015, Vol 6 (2), pp. 82-103
Author(s): Juho Roponen, Ahti Salo

Abstract: Adversarial Risk Analysis (ARA) builds on statistical risk analysis and game theory to analyze decision situations involving two or more intelligent opponents who make decisions under uncertainty. During the past few years, the ARA approach, which is based on explicit modelling of the decision-making processes of a rational opponent, has been applied extensively in areas such as counterterrorism and corporate competition. In the context of military combat modelling, however, ARA has not been used systematically, even though there have been attempts to predict the opponent's decisions based on wargaming, the application of game-theoretic equilibria, and the use of expert judgements. Against this backdrop, we argue that combining ARA with military combat modelling holds promise for enhancing the capabilities of combat modelling tools. We identify ways of combining ARA with combat modelling and give an illustrative example of how ARA can provide insights into a problem where the defender needs to estimate the utility gained from hiding its troop movements from the attacker. Even though the ARA approach can be challenging to apply, it is instructive in that the relevant assumptions about the resources, expectations, and goals that guide the adversary's decisions must be made explicit.
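As a toy illustration of the troop-movement example, the Python sketch below (all routes, payoffs, and observation probabilities are invented for illustration, not taken from the paper) lets the defender compare the expected utility of hiding versus revealing its movements, with the attacker modelled as a rational agent maximising its own utility given its belief about the defender's route.

```python
import numpy as np

# Minimal ARA-style sketch: the defender decides whether to hide troop
# movements; the attacker is modelled as a rational agent who attacks the
# route they believe is undefended. All numbers are hypothetical.

rng = np.random.default_rng(42)
ROUTES = ["north", "south"]

def attacker_choice(believed_defender_route):
    """Attacker attacks where they believe the defender is weakest."""
    # Hypothetical attacker utilities: hitting the undefended route pays off.
    utilities = {r: (1.0 if r != believed_defender_route else -1.0) for r in ROUTES}
    return max(utilities, key=utilities.get)

def defender_expected_utility(defender_route, hide, n_samples=10_000):
    """Monte Carlo estimate of the defender's expected utility.

    If movements are hidden, the attacker's belief is a coin flip; otherwise
    the attacker observes the route correctly with an assumed probability 0.9.
    """
    total = 0.0
    for _ in range(n_samples):
        if hide:
            belief = rng.choice(ROUTES)
        else:
            belief = defender_route if rng.random() < 0.9 else rng.choice(ROUTES)
        attack = attacker_choice(belief)
        # Defender gains when the attack hits the defended route.
        total += 1.0 if attack == defender_route else -1.0
    return total / n_samples

for hide in (False, True):
    eu = defender_expected_utility("north", hide)
    print(f"hide={hide}: defender expected utility ~ {eu:+.3f}")
```

Under these toy assumptions, hiding raises the defender's expected utility from roughly -0.9 to 0, which is exactly the kind of quantity the ARA analysis asks the defender to estimate.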



Author(s): Nasser Alizadeh Tabrizi

Running simulation models is CPU-intensive. In computationally expensive tasks such as parameter screening, sensitivity and risk analysis (uncertainty analysis), and production optimization, it can be useful to build a simple surrogate model (proxy model) that mimics the simulation model with respect to a specific target value (for example, total production), in order to reduce computing time and to study the uncertainties in the reservoir and their impact on production. Artificial Neural Networks (ANNs) are one of the main tools used in machine learning. The quality of an ANN as a proxy model depends on how the experiments used to build and train it are designed. In particular, it is crucial to understand the input parameters so that their dependencies, correlations, and ranges are incorporated in the modelling. A set of simulation runs must be composed that can be used to train the ANN; this task is usually referred to as design of experiments (DOE), and it yields the most informative data sets for training the ANN. In this study, DOE was used to train an ANN for an oil reservoir under a gas injection scenario, and the trained ANN was in turn applied to create production profiles, which were then used for risk analysis. The accuracy of the results indicates that an ANN proxy model, combined with DOE as the sampling method for training, is a fast and reliable tool that can replace the simulator. This dynamic proxy model can be used for risk analysis, production optimization, and production forecasting of oil reservoirs under Enhanced Oil Recovery (EOR) methods.
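The workflow the abstract describes (DOE sampling, ANN training, then Monte Carlo risk analysis on the cheap proxy) can be sketched in Python as follows; the simulator stand-in `run_simulator`, the three input parameters, and their ranges are hypothetical, not taken from the study.

```python
import numpy as np
from scipy.stats import qmc
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

def run_simulator(x):
    """Placeholder for one expensive reservoir-simulator run; returns a
    scalar target such as total production (purely illustrative)."""
    perm, poro, inj_rate = x
    return 1e5 * perm * poro + 5e3 * inj_rate + 1e3 * np.sin(perm * inj_rate)

# 1. DOE: Latin hypercube sampling over the assumed parameter ranges.
lo, hi = [0.1, 0.05, 1.0], [2.0, 0.35, 10.0]  # perm (D), porosity, inj. rate
sampler = qmc.LatinHypercube(d=3, seed=0)
X_train = qmc.scale(sampler.random(n=200), lo, hi)
y_train = np.array([run_simulator(x) for x in X_train])  # the expensive step

# 2. Train the ANN proxy on the DOE runs.
proxy = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=5000, random_state=0),
)
proxy.fit(X_train, y_train)

# 3. Risk analysis: Monte Carlo on the cheap proxy instead of the simulator.
rng = np.random.default_rng(1)
X_mc = rng.uniform(lo, hi, size=(50_000, 3))
production = proxy.predict(X_mc)
# 10th/50th/90th percentiles of the predicted production distribution.
p10, p50, p90 = np.percentile(production, [10, 50, 90])
print(f"P10={p10:,.0f}  P50={p50:,.0f}  P90={p90:,.0f}")
```

The point of the design is that only step 1 pays the simulator's cost (200 runs here); the 50,000-sample uncertainty analysis in step 3 runs against the trained proxy in milliseconds.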



2001, Vol 34 (29), pp. 28-33
Author(s): Konstantin N. Nechval, Nicholas A. Nechval, Edgars K. Vasermanis


2007, Vol 47 (1), p. 199
Author(s): M. Asadullah, P. Behrenbruch, S. Pham

Simulation of petroleum reservoirs is becoming increasingly complex, owing to the growing need to model reservoir heterogeneity for accurate performance prediction. With high oil prices and less easily produced oil, accurate reservoir management tools such as simulation models are in more demand than ever before. The aim is to capture and preserve reservoir heterogeneity when moving from a detailed geocellular model to a flow simulation model, minimising errors when upscaling and preventing excessive numerical dispersion by employing variable and innovative grids as well as improved computational algorithms.

For accurate and efficient simulation of large-scale models there are essentially three choices: upscaling, which averages parameters over several blocks, yielding a coarser model that executes faster; streamline simulation, which uses a more optimal grid combined with a different computational algorithm for increased efficiency; and parallel computing techniques, which exploit superior hardware configurations for efficiency gains. With uncertainty screening of multiple geostatistical realisations and investigation of alternative development scenarios now commonplace for determining reservoir performance, computational efficiency and accuracy in modelling are paramount.

This paper summarises the main techniques and methodologies involved in preparing geocellular models for flow simulation of reservoirs, commenting on the advantages and disadvantages of the various options. Starting with some historical comments, the three modes of simulation are reviewed and examples are given for illustration, including a case history for the Bayu-Undan Field, Timor Sea.
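As a toy illustration of the upscaling option, the sketch below (hypothetical layer permeabilities) collapses a column of fine-grid layers into one coarse block using the standard arithmetic average for flow parallel to the layers and the harmonic average for flow across them; real upscaling workflows are considerably more involved.

```python
import numpy as np

def upscale_block(fine_perm):
    """Collapse a column of fine-grid layer permeabilities into one block.

    Arithmetic averaging bounds the effective permeability for flow parallel
    to the layers; harmonic averaging applies to flow across them, where the
    tightest layer dominates.
    """
    k_parallel = fine_perm.mean()                         # arithmetic mean
    k_across = len(fine_perm) / np.sum(1.0 / fine_perm)   # harmonic mean
    return k_parallel, k_across

fine_perm = np.array([500.0, 50.0, 200.0, 5.0])  # mD, hypothetical layers
k_par, k_acr = upscale_block(fine_perm)
print(f"parallel-to-layers k ~ {k_par:.1f} mD, across-layers k ~ {k_acr:.1f} mD")
```

Even this simple example shows why upscaling can degrade accuracy: the two averages differ by an order of magnitude (about 189 mD versus 18 mD here), so a single coarse-block value cannot represent both flow directions at once.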


