Closing the loop - Seismic modelling from reservoir simulation results

Author(s):  
P. Gutteridge ◽  
D. Gawith ◽  
Z. Tang

2021 ◽  
Author(s):  
Mokhles Mezghani ◽  
Mustafa AlIbrahim ◽  
Majdi Baddourah

Abstract Reservoir simulation is a key tool for predicting the dynamic behavior of a reservoir and optimizing its development. Fine-scale, CPU-demanding simulation grids are necessary to improve the accuracy of the simulation results. We propose a hybrid modeling approach that reduces the burden of the full-physics (FP) model by dynamically building and updating an artificial intelligence (AI) based model. The AI model can be used to quickly mimic the FP model. The proposed methodology starts by running the FP model; an associated AI model is then systematically updated using the newly performed FP runs. Once the mismatch between the two models falls below a predefined cutoff, the FP model is switched off and only the AI model is used. The FP model is switched back on at the end of the exercise, either to confirm the AI model's decision and stop the study, or to reject this decision (high mismatch between the FP and AI models) and upgrade the AI model. The proposed workflow was applied to a synthetic reservoir model in which the objective is to match the average reservoir pressure. For this study, to better account for reservoir heterogeneity, a fine-scale simulation grid (approximately 50 million cells) is necessary to improve the accuracy of the reservoir simulation results. Reservoir simulation using the FP model and 1,024 CPUs requires approximately 14 hours. Six parameters were selected to be part of the history-matching optimization loop; therefore, a Latin hypercube sampling (LHS) design of seven FP runs is used to initiate the hybrid approach and build the first AI model. During history matching, only the AI model is used. At the convergence of the optimization loop, a final FP run is performed either to confirm the convergence for the FP model or to re-iterate the same approach starting from an LHS around the converged solution, in which case the subsequent AI model is updated using all the FP simulations performed in the study. This approach achieves the history match with very acceptable quality, but with far fewer computational resources and much less CPU time. CPU-intensive, multimillion-cell simulation models are commonly used in reservoir development, and completing a reservoir study in an acceptable timeframe is a real challenge in such situations. New concepts and techniques are needed to complete such studies successfully, and the hybrid approach we propose shows very promising results in handling this challenge.
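
As a rough illustration of the hybrid loop described in this abstract, the sketch below seeds a proxy with a small Latin hypercube design, history-matches on the proxy alone, and switches the full-physics model back on only to confirm or reject the converged solution. The simulator stand-in (`full_physics_run`), the Gaussian-process surrogate, and the Nelder-Mead optimizer are illustrative assumptions; the abstract does not specify which AI model or optimizer was actually used.

```python
# Hedged sketch of the hybrid FP/AI proxy workflow (assumptions noted above).
import numpy as np
from scipy.stats import qmc
from scipy.optimize import minimize
from sklearn.gaussian_process import GaussianProcessRegressor

N_PARAMS = 6          # history-matching parameters in the optimization loop
N_INITIAL_RUNS = 7    # LHS design used to seed the first AI model
MISMATCH_CUTOFF = 1.0 # acceptable FP-vs-AI disagreement (objective units)

def full_physics_run(x):
    """Placeholder for the CPU-intensive full-physics simulation.
    Returns the history-matching objective, e.g. the misfit between
    simulated and observed average reservoir pressure."""
    return float(np.sum((x - 0.3) ** 2))  # synthetic stand-in

# 1. Seed the AI (proxy) model with a Latin hypercube design of FP runs.
sampler = qmc.LatinHypercube(d=N_PARAMS, seed=42)
X = sampler.random(n=N_INITIAL_RUNS)
y = np.array([full_physics_run(x) for x in X])
proxy = GaussianProcessRegressor(normalize_y=True).fit(X, y)

# 2. History matching uses only the cheap AI model.
def proxy_objective(x):
    return float(proxy.predict(np.atleast_2d(x))[0])

converged = False
while not converged:
    x0 = X[np.argmin(y)]
    res = minimize(proxy_objective, x0, method="Nelder-Mead",
                   bounds=[(0.0, 1.0)] * N_PARAMS)

    # 3. Switch the FP model back on to verify the proxy's solution.
    fp_value = full_physics_run(res.x)
    if abs(fp_value - res.fun) < MISMATCH_CUTOFF:
        converged = True  # FP confirms the AI model's decision
    else:
        # 4. Reject, enrich the training set around the candidate, retrain.
        X_new = qmc.LatinHypercube(d=N_PARAMS).random(N_INITIAL_RUNS)
        X_new = np.clip(res.x + 0.1 * (X_new - 0.5), 0.0, 1.0)
        y_new = np.array([full_physics_run(x) for x in X_new])
        X = np.vstack([X, X_new, res.x])
        y = np.concatenate([y, y_new, [fp_value]])
        proxy = GaussianProcessRegressor(normalize_y=True).fit(X, y)

print("matched parameters:", res.x, "FP misfit:", fp_value)
```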


1992 ◽  
Author(s):  
P.F. Johnston ◽  
G.R. Andersen ◽  
Noboru Wachi ◽  
D.S. Lee ◽  
F.G. Martens ◽  
...  

2018 ◽  
Vol 163 ◽  
pp. 270-282 ◽  
Author(s):  
Yacine Debbabi ◽  
David Stern ◽  
Gary J. Hampson ◽  
Matthew D. Jackson

Author(s):  
Anita Theresa Panjaitan ◽  
Rachmat Sudibjo ◽  
Sri Fenny

Y Field, located approximately 28 km southeast of Jakarta, was discovered in 1989. Three wells have been drilled and suspended. The initial gas in place (IGIP) of the field is 40.53 BSCF, and the field will be developed in 2011. In this study, a reservoir simulation model was built to predict the optimum development strategy for the field. The model consists of 1,575,064 grid cells and was built in a black-oil simulator. Two field development scenarios were defined, each with and without a compressor. Simulation results show that the recovery factor at the end of the contract is 61.40% and 62.14% for Scenarios I and II, respectively, without a compressor. When a compressor is applied, the recovery factors of Scenarios I and II are 68.78% and 74.58%, respectively. Based on the economic parameters, Scenario II with a compressor is the most attractive case, with an IRR of 41%, a POT of 2.9 years, and an NPV of 14,808 MUS$.
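
For readers unfamiliar with the economic indicators quoted above, the short sketch below computes NPV, IRR, and payout time (POT) from a yearly cash-flow series. The cash-flow numbers and discount rate are purely hypothetical placeholders; the abstract reports only the resulting indicators, not the underlying cash flows.

```python
# Minimal sketch of the economic indicators (NPV, IRR, POT); inputs are hypothetical.
import numpy as np

cash_flows = np.array([-30.0, 8.0, 10.0, 12.0, 12.0, 11.0, 9.0])  # MUS$/yr, year 0 = capex

def npv(rate, cf):
    """Discounted sum of the yearly cash flows."""
    years = np.arange(len(cf))
    return float(np.sum(cf / (1.0 + rate) ** years))

def irr(cf, lo=0.0, hi=2.0, tol=1e-6):
    """Internal rate of return found by bisection on the NPV sign change."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if npv(mid, cf) > 0.0:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

def payout_time(cf):
    """First year at which cumulative (undiscounted) cash flow turns positive."""
    return int(np.argmax(np.cumsum(cf) > 0.0))

print(f"NPV@10%: {npv(0.10, cash_flows):.1f} MUS$")
print(f"IRR:     {irr(cash_flows):.1%}")
print(f"POT:     {payout_time(cash_flows)} years")
```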


1986 ◽  
Vol 26 (1) ◽  
pp. 397
Author(s):  
A.B. Kaliszewski

The Hutton reservoir in the Merrimelia Field (Cooper-Eromanga Basin) was the subject of a 3-D reservoir simulation study. The primary objective of the study was to develop a reservoir management tool for evaluating the performance of the field under various depletion options.

The study confirmed that the ultimate oil recovery from this strong water-drive reservoir was not adversely affected by increasing the total fluid offtake rate. However, any decisions regarding changes to the depletion scheme, such as increasing production rates, should be viewed with caution if based solely on computer simulation results. Careful monitoring of any changes to the depletion philosophy and checking of actual data against simulation predictions are essential to ensure that oil production rate and ultimate recovery are optimised.

The model assisted in evaluating the economics of development drilling. While the simulation results are dependent on the validity of the geological mapping, the model was useful in confirming that, due to very high transmissibility in the Hutton reservoir, additional wells would only accelerate production rather than increase ultimate recovery. The issue of drilling wells thus became one of balancing the benefits of accelerating production against the geological risk associated with each well.

Interaction between the reservoir engineer and other disciplines, particularly development geology, is critical in the development and application of a good working simulation model. This was found to be especially important during the history-matching phase of the study. If engineers and development geologists can learn more of each other's discipline and appreciate the role that each has to play in simulation studies, the validity of such models can only be improved.

The paper addresses a number of the pitfalls commonly encountered in the application of reservoir simulation results.


1997 ◽  
Author(s):  
J. W. Watts

Abstract Reservoir simulation is a mature technology, and nearly all major reservoir development decisions are based in some way on simulation results. Despite this maturity, the technology is changing rapidly, and it is important for both providers and users of reservoir simulation software to understand where this change is leading. This paper takes a long-term view of reservoir simulation, describing where it has been and where it is now. It closes with a prediction of what the reservoir simulation state of the art will be in 2007 and speculation regarding certain aspects of simulation in 2017.

Introduction Today, input from reservoir simulation is used in nearly all major reservoir development decisions. This has come about in part through technology improvements that make it easier to simulate reservoirs on the one hand, and possible to simulate them more realistically on the other. Although reservoir simulation has come a long way from its beginnings in the 1950s, substantial further improvement is needed, and this is stimulating continual change in how simulation is performed. Given that this change is occurring, both developers and users of simulation have an interest in understanding where it is leading. Obviously, developers of new simulation capabilities need this understanding in order to keep their products relevant and competitive. However, people who use simulation also need this understanding; how else can they be confident that the organizations providing their simulators are keeping up with advancing technology and moving in the right direction? In order to understand where we are going, it is helpful to know where we have been. Thus, this paper begins with a discussion of historical developments in reservoir simulation. It then briefly describes the current state of the art in terms of how simulation is performed today. Finally, it closes with some general predictions.

