Demonstrating Flexibility and Cost-Efficiency of Integrated Ensemble-Based Modeling – One Approach on Three Reservoirs

2021
Author(s):
Abdul Saboor Khan
Salahaldeen Alqallabi
Anish Phade
Arne Skorstad
Faisal Al-Jenaibi
...

Abstract The aim of this study is to demonstrate the value of an integrated ensemble-based modeling approach for multiple reservoirs of varying complexity. Three different carbonate reservoirs with varying challenges are selected to showcase the flexibility of the approach to subsurface teams. Modeling uncertainties are included in both static and dynamic domains, and valuable insights are attained within a short reservoir modeling cycle time. Integrated workflows are established with guidance from multi-disciplinary teams to incorporate the recommended static and dynamic modeling processes in parallel and overcome the modeling challenges of the individual reservoirs. Challenges such as zonal communication, presence of baffles, high-permeability streaks, communication with neighboring fields, water saturation modeling uncertainties, relative permeability with hysteresis, and fluid contact depth shifts are considered when accounting for uncertainties. All the uncertainties in sedimentology, structure and dynamic reservoir parameters are set through common dialogue and collaboration between subsurface teams to ensure that modeling best practices are adhered to. Adaptive pluri-Gaussian simulation is used for facies modeling, and uncertainties are propagated into the dynamic response of the geologically plausible ensembles. These equiprobable models are then history-matched simultaneously using an ensemble-based conditioning tool to match the available observed field production data within a specified tolerance; the reservoirs differ in number of wells, number of grid cells and length of production history. This approach results in a significantly reduced modeling cycle time compared to the traditional approach, regardless of the inherent complexity of the reservoir, while giving better history-matched models that honor the geology and the correlations in the input data. 
These models are built with only the level of detail required by the modeling objectives, leaving more time to extract insights from the ensemble of models. Uncertainties in data from the various domains are not kept isolated but rather propagated throughout, as they may play an important role in another domain or in the total response uncertainty. Similarly, the approach encourages a collaborative effort in reservoir modeling and fosters trust between geoscientists and engineers, ensuring that models remain consistent across all subsurface domains. It allows the flexibility to incorporate modeling practices fit for individual reservoirs. Moreover, analysis of the history-matched ensemble yields added insights into the reservoirs, such as the location and possible extent of features like high-permeability streaks and baffles that were not explicitly modeled initially. Forecast strategies run on these ensembles of equiprobable models capture realistic uncertainties in dynamic responses, which can help make informed reservoir management decisions. The integrated ensemble-based modeling approach is successfully applied to three different reservoir cases with different levels of complexity. The fast-tracked process from model building to decision making enabled rapid insights for all domains involved.
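The abstract does not detail the ensemble-based conditioning tool used for the simultaneous history match. As a rough illustration of the underlying idea, the following is a minimal ensemble-smoother update sketch: a parameter ensemble is conditioned on observed production data via covariances estimated from the ensemble itself. This is a generic textbook method, not the authors' tool, and all names and the linear-Gaussian assumptions are illustrative.

```python
import numpy as np

def ensemble_smoother_update(M, D, d_obs, C_e):
    """One ensemble-smoother update: condition a parameter ensemble M
    (n_params x n_ens) on observed data d_obs, using the simulated
    responses D (n_obs x n_ens) and observation-error covariance C_e."""
    n_ens = M.shape[1]
    # Ensemble anomalies (deviations from the ensemble mean)
    A_m = M - M.mean(axis=1, keepdims=True)
    A_d = D - D.mean(axis=1, keepdims=True)
    # Cross- and auto-covariances estimated from the ensemble
    C_md = A_m @ A_d.T / (n_ens - 1)
    C_dd = A_d @ A_d.T / (n_ens - 1)
    K = C_md @ np.linalg.inv(C_dd + C_e)  # Kalman-type gain
    # Perturb the observations so the updated ensemble keeps its spread
    rng = np.random.default_rng(0)
    D_obs = d_obs[:, None] + rng.multivariate_normal(
        np.zeros(len(d_obs)), C_e, size=n_ens).T
    return M + K @ (D_obs - D)
```

Each column of `M` is one equiprobable realization; the whole ensemble is updated at once, which is what makes the simultaneous match fast compared with matching realizations one by one.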

2021
Author(s):
Salahaldeen Alqallabi
Abdul Saboor Khan
Anish Phade
Mohamed Tarik Gacem
Mustapha Adli
...

Abstract The aim of this study is to demonstrate the value of a fully integrated ensemble-based modeling approach for an onshore field in Abu Dhabi. Model uncertainties are included in both static and dynamic domains, and valuable insights are achieved in a record time of nine weeks with very promising results. Workflows are established to honor the recommended static and dynamic modeling processes suited to the complexity of the field. Realistic sedimentological, structural and dynamic reservoir parameter uncertainties are identified and propagated to obtain realistic variability in the reservoir simulator response. These integrated workflows are used to generate an ensemble of equiprobable reservoir models. All realizations in the ensemble are then history-matched simultaneously before carrying out production predictions using the entire ensemble. Analysis of the updates made during the history-matching process reveals valuable insights into the reservoir, such as the presence of enhanced-permeability streaks. These represent a challenge for the explicit modeling process because of their complex responses on the well log profiles. However, analysis of the history-matched ensemble shows that the location of high-permeability updates generated by the history-matching process is consistent with geological observations of enhanced-permeability streaks in cores and with the sequence stratigraphic framework. Additionally, post-processing of available PLT data as a blind test shows that trends of fluid flow along horizontal wells are well captured, increasing confidence in the geologic consistency of the ensemble of models. This modeling approach provides an ensemble of history-matched reservoir models with an excellent match to observed production data for both the field and individual wells. Furthermore, with the recommended modeling workflows, the generated models are geologically consistent and honor the inherent correlations in the input data. 
Forecasting with this ensemble of models enables realistic uncertainties in dynamic responses to be quantified, providing insights for informed reservoir management decisions and risk mitigation. Analysis of the forecasted ensemble dynamic responses helps evaluate the performance of existing infill targets and delineate new infill targets while understanding the associated risks under both static and dynamic uncertainty. Repeatable workflows allow new data to be incorporated in a robust manner and accelerate the time from model building to decision making.
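Quantifying forecast uncertainty across an ensemble typically reduces to computing percentile profiles over the realizations. A minimal sketch with hypothetical data follows; note that the industry often labels exceedance probabilities "P10/P90" with the opposite convention, so here `Pq` is simply the q-th percentile.

```python
import numpy as np

def forecast_band(forecasts, quantiles=(10, 50, 90)):
    """Summarize an ensemble forecast.

    forecasts: array of shape (n_ens, n_timesteps), one simulated
    production profile per history-matched realization.
    Returns one profile per requested percentile."""
    return {f"P{q}": np.percentile(forecasts, q, axis=0) for q in quantiles}
```

Plotting the P10/P50/P90 profiles against a proposed infill target's expected contribution is one simple way to weigh the associated risk.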


2021
Vol 21 (1)
Author(s):
Nelesh Dhanpat
Chris Schachtebeck

Orientation: This research study focuses on establishing a link between job crafting and landmark studies on intrapreneurship.
Research purpose: The purpose of this study was to provide a theoretical overview of intrapreneurship, intrapreneurial orientation and job crafting, and to explore theoretical linkages between these areas of enquiry.
Motivation for the study: There is currently a dearth of research studies that explore the link between job crafting and intrapreneurial behaviours in existing organisations in the form of intrapreneurial orientation.
Research design, approach and method: The study is presented as a conceptual paper in the form of a qualitative, theoretical study, employing a model-building approach. A deductive research approach is followed, and a narrative review methodology is employed.
Main findings: The findings of this study from a literature search acknowledge the contributions of job crafting and intrapreneurial research within the management sciences, and we remain cognisant of the organisational implications of each, which have, to date, focused on the organisation rather than the individual. With this in mind, we suggest that job crafting and intrapreneurial behaviours be empirically researched to validate the recommendations made.
Practical/managerial implications: This study will help to establish the type of job-crafting interventions and job-crafting strategies needed to promote intrapreneurial behaviours in practice.
Contribution/value-add: This study provides noteworthy insights, including the suggestion that employees with a forward-looking disposition will engage in job crafting with a focus on intrapreneurial behaviour. Furthermore, the study fills a void in the current body of knowledge.


2021
Author(s):
Carlo Cristiano
Marco Pirrone

Risk-mitigation strategies are most effective when the major sources of uncertainty are determined through dedicated, in-depth studies. In the context of reservoir characterization and modeling, petrophysical uncertainty plays a significant role in the risk assessment phase, for instance in the computation of volumetrics. The conventional workflow for propagating petrophysical uncertainty consists of a physics-based model embedded in a Monte Carlo (MC) template. In detail, open-hole logs and their inherent uncertainties are used to estimate the important petrophysical properties (e.g. shale volume, porosity, water saturation), with uncertainty, through the mechanistic model and MC simulations. Model parameter uncertainties can also be considered. This standard approach can be highly time-consuming when the physics-based model is complex, unknown or difficult to reproduce (e.g. old/legacy wells) and/or the number of wells to be processed is very high. In this respect, the aim of this paper is to show how a data-driven methodology can propagate petrophysical uncertainty in a fast and efficient way, speeding up the complete process while remaining consistent with the main outcomes. In detail, a fit-for-purpose Random Forest (RF) algorithm learns from the data how log measurements are related to the important petrophysical parameters. Then, an MC framework is used to infer the petrophysical uncertainty starting from the uncertainty of the input logs, still with the RF model as the driver. The complete methodology was first validated on ad-hoc synthetic case studies and then applied to two real cases where petrophysical uncertainty was required for reservoir modeling purposes. The first includes legacy wells intercepting a very complex lithological environment; the second comprises a sandstone reservoir with a very large number of wells. 
For both scenarios, the standard approach would have taken too long (several months), with no possibility of integrating the results into the reservoir models in time. Hence, for each well the RF regressor was trained and tested on the whole available dataset, yielding a valid data-driven analytics model for formation evaluation. Next, 1,000 scenarios of input logs were generated via MC simulations using multivariate normal distributions. Finally, the RF regressor predicted the associated 1,000 petrophysical characterization scenarios. As the final outcome of the workflow, ad-hoc statistics (e.g. P10, P50, P90 quantiles) were used to summarize the main findings. The complete data-driven approach took a few days for both scenarios, with a decisive impact on the subsequent reservoir modeling activities. This study opens the possibility of quickly processing a high number of wells and, in particular, of effectively propagating petrophysical uncertainty to legacy well data for which conventional approaches are not a time-efficient option.
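The train-then-perturb workflow above can be sketched in a few lines. The logs, the target property, the log-uncertainty covariance and the hyperparameters below are all invented stand-ins (the paper's actual inputs are not given), and the scenario count is reduced from the paper's 1,000 for speed; only the structure of the method is meant to match.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(42)

# --- Synthetic stand-ins for open-hole logs and a target property ---
n_depth = 500
GR = rng.uniform(20, 150, n_depth)        # gamma ray, gAPI (synthetic)
RHOB = rng.uniform(2.0, 2.7, n_depth)     # bulk density, g/cc (synthetic)
PHI = np.clip((2.71 - RHOB) / (2.71 - 1.0), 0, None)  # "true" porosity proxy
logs = np.column_stack([GR, RHOB])

# 1) Train the data-driven proxy of the petrophysical model
rf = RandomForestRegressor(n_estimators=50, random_state=0).fit(logs, PHI)

# 2) Monte Carlo: perturb the input logs with a multivariate normal
#    (assumed log uncertainties: 5 gAPI on GR, 0.02 g/cc on RHOB, uncorrelated;
#     200 scenarios here, 1,000 in the paper)
n_mc = 200
cov = np.diag([5.0**2, 0.02**2])
phi_scenarios = np.empty((n_mc, n_depth))
for i in range(n_mc):
    noisy = logs + rng.multivariate_normal(np.zeros(2), cov, size=n_depth)
    phi_scenarios[i] = rf.predict(noisy)

# 3) Summarize the propagated uncertainty per depth
p10, p50, p90 = np.percentile(phi_scenarios, [10, 50, 90], axis=0)
```

Because the RF proxy evaluates in milliseconds, the MC loop is cheap regardless of how complex (or unknown) the original physics-based model was, which is the source of the speed-up the paper reports.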


Author(s):
Yuan-Shyi Peter Chiu
Huei-Hsin Chang
Tiffany Chiu
Singa Wang Chiu

Variety, quality, and rapid response are becoming the trend in customer requirements in contemporary competitive markets. Thus, an increasing number of manufacturers frequently seek alternatives, such as redesigning their fabrication scheme and outsourcing strategy, to meet clients' expectations effectively with minimum operating costs and limited in-house capacity. Inspired by the potential benefits of delay differentiation, outsourcing, and quality assurance policies in multi-item production planning, this study explores a single-machine, two-stage, multi-item batch fabrication problem featuring the abovementioned characteristics. Stage one fabricates all the required common parts, and stage two manufactures the end products. A predetermined portion of the common parts is supplied by an external contractor to reduce the uptime of stage one. Both stages have imperfect in-house production processes. The defective items produced are identified and either reworked or removed to ensure the quality of the finished batch. We develop a model to depict the problem explicitly. Modeling, formulation, derivation, and optimization methods assist us in deriving a cost-minimized cycle time solution. Moreover, the proposed model can analyze and expose the diverse features of the problem to support managerial decision-making, for example the individual/collective influence of postponement, outsourcing, and quality assurance policies on the optimal cycle time solution, utilization, uptime of each stage, total system cost, and individual cost contributors.
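The paper's full model (two stages, outsourcing, rework) is not reproduced in the abstract. For intuition only, here is the classic common-cycle approximation for multi-item batch production on a single machine, which is the simplest ancestor of such cost-minimized cycle-time solutions: T* = sqrt(2·ΣK_i / Σ h_i·D_i·(1 − D_i/P_i)). The item data below are invented.

```python
import math

def common_cycle_time(items):
    """Classic common-cycle approximation for multi-item batch production
    on one machine (no outsourcing or rework, unlike the paper's model).
    items: dicts with setup cost K ($/setup), holding cost h ($/unit/yr),
    demand D (units/yr) and production rate P (units/yr)."""
    setup = sum(it["K"] for it in items)
    hold = sum(it["h"] * it["D"] * (1 - it["D"] / it["P"]) for it in items)
    return math.sqrt(2 * setup / hold)

# Hypothetical two-item example
items = [
    {"K": 200.0, "h": 2.0, "D": 1000.0, "P": 4000.0},
    {"K": 100.0, "h": 1.0, "D": 500.0, "P": 2000.0},
]
T_star = common_cycle_time(items)  # cycle length in years
```

The paper's contribution is to extend exactly this kind of closed-form trade-off (setup cost vs. holding cost) with postponement, outsourced common parts and quality-assurance terms.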


SPE Journal
2017
Vol 22 (05)
pp. 1402-1415
Author(s):
A. H. Al Ayesh
R. Salazar
R. Farajzadeh
S. Vincent-Bonnieu
W. R. Rossen

Summary Foam can divert flow from higher- to lower-permeability layers and thereby improve the injection profile in gas-injection enhanced oil recovery (EOR). This paper compares two methods of foam injection, surfactant-alternating-gas (SAG) and coinjection of gas and surfactant solution, in their abilities to improve injection profiles in heterogeneous reservoirs. We examine the effects of these two injection methods on diversion by use of fractional-flow modeling. The foam-model parameters for four sandstone formations ranging in permeability from 6 to 1,900 md presented by Kapetas et al. (2015) are used to represent a hypothetical reservoir containing four noncommunicating layers. Permeability affects both the mobility reduction of wet foam in the low-quality-foam regime and the limiting capillary pressure at which foam collapses. The effectiveness of diversion varies greatly with the injection method. In a SAG process, diversion of the first slug of gas depends on foam behavior at very high foam quality. Mobility in the foam bank during gas injection depends on the nature of a shock front that bypasses most foam qualities usually studied in the laboratory. The foam with the lowest mobility at fixed foam quality does not necessarily give the lowest mobility in a SAG process. In particular, diversion in SAG depends on how and whether foam collapses at low water saturation; this property varies greatly among the foams reported by Kapetas et al. (2015). Moreover, diversion depends on the size of the surfactant slug received by each layer before gas injection. This favors diversion away from high-permeability layers, which receive a large surfactant slug. However, there is an optimum surfactant-slug size: with too little surfactant, diversion from high-permeability layers is not effective, whereas with too much, mobility is reduced in the low-permeability layers. 
For a SAG process, injectivity and diversion depend critically on whether foam collapses completely at irreducible water saturation. In addition, we show the diversion expected in a foam-injection process as a function of foam quality. The faster propagation of surfactant and foam in the higher-permeability layers aids in diversion, as expected. This depends on foam quality and non-Newtonian foam mobility and varies with injection time. Injectivity is extremely poor with foam injection for these extremely strong foams, but for some SAG foam processes with effective diversion it is better than injectivity in a waterflood.
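The shock-front construction underlying this fractional-flow analysis is the classic Welge tangent. A minimal sketch follows for a plain water-gas displacement with Corey-type relative permeabilities; the parameters are illustrative and the foam mobility reduction of Kapetas et al. (2015) is deliberately not reproduced, only the geometric construction is shown.

```python
import numpy as np

# Corey-type relative permeabilities (illustrative parameters only)
SWC, SGR = 0.2, 0.1       # connate water, residual gas saturations
MU_W, MU_G = 1.0, 0.02    # water and gas viscosities, cp

def fw(sw):
    """Water fractional flow for a water-gas displacement."""
    s = np.clip((sw - SWC) / (1 - SWC - SGR), 0, 1)
    krw, krg = s**4, (1 - s)**2
    lw, lg = krw / MU_W, krg / MU_G   # phase mobilities
    return lw / (lw + lg)

# Welge construction: the shock saturation maximizes the chord slope
# (fw(Sw) - fw(Swi)) / (Sw - Swi) drawn from the initial saturation Swi
swi = SWC
sw = np.linspace(SWC + 1e-4, 1 - SGR, 2000)
slope = (fw(sw) - fw(swi)) / (sw - swi)
sw_shock = sw[np.argmax(slope)]
shock_speed = slope.max()   # dimensionless front velocity dx_D/dt_D
```

In the paper's SAG analysis the same tangent is drawn on a foam-modified fractional-flow curve at very low water saturation, which is why the shock "bypasses" the foam qualities usually measured in the laboratory.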


2004
Vol 22 (1)
pp. 34-39
Author(s):
Adrian White

Everyone wants safe medicine. The traditional approach to adverse events has developed within a culture of blaming the individual practitioner. Such an approach is likely to be damaging to individuals and possibly counterproductive by creating an atmosphere of defensiveness and denial. Industries such as airlines have developed an alternative culture using a systems approach. This approach concentrates on assessing and improving the systems of working rather than blaming an individual's performance. Frameworks have been developed for applying this approach to investigating and avoiding medical accidents. These form the basis of a check-list for acupuncture practice that is presented here, and may be useful for individuals and organisations who are concerned to reduce the risk of adverse events.


2020
Author(s):
Christian Huebscher
Jonas Preine

Located on the Hellenic Volcanic Arc, the Christiana-Santorini-Kolumbo (CSK) marine volcanic zone is notorious for catastrophic volcanic eruptions, earthquakes and tsunamis. Here, not only did the so-called "Minoan" eruption, one of the largest volcanic eruptions in human history, take place in the Late Bronze Age 3600 years ago, but also the largest shallow earthquake of the 20th century in Europe, of magnitude 7.4, in 1956. Although the region is heavily populated and fully developed for tourism, the acting tectonic forces are not fully understood to this day, complicating the necessary assessment of geohazards.

Recent bathymetric and seismic studies revealed that the CSK zone comprises a system of neotectonic horst and graben structures with extensive internal faulting, thought to be the result of ongoing extension in the southern Aegean. The NE-SW alignment of volcanic edifices within the CSK underlines the tectonic control of volcanism in this area. In this study, we show how advanced reprocessing of selected seismic lines leads to significantly improved seismic images revealing new details of the complex rift system. Moreover, using a unique diffraction-based approach for velocity model building, we perform pre-stack depth migration (PSDM) and present, for the first time, depth-converted seismic sections from the CSK zone. This allows proper estimation of fault angles and sedimentary thicknesses, and enables structural restoration to reconstruct and measure the amount of extension in the individual rift basins. We revise the previous seismostratigraphic scheme and propose a new correlation between the horst and graben units.

Structural restoration indicates an extension of approx. 3 km along the Santorini-Anafi basin, while PSDM indicates a maximum sedimentary thickness of 1500 m. According to the new stratigraphic model, we infer a four-stage evolution of this basin in which early marine deposition, syn-rift deposition, complex infill deposition and neotectonic syn-rift deposition are distinguished. Moreover, we identify negative flower structures within the basin centre, indicating the presence of a strike-slip component that superimposes the dominant NW-SE directed extension. Based on these findings, we are confident that applying the proposed workflow to the complete regional dataset will significantly improve the understanding of the relationship between tectonics and volcanism in the CSK zone and, consequently, lead to an improved risk assessment of the central Aegean Sea.


2018
Author(s):
Joy Merwin Monteiro
Jeremy McGibbon
Rodrigo Caballero

Abstract. sympl (System for Modelling Planets) and climt (Climate Modelling and diagnostics Toolkit) represent an attempt to rethink climate modelling frameworks from the ground up. The aim is to use the expressive data structures available in the scientific Python ecosystem, along with best practices in software design, to build models that are self-documenting, highly interoperable, and that provide fine-grained control over model components and behaviour. We believe that such an approach to building models is essential to allow scientists to easily and reliably combine model components to represent the climate system at a desired level of complexity, and to enable users to fully understand what the model is doing. sympl is a framework which formulates the model in terms of a "state" which is evolved forward in time by TimeStepper and Implicit components and which can be modified by Diagnostic components. TimeStepper components in turn rely on Prognostic components to compute tendencies. Components contain all the information about the kinds of inputs they expect and the outputs they provide. Components can be used interchangeably, even when they rely on different units or array configurations. sympl provides basic functions and objects which could be used by any type of Earth system model. climt is an Earth system modelling toolkit that contains scientific components built on the sympl base objects. Components can be written in any language accessible from Python, and Fortran/C libraries are accessed via Cython. climt aims to provide different user APIs that trade off simplicity of use against flexibility of model building, thus appealing to a wide audience. Model building, configuration and execution are carried out through a Python script (or Jupyter Notebook), enabling researchers to build an end-to-end Python-based pipeline with popular Python-based data analysis tools. 
Because of the modularity of the individual components, using online data analysis, visualisation or assimilation algorithms and tools with sympl/climt components is extremely simple.
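The state/Prognostic/TimeStepper pattern the abstract describes can be illustrated with a deliberately tiny toy. The class names below mirror the concepts only; this is not sympl's actual API, and the relaxation-to-300-K physics is an invented example.

```python
from datetime import timedelta

class Prognostic:
    """Returns tendencies d(state)/dt given the current state."""
    def __call__(self, state):
        # Toy physics: Newtonian relaxation of temperature toward 300 K
        # with a one-day timescale (86400 s). Purely illustrative.
        return {"temperature": (300.0 - state["temperature"]) / 86400.0}

class TimeStepper:
    """Advances the state using tendencies from its Prognostic components."""
    def __init__(self, prognostics):
        self.prognostics = prognostics

    def __call__(self, state, timestep):
        new_state = dict(state)
        for prog in self.prognostics:
            for name, tendency in prog(state).items():
                # Forward-Euler update: x += (dx/dt) * dt
                new_state[name] += tendency * timestep.total_seconds()
        return new_state

# Build, configure and run the "model" from a plain Python script,
# as the abstract describes for sympl/climt
stepper = TimeStepper([Prognostic()])
state = {"temperature": 250.0}
for _ in range(24):
    state = stepper(state, timedelta(hours=1))
```

Because the stepper only consumes whatever tendencies its components declare, swapping in a different Prognostic (or inspecting the state between steps for online analysis) requires no change to the time-stepping code, which is the modularity point the abstract makes.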

