initial ensemble
Recently Published Documents

TOTAL DOCUMENTS: 23 (FIVE YEARS: 7)
H-INDEX: 6 (FIVE YEARS: 0)

2021 ◽  
Author(s):  
Giorgio Fighera ◽  
Ernesto Della Rossa ◽  
Patrizia Anastasi ◽  
Mohammed Amr Aly ◽  
Tiziano Diamanti

Abstract Improvements in reservoir simulation computational time, thanks to GPU-based simulators and the increasing computational power of modern HPC systems, are paving the way for massive employment of Ensemble History Matching (EHM) techniques, which are intrinsically parallel. Here we present the results of a comparative study between a newly developed EHM tool that aims to leverage GPU parallelism and a commercial third-party EHM software used as a benchmark. Both are tested on a real case. The reservoir chosen for the comparison has a production history of 3 years with 15 wells comprising oil producers, water injectors, and gas injectors. The EHM algorithm used is the Ensemble Smoother with Multiple Data Assimilation (ESMDA), and both tools have access to the same computational resources. The EHM problem was stated in the same way for both tools. The objective function considers well oil production rates, water cuts, bottom-hole pressures, and gas-oil ratios. Porosity and horizontal permeability are used as 3D grid parameters in the update algorithm, along with nine scalar parameters for anisotropy ratios, Corey exponents, and fault transmissibility multipliers. Both the presented tool and the benchmark obtained a satisfactory history match quality. The benchmark tool took around 11.2 hours to complete, while the proposed tool took only 1.5 hours. The two tools performed similar updates on the scalar parameters, with only minor discrepancies. Updates on the 3D grid properties, however, show significant local differences. The updated ensemble for the benchmark reached extreme values of porosity and permeability, distributed in a heterogeneous way; these distributions are quite unlikely in some model regions given the initial geological characterization of the reservoir. The updated ensemble for the presented tool did not reach extreme values in either porosity or permeability.
The resulting property distributions remain close to those of the initial ensemble, so we can conclude that we were able to successfully update the ensemble while preserving the geological characterization of the reservoir. Analysis suggests that this discrepancy is due to the different way our EHM code considers inactive cells in the grid update calculations compared to the benchmark, highlighting that statistics involving inactive cells must be carefully managed to correctly preserve the geological distribution represented in the initial ensemble. The presented EHM tool was developed from scratch to be fully parallel and to leverage the abundantly available computational resources. Moreover, the ESMDA implementation was tweaked to improve the reservoir update by carefully managing inactive cells. A comparison against a benchmark showed that the proposed EHM tool achieved similar history match quality while improving both the computation time and the geological realism of the updated ensemble.
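The ESMDA update at the core of both tools can be sketched in a few lines of numpy. The following is a minimal, generic single-step ESMDA update (in the standard Emerick-and-Reynolds form), not the authors' GPU implementation; the inactive-cell handling the paper emphasizes is omitted, and all variable names are illustrative.

```python
import numpy as np

def esmda_step(m, d_pred, d_obs, cd, alpha, rng):
    """One ESMDA assimilation step.

    m      : (Nm, Ne) ensemble of model parameters (e.g. porosity, log-perm)
    d_pred : (Nd, Ne) simulated data for each ensemble member
    d_obs  : (Nd,)    observed data
    cd     : (Nd,)    observation-error variances
    alpha  : inflation factor for this step (the 1/alpha values sum to 1)
    """
    ne = m.shape[1]
    dm = m - m.mean(axis=1, keepdims=True)           # parameter anomalies
    dd = d_pred - d_pred.mean(axis=1, keepdims=True) # data anomalies
    c_md = dm @ dd.T / (ne - 1)                      # cross-covariance
    c_dd = dd @ dd.T / (ne - 1)                      # data auto-covariance
    # Perturb observations with inflated noise, one realization per member
    noise = rng.normal(scale=np.sqrt(alpha * cd)[:, None], size=(len(cd), ne))
    d_uc = d_obs[:, None] + noise
    # Kalman-like update: m + C_md (C_dd + alpha*Cd)^-1 (d_uc - d_pred)
    return m + c_md @ np.linalg.solve(c_dd + alpha * np.diag(cd), d_uc - d_pred)
```

Repeating this step with, for example, four constant factors alpha = 4 reproduces the usual multiple-data-assimilation schedule.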


2021 ◽  
Author(s):  
Mohammed Amr Aly ◽  
Patrizia Anastasi ◽  
Giorgio Fighera ◽  
Ernesto Della Rossa

Abstract Ensemble approaches are increasingly used for history matching, even with large-scale models. However, their iterative nature and the high computational resources required demand careful and consistent parameterization of the initial ensemble of models, to avoid repeated and time-consuming attempts before an acceptable match is achieved. The objective of this work is to introduce ensemble-based data analytic techniques to validate the starting ensemble and identify potential parameterization problems early, with significant time savings. These techniques are based on the same definition of the mismatch between the initial ensemble simulation results and the historical data that is used by ensemble algorithms. In fact, a notion of distance among ensemble realizations can be introduced using the mismatch, opening the possibility of using statistical analytic techniques like Multi-Dimensional Scaling and Generalized Sensitivity. In this way a clear and immediate view of ensemble behavior can be quickly explored. Combining these views with advanced correlation analysis, a fast assessment of the ensemble's consistency with observed data and with the physical understanding of the reservoir is then possible. The application of the proposed methodology to real ensemble history matching studies shows that the approach is very effective in identifying whether a specific initial ensemble has an adequate parameterization to start a successful computational loop of data assimilation. Insufficient variability, due to poor capturing of the reservoir performance, can be investigated at both field and well scales by data analytics computations. The information contained in ensemble mismatches of relevant quantities like water breakthrough and gas-oil ratio is then evaluated in a systematic way. The analysis often reveals where and which uncertainties lack the variability needed to explain the historical data. It also allows detecting the role of apparently inconsistent parameters.
In principle it is possible to start the heavy iterative computation even with an initial ensemble for which the analytics tools show potential difficulties and problems. However, experience with large-scale models shows that the possibility of obtaining a good match in these situations is very low, leading to a time-consuming revision of the entire process. On the contrary, if the ensemble is validated, the iterative large-scale computations achieve a good calibration with a consistency that enables predictive ability. As a new interesting feature of the proposed methodology, advanced ensemble data analytics techniques are able to give clues in advance about which parameters could be sources of potential history matching problems. In this way it is possible to revise the uncertainties directly on the initial ensemble, for example by modifying ranges, introducing new parameters, and better tuning other ensemble factors, such as localization and observation tolerances, that control the ultimate match quality.
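The mismatch-based distance and the Multi-Dimensional Scaling view mentioned above can be illustrated with a short numpy sketch. This assumes classical (Torgerson) MDS and a simple Euclidean distance in normalized-mismatch space; the paper's exact metric and tooling are not specified, so all names here are illustrative.

```python
import numpy as np

def mismatch_distance(d_pred, d_obs, sigma):
    """Pairwise distances between ensemble realizations in normalized-mismatch space.

    d_pred : (Nd, Ne) simulated data per realization
    d_obs  : (Nd,)    historical observations
    sigma  : (Nd,)    observation tolerances used for normalization
    """
    r = (d_pred - d_obs[:, None]) / sigma[:, None]   # normalized residuals
    diff = r[:, :, None] - r[:, None, :]             # (Nd, Ne, Ne)
    return np.sqrt((diff ** 2).sum(axis=0))          # (Ne, Ne) distance matrix

def classical_mds(dist, k=2):
    """Project realizations to k dimensions while preserving distances."""
    n = dist.shape[0]
    j = np.eye(n) - np.ones((n, n)) / n              # double-centering matrix
    b = -0.5 * j @ (dist ** 2) @ j
    w, v = np.linalg.eigh(b)
    idx = np.argsort(w)[::-1][:k]                    # top-k eigenpairs
    return v[:, idx] * np.sqrt(np.maximum(w[idx], 0.0))
```

Plotting the two MDS coordinates gives the quick view of ensemble behavior the abstract refers to: clusters or outliers flag realizations whose mismatch pattern differs from the rest.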


2021 ◽  
Author(s):  
Reyko Schachtschneider ◽  
Jan Saynisch-Wagner ◽  
Volker Klemann ◽  
Meike Bagge ◽  
Maik Thomas

Abstract. Glacial isostatic adjustment is largely governed by the rheological properties of the Earth's mantle. Large mass redistributions in the ocean-cryosphere system and the subsequent response of the visco-elastic Earth have led to dramatic sea level changes in the past. This process is ongoing, and in order to understand and predict current and future sea level changes, knowledge of mantle properties such as viscosity is essential. In this study we present a method to obtain estimates of mantle viscosities by assimilation of relative sea level data into a visco-elastic model of the lithosphere and mantle. We set up a particle filter with probabilistic resampling. In an identical twin experiment we show that mantle viscosities can be recovered in a glacial isostatic adjustment model of a simple three-layer Earth structure consisting of an elastic lithosphere and two mantle layers of different viscosity. In two scenarios we investigate the dependence of the ensemble behavior on the ensemble initialization and observation uncertainties, and show that the recovery is successful if the target parameter values are properly sampled by the initial ensemble probability distribution. This even includes cases in which the target viscosity values are located far in the tail of the initial ensemble probability distribution. We then successfully apply the method to two special cases that are relevant for the assimilation of real observations: (1) using observations taken from a single region only, here Laurentide and Fennoscandia, respectively, and (2) using only observations from the last 10 kyr.
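A particle filter with probabilistic resampling, as used in this study, reduces to a short weighted-resampling step. The sketch below assumes a Gaussian observation likelihood and multinomial resampling; it is a generic illustration, not the authors' code, and the variable names are placeholders.

```python
import numpy as np

def particle_filter_step(params, d_pred, d_obs, sigma, rng):
    """One particle-filter update with probabilistic (multinomial) resampling.

    params : (Np, Ne) particle parameter values (e.g. layer viscosities)
    d_pred : (Nd, Ne) model-predicted observations per particle
    d_obs  : (Nd,)    observations (e.g. relative sea level data)
    sigma  : (Nd,)    observation-error standard deviations
    """
    misfit = ((d_pred - d_obs[:, None]) / sigma[:, None]) ** 2
    logw = -0.5 * misfit.sum(axis=0)    # Gaussian log-likelihood per particle
    logw -= logw.max()                  # shift to avoid exp underflow
    w = np.exp(logw)
    w /= w.sum()                        # normalized importance weights
    idx = rng.choice(params.shape[1], size=params.shape[1], p=w)
    return params[:, idx]               # resampled ensemble
```

Particles far from the observations receive negligible weight and are dropped in the resampling, which is why recovery succeeds as long as the target values are sampled, even thinly, by the initial ensemble.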


Author(s):  
Mattia Aleardi ◽  
Alessandro Vinciguerra ◽  
Azadeh Hojat

Abstract Inversion of electrical resistivity tomography (ERT) data is an ill-posed problem that is usually solved through deterministic gradient-based methods. These methods guarantee fast convergence but hinder accurate assessment of model uncertainties. On the contrary, Markov chain Monte Carlo (MCMC) algorithms can be employed for accurate uncertainty appraisals, but they remain a formidable computational task due to the many forward model evaluations needed to converge. We present an alternative approach to ERT inversion that not only provides a best-fitting resistivity model but also gives an estimate of the uncertainties affecting the inverse solution. More specifically, the implemented method aims to provide multiple realizations of the resistivity values in the subsurface by iteratively updating an initial ensemble of models based on the difference between the predicted and measured apparent resistivity pseudosections. The initial ensemble is generated using a geostatistical method under the assumption of log-Gaussian distributed resistivity values and a Gaussian variogram model. A finite-element code constitutes the forward operator that maps the resistivity values onto the associated apparent resistivity pseudosection. The optimization procedure is driven by the ensemble smoother with multiple data assimilation, an iterative ensemble-based algorithm that performs a Bayesian updating step at each iteration. The main advantages of the proposed approach are that it can be applied to nonlinear inverse problems while also providing an ensemble of models from which the uncertainty of the recovered solution can be inferred. The ill-conditioning of the inversion procedure is reduced through a discrete cosine transform reparameterization of both the data and model spaces. The implemented method is first validated on synthetic data and then applied to field data. We also compare the proposed method with a deterministic least-squares inversion and with an MCMC algorithm.
We show that the ensemble-based inversion yields resistivity models and associated uncertainties comparable to those obtained by a much more computationally intensive MCMC sampling.
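The discrete cosine transform reparameterization mentioned above amounts to keeping only low-frequency coefficients of the 2-D model. A minimal numpy sketch, using the orthonormal DCT-II built explicitly as a matrix (the truncation pattern and sizes are illustrative assumptions, not the paper's exact setup):

```python
import numpy as np

def dct_matrix(n):
    """Orthonormal DCT-II matrix; rows are the cosine basis functions."""
    k = np.arange(n)[:, None]
    x = np.arange(n)[None, :]
    d = np.cos(np.pi * (x + 0.5) * k / n) * np.sqrt(2.0 / n)
    d[0] /= np.sqrt(2.0)   # first row scaled so that d @ d.T == identity
    return d

def dct_truncate(model, keep):
    """Keep only the low-frequency keep x keep DCT coefficients of a 2-D model."""
    dr = dct_matrix(model.shape[0])
    dc = dct_matrix(model.shape[1])
    c = dr @ model @ dc.T          # forward 2-D DCT
    c[keep:, :] = 0.0              # discard high-frequency rows
    c[:, keep:] = 0.0              # and columns
    return dr.T @ c @ dc           # inverse 2-D DCT
```

Because smooth resistivity fields concentrate their energy in the low-frequency coefficients, the inversion can update only those `keep * keep` coefficients instead of every grid cell, which reduces the ill-conditioning.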


2021 ◽  
Author(s):  
Antonio Capponi ◽  
Natalie J. Harvey ◽  
Helen F. Dacre ◽  
Keith Beven ◽  
Mike R. James

<p>Volcanic ash poses a significant hazard for aviation. If an ash cloud forms as a result of an eruption, it forces a series of flight planning decisions that weigh important safety and economic factors. These decisions are made using a combination of satellite retrievals and volcanic ash forecasts issued by Volcanic Ash Advisory Centres. However, forecasts of ash hazard remain deterministic and lack quantification of the uncertainty that arises from the estimation of eruption source parameters, from meteorology, and from uncertainties within the dispersion model used to perform the simulations. Quantification of these uncertainties is fundamental and could be achieved by using ensemble simulations. Here, we explore how ensemble-based forecasts, performed using the Met Office dispersion model NAME, together with sequential satellite retrievals of ash column loading, may improve forecast accuracy and uncertainty characterization.</p><p>We have developed a new methodology to evaluate each member of the ensemble based on its agreement with the satellite retrievals available at the time. An initial ensemble is passed through a filter of verification metrics and compared with the first available set of satellite observations. Members far from the observations are rejected. The members within a limit of acceptability are used to resample the parameters used in the initial ensemble and to design a new ensemble to compare with the next available set of satellite observations. The filtering process and parameter resampling are applied whenever new satellite observations become available, creating new ensembles that propagate forward in time until all available observations are covered.</p><p>Although the method requires running many ensemble batches and is not yet suited for operational use, it shows how combining ensemble simulations and sequential satellite retrievals can be used to quantify confidence in ash forecasts. We demonstrate the method by applying it to the recent Raikoke (Kuril Islands, Russia) eruption, which occurred on the 22<sup>nd</sup> June 2019. Each ensemble consists of 1000 members and is evaluated against 6-hourly HIMAWARI satellite ash retrievals.</p>
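The member-rejection step described above can be sketched as a simple limit-of-acceptability filter. The tolerance model below (a fixed band around each retrieval) is an assumption for illustration; the paper's actual verification metrics are more elaborate, and all names are placeholders.

```python
import numpy as np

def filter_members(pred, obs, tol):
    """Indices of ensemble members within the limits of acceptability.

    pred : (Ne, Nobs) predicted ash column loads per member
    obs  : (Nobs,)    satellite-retrieved column loads
    tol  : (Nobs,)    acceptable deviation for each retrieval
    """
    within = np.abs(pred - obs[None, :]) <= tol[None, :]
    return np.nonzero(within.all(axis=1))[0]   # members acceptable at every retrieval
```

The surviving members' source and dispersion parameters would then be resampled to build the next ensemble batch, repeating each time a new set of retrievals arrives.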


Water ◽  
2021 ◽  
Vol 13 (2) ◽  
pp. 122
Author(s):  
Juan Du ◽  
Fei Zheng ◽  
He Zhang ◽  
Jiang Zhu

Based on the multivariate empirical orthogonal function (MEOF) method, a multivariate balanced initial ensemble generation method was applied to an ensemble data assimilation scheme. The initial ensembles were generated with reasonable consideration of the physical relationships between different model variables. The spatial distribution derived from the MEOF analysis is combined with a 3-D random perturbation to generate a balanced initial perturbation field. The Local Ensemble Transform Kalman Filter (LETKF) data assimilation scheme was established for an atmospheric general circulation model. Ensemble data assimilation experiments using the two initial ensemble generation methods, spatially random and MEOF-based balanced, were performed using realistic atmospheric observations. It is shown that the ensembles integrated from the balanced initial ensembles maintain a much more reasonable spread and a more reliable horizontal correlation, relative to the historical model results, than those from the randomly perturbed initial ensembles. The model predictions were also improved by adopting the MEOF-based balanced initial ensembles.
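The idea of MEOF-based balanced perturbations can be illustrated with a small numpy sketch: compute EOFs of a stacked multivariate history via SVD, then draw random amplitudes along the leading modes so that perturbations respect the cross-variable covariance. This is a schematic illustration, not the authors' scheme; the scaling convention and names are assumptions.

```python
import numpy as np

def meof_perturbations(history, n_modes, n_members, scale, rng):
    """Balanced ensemble perturbations from multivariate EOFs.

    history : (Nstate, Nt) stacked, normalized model variables at Nt past times
    Returns (Nstate, n_members) perturbations that lie in the subspace of the
    leading EOF modes, preserving inter-variable relationships.
    """
    anom = history - history.mean(axis=1, keepdims=True)   # remove time mean
    u, s, _ = np.linalg.svd(anom, full_matrices=False)     # u columns = EOFs
    amp = rng.normal(size=(n_modes, n_members))            # random mode amplitudes
    # Scale each mode by its singular value (i.e. by explained variability)
    return scale * u[:, :n_modes] @ (s[:n_modes, None] / np.sqrt(history.shape[1] - 1) * amp)
```

Because every perturbation is a combination of historically observed covariance patterns, a variable pair that co-varies in the history (say, pressure and wind) stays dynamically consistent in the initial ensemble, unlike independent random noise.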


2020 ◽  
Author(s):  
Milija Zupanski

<p>High-dimensional ensemble data assimilation applications require error covariance localization in order to address the problem of insufficient degrees of freedom, typically accomplished using observation-space covariance localization. However, this creates a challenge for vertically integrated observations, such as satellite radiances, aerosol optical depth, etc., since an exact observation location in the vertical does not exist. For nonlinear problems, there is an implied inconsistency in iterative minimization due to using observation-space localization, which effectively prevents finding the optimal global minimizing solution. Using state-space localization, however, in principle resolves both issues associated with observation-space localization.</p><p> </p><p>In this work we present a new nonlinear ensemble data assimilation method that employs covariance localization in state space and finds an optimal analysis solution. The new method resembles “modified ensembles” in the sense that the ensemble size is increased in the analysis, but it differs in the methodology used to create the ensemble modifications, calculate the analysis error covariance, and define the initial ensemble perturbations for data assimilation cycling. From a practical point of view, the new method is considerably more efficient and potentially applicable to realistic high-dimensional data assimilation problems. A distinct characteristic of the new algorithm is that the localized error covariance and the minimization are global, i.e. explicitly defined over all state points. The presentation will focus on examining feasible options for estimating the analysis error covariance and for defining the initial ensemble perturbations.</p>


2018 ◽  
Vol 25 (4) ◽  
pp. 731-746 ◽  
Author(s):  
Sangeetika Ruchi ◽  
Svetlana Dubinkina

Abstract. Over the years, data assimilation methods have been developed to obtain estimates of uncertain model parameters by taking into account a few observations of a model state. The most reliable methods, such as Markov chain Monte Carlo (MCMC), are computationally expensive. Sequential ensemble methods such as ensemble Kalman filters and particle filters provide a favorable alternative. However, the ensemble Kalman filter assumes Gaussianity. The ensemble transform particle filter does not make this assumption and has proven highly beneficial for initial condition estimation and for the estimation of a small number of parameters in chaotic dynamical systems with non-Gaussian distributions. In this paper we employ the ensemble transform particle filter (ETPF) and the ensemble transform Kalman filter (ETKF) for parameter estimation in nonlinear problems with 1, 5, and 2500 uncertain parameters, and compare them to importance sampling (IS). The large number of uncertain parameters is of particular interest for subsurface reservoir modeling, as it allows us to parameterize permeability on the grid. We prove that the updated parameters obtained by the ETPF lie within the range of the initial ensemble, which is not the case for the ETKF. We examine the performance of the ETPF and ETKF in a twin experiment setup, where observations of pressure are synthetically created based on the known values of the parameters. For a small number of uncertain parameters (one and five) the ETPF performs comparably to the ETKF in terms of the mean estimation. For a large number of uncertain parameters (2500) the ETKF is robust with respect to the initial ensemble, while the ETPF is sensitive due to sampling error. Moreover, for the high-dimensional test problem the ETPF gives an increase in the root mean square error after data assimilation is performed.
This is resolved by applying distance-based localization, which, however, deteriorates the posterior estimation of the leading mode by largely increasing its variance. This is due to a combination of less varying localized weights, failure to keep the bounds imposed on the modes via the Karhunen-Loève expansion, and the fact that the main variability is explained by the leading mode. A possible remedy is, instead of applying localization, to use only the leading modes that are well estimated by the ETPF, which demands knowing at which mode to truncate.
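Distance-based localization, as invoked here and in several of the other studies on this page, is commonly implemented with the Gaspari-Cohn fifth-order taper applied as an element-wise (Schur) product to the sample covariance. The sketch below shows that standard construction; it is a generic illustration in 1-D, not this paper's specific mode-space localization.

```python
import numpy as np

def gaspari_cohn(r):
    """Gaspari-Cohn fifth-order piecewise-rational taper.
    r is distance divided by the localization half-width; support ends at r = 2."""
    r = np.abs(np.asarray(r, dtype=float))
    t = np.zeros_like(r)
    a = r <= 1.0
    b = (r > 1.0) & (r < 2.0)
    x = r[a]
    t[a] = -0.25 * x**5 + 0.5 * x**4 + 0.625 * x**3 - (5.0 / 3.0) * x**2 + 1.0
    x = r[b]
    t[b] = (x**5 / 12.0 - 0.5 * x**4 + 0.625 * x**3 + (5.0 / 3.0) * x**2
            - 5.0 * x + 4.0 - 2.0 / (3.0 * x))
    return t

def localize_covariance(cov, coords, radius):
    """Schur product of a sample covariance with the distance-based taper."""
    dist = np.abs(coords[:, None] - coords[None, :])
    return cov * gaspari_cohn(dist / radius)
```

Tapering suppresses the spurious long-range correlations that small ensembles produce, at the cost of the side effects on mode estimates described in the abstract.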


2018 ◽  
Author(s):  
Sangeetika Ruchi ◽  
Svetlana Dubinkina

Abstract. Over the years, data assimilation methods have been developed to obtain estimates of uncertain model parameters by taking into account a few observations of a model state. However, most of these computationally affordable methods assume Gaussianity, e.g. the Ensemble Kalman Filter. The Ensemble Transform Particle Filter does not make the Gaussianity assumption and has proven highly beneficial for initial condition estimation and for the estimation of a small number of parameters in chaotic dynamical systems with non-Gaussian distributions. In this paper we employ the Ensemble Transform Particle Smoother (ETPS) and the Ensemble Transform Kalman Smoother (ETKS) for parameter estimation in nonlinear problems with 1, 5, and 2500 uncertain parameters, and compare them to importance sampling (IS). We prove that the updated parameters obtained by the ETPS lie within the range of the initial ensemble, which is not the case for the ETKS. We examine the performance of the ETPS and ETKS in a twin experiment setup, where observations of pressure are synthetically created based on the known values of the parameters. The numerical experiments demonstrate that the ETKS provides good estimates of the mean parameters but not of the posterior distributions, and the posterior does not improve as the ensemble size increases. The ETPS provides good approximations of the posterior, and as the ensemble size increases the posterior converges to the posterior obtained by IS with a large ensemble. The ETKS is very robust, while the ETPS is very sensitive with respect to the initial ensemble. An increase in the root mean square error after data assimilation in the ETPS for a high-dimensional test problem is resolved by applying distance-based localization, which, however, deteriorates the posterior estimation.


2017 ◽  
Vol 139 (6) ◽  
Author(s):  
Hyungsik Jung ◽  
Honggeun Jo ◽  
Kyungbook Lee ◽  
Jonggeun Choe

The ensemble Kalman filter (EnKF) uses recursive updates for data assimilation and provides dependable uncertainty quantification, but it requires high computing cost. On the contrary, the ensemble smoother (ES) assimilates all available data simultaneously. It is simple and fast, but prone to two key limitations: overshooting and filter divergence. Since channel fields have non-Gaussian distributions, it is challenging to characterize them with conventional ensemble-based history matching methods. In many cases a large number of models must be employed to characterize channel fields, even though this is quite inefficient. This paper presents two novel schemes for characterizing various channel reservoirs. One is a new ensemble ranking method, named the initial ensemble selection scheme (IESS), which selects ensemble members based on the relative errors of well oil production rates (WOPR). The other is covariance localization in ES, which uses the drainage area as a localization function. The proposed method integrates these two schemes: IESS sorts initial models for ES, and the selected models are also utilized to calculate the localization function of ES for fast and reliable channel characterization. For comparison, four different channel fields are analyzed. A standard EnKF, even using 400 models, shows excessively large uncertainties, and the updated permeability fields lose channel continuity. The proposed method, ES with covariance localization assisted by IESS, instead characterizes the channel fields reliably by utilizing the 50 good models selected. It provides suitable uncertainty ranges with correct channel trends. In addition, the simulation time of the proposed method is only about 19% of that required for the standard EnKF.
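The ranking idea behind IESS, selecting the members whose simulated WOPR best matches history, can be sketched in a few lines. The error measure below (mean relative error over the history) is an illustrative assumption; the paper's exact scoring is not reproduced here.

```python
import numpy as np

def select_initial_ensemble(wopr_sim, wopr_obs, n_select):
    """Rank ensemble members by relative WOPR error and keep the best n_select.

    wopr_sim : (Ne, Nt) simulated well oil production rates per member
    wopr_obs : (Nt,)    observed WOPR history
    Returns the indices of the n_select best-matching members.
    """
    rel_err = np.abs(wopr_sim - wopr_obs) / np.maximum(np.abs(wopr_obs), 1e-12)
    score = rel_err.mean(axis=1)          # one scalar score per member
    return np.argsort(score)[:n_select]   # lowest relative error first
```

The selected subset (50 models in the paper's cases) then serves both as the ES starting ensemble and as input for building the drainage-area localization function.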

