Simple methods for quick determination of aquifer parameters using slug tests

2016 ◽  
Vol 48 (2) ◽  
pp. 326-339 ◽  
Author(s):  
A. Ufuk Şahin

The slug test remains one of the simplest and most cost-effective methods for determining hydraulic parameters in aquifer analysis. This study introduces two new estimation approaches for the slug test, the time shift method (TSM) and the arc-length matching method (AMM), to identify aquifer parameters reliably and accurately. Both are built on the idea that any change in the normalized drawdown, or in the arc-length measurements of the data curve at predefined drawdown levels, is linked to the variation of storativity. These approaches remove the need to superimpose type curves on the field data, are straightforward to apply, and automate the parameter estimation process. TSM and AMM were tested in a number of numerical experiments, including synthetically generated data augmented with random noise, hypothetical slug tests conducted in a heterogeneous rock-fracture system, and well-known real field data. The skin effect was also included to evaluate its impact on the estimation performance of the suggested approaches. The results verified that both proposed methods estimate the hydraulic parameters more accurately than existing methods and could serve as a viable supplementary interpretation tool for slug test analysis.
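A minimal sketch of the arc-length idea, assuming hypothetical helper names and a precomputed dictionary of model curves for candidate storativity values; it only illustrates how an arc-length comparison can replace visual curve superimposition and is not the paper's actual TSM or AMM procedure:

```python
import numpy as np

def arc_length(t, h):
    """Total arc length of a normalized drawdown curve h(t)."""
    return np.sum(np.hypot(np.diff(t), np.diff(h)))

def match_storativity(t_obs, h_obs, type_curves):
    """Pick the candidate storativity whose model curve has the arc length
    closest to that of the observed (normalized) slug-test response.
    `type_curves` maps a storativity value to (t, h) arrays on a common grid."""
    s_obs = arc_length(t_obs, h_obs)
    return min(type_curves,
               key=lambda S: abs(arc_length(*type_curves[S]) - s_obs))
```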

Fuels ◽  
2021 ◽  
Vol 2 (3) ◽  
pp. 286-303
Author(s):  
Vuong Van Pham ◽  
Ebrahim Fathi ◽  
Fatemeh Belyadi

The success of machine learning (ML) techniques in different industries relies heavily on operator expertise and domain knowledge, which guide the manual choice of an algorithm and the setting of its parameters for a given problem. Because model selection and parameter tuning are manual, the quality of this process cannot be quantified or evaluated, which in turn limits the ability to perform comparison studies between different algorithms. In this study, we propose a new hybrid approach for developing machine learning workflows that supports automated algorithm selection and hyperparameter optimization. The proposed approach provides a robust, reproducible, and unbiased workflow that can be quantified and validated using different scoring metrics. We applied the workflows most commonly used in artificial intelligence (AI) and ML applications to engineering problems, including grid/random search, Bayesian search and optimization, and genetic programming, and compared them with our new hybrid approach, which integrates the Tree-based Pipeline Optimization Tool (TPOT) with Bayesian optimization. The performance of each workflow is quantified using scoring metrics such as the Pearson correlation (R2) and the mean square error (MSE). For this purpose, actual field data from 1567 gas wells in the Marcellus Shale, with 121 features covering reservoir, drilling, completion, stimulation, and operation, were run through the proposed workflows. The new hybrid workflow was then used to evaluate the type well used to assess Marcellus Shale gas production. In conclusion, our automated hybrid approach showed significant improvement over the other workflows on both scoring metrics. It provides a practical tool for automated model and hyperparameter selection, validated on real field data, that can be applied to other engineering problems involving artificial intelligence and machine learning. The hybrid model was also tested on a real field and compared with conventional type wells developed by field engineers. The field's type well is very close to the P50 prediction, which indicates a very successful completion design by the field engineers. The analysis also shows that average field production could have been improved by 8% if shorter cluster spacing and higher proppant loading per cluster had been used during the frac jobs.
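A hedged sketch of such a hybrid workflow, assuming the classic TPOT and scikit-optimize APIs; the random arrays stand in for the 1567-well, 121-feature Marcellus data set, and the gradient-boosting estimator and its search space are placeholders for whatever pipeline TPOT actually selects:

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score, mean_squared_error
from sklearn.ensemble import GradientBoostingRegressor
from tpot import TPOTRegressor           # genetic-programming pipeline search
from skopt import BayesSearchCV          # Bayesian hyperparameter optimization

# Placeholder data in place of the real well features and production targets
X, y = np.random.rand(500, 10), np.random.rand(500)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)

# Step 1: automated algorithm/pipeline selection with TPOT
tpot = TPOTRegressor(generations=2, population_size=10, scoring='r2',
                     random_state=0, verbosity=0)
tpot.fit(X_tr, y_tr)

# Step 2: Bayesian refinement of a candidate estimator's hyperparameters
search = BayesSearchCV(
    GradientBoostingRegressor(random_state=0),
    {'n_estimators': (50, 500),
     'learning_rate': (1e-3, 0.3, 'log-uniform'),
     'max_depth': (2, 8)},
    n_iter=25, cv=3, random_state=0)
search.fit(X_tr, y_tr)

pred = search.predict(X_te)
print('R2 :', r2_score(y_te, pred))
print('MSE:', mean_squared_error(y_te, pred))
```

In practice, the Bayesian search space would be built around the estimator exported by TPOT rather than a fixed model, so the two steps chain into a single automated workflow.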


2021 ◽  
Vol 11 (11) ◽  
pp. 5025
Author(s):  
David González-Peña ◽  
Ignacio García-Ruiz ◽  
Montserrat Díez-Mediavilla ◽  
Mª. Isabel Dieste-Velasco ◽  
Cristina Alonso-Tristán

Prediction of energy production is crucial for the design and installation of PV plants. In this study, five free and commercial software tools for predicting photovoltaic energy production are evaluated: RETScreen, Solar Advisor Model (SAM), PVGIS, PVSyst, and PV*SOL. The evaluation compares monthly and annual predictions of the energy supplied to the national grid with real field data collected from three operating PV plants. The systems, all located in Castile and Leon (Spain), use three different mounting configurations: fixed mounting, horizontal-axis tracking, and dual-axis tracking. The last 12 years of operating data, from 2008 to 2020, are used in the evaluation. Although the commercial software tools were easier to use and allowed the installations to be described in more detail, their results were not appreciably superior. In annual terms, the aggregated figures hid poor estimates throughout the year, with overestimates compensated by underestimates. This is reflected in the monthly results: the software overestimated production during the colder months and produced better estimates during the warmer months. In most cases, the deviation was below 10% when the annual results were analyzed. The accuracy of the software also decreased when the fixed installation was replaced by the more complex dual-axis solar tracking systems.
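The monthly-versus-annual compensation effect can be reproduced with a short sketch; the monthly yields below are hypothetical illustrations, not the paper's data:

```python
import pandas as pd

# Hypothetical monthly energy yields (kWh) for one plant: measured vs. predicted
months = pd.Index(range(1, 13), name="month")
measured  = pd.Series([55, 70, 105, 120, 140, 150, 160, 150, 120, 90, 60, 50], index=months)
predicted = pd.Series([65, 80, 115, 125, 138, 148, 158, 149, 118, 95, 70, 60], index=months)

monthly_dev = 100 * (predicted - measured) / measured                    # % deviation per month
annual_dev  = 100 * (predicted.sum() - measured.sum()) / measured.sum()  # % deviation per year

print(monthly_dev.round(1))
print(f"annual deviation: {annual_dev:.1f} %")  # a small annual error can hide larger monthly ones
```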


Author(s):  
Amitabh Kumar ◽  
Brian McShane ◽  
Mark McQueen

A large oil and gas pipeline gathering system is commonly used to transport processed oil and gas from an offshore platform to an onshore receiving facility. High reliability and integrity in the continuous operation of these systems are crucial to ensure a constant supply of hydrocarbons to the onshore processing facility and, eventually, to market. When such a system is exposed to a series of complex environmental loadings, it is often difficult to predict the response path and in-situ condition, and therefore the system's ability to withstand subsequent loading scenarios. In order to continue operating the pipeline after a significant environmental event, an overall approach needs to be developed to (a) understand the system loading and the associated integrity, (b) develop a series of criteria staging the sequence of actions following an event that will verify the pipeline integrity, and (c) ensure that the integrity management solution is simple and easy to understand so that it can be implemented consistently. For a complex loading scenario, one of the main challenges is predicting the controlling parameter(s) that drive the global integrity of these systems. In such scenarios, the large number of parameters makes technical modeling and prediction arduous. To address them, it is crucial first to understand the baseline environmental data and the other critical design inputs. If the "design environmental baseline" has shifted from its original condition (due to large events such as storms), the dynamics of the system change. Addressing this requires thorough modeling and assessment of the in-situ condition, together with a robust calibration method to predict the future response path and therefore the expected pipeline condition. The study also compares the planned integrity management solutions with field data to validate the predicted scenarios. By feeding real field data back into the modeling method, balanced integrity solutions can be achieved, and quantifying the risks becomes more practical and actionable.


Author(s):  
Todor Ganchev ◽  
Iosif Mporas ◽  
Olaf Jahn ◽  
Klaus Riede ◽  
Karl-L. Schuchmann ◽  
...  

2011 ◽  
pp. 160-187
Author(s):  
Liaquat Hossain ◽  
Mohammad A. Rashid ◽  
Jon David Patrick

Anticipating the use of ERP systems among small-to-medium enterprises (SMEs) to be the future area of growth, ERP vendors such as SAP, Oracle, PeopleSoft, J.D. Edwards, and Baan are introducing ERP software that appeals to the SME market segment. The introduction of ERP systems for SMEs includes compact packages, flexible pricing policies, new implementation methodologies, and more specialized functionalities. The strengths-weaknesses-opportunities-threats (SWOT) framework of the ERP software offered by these vendors for SMEs requires in-depth analysis based on real field data. The aim of this study is to identify the strengths, weaknesses, opportunities, and threats of the ERP systems offered by the five leading vendors for SMEs in Australia. A multiple case study design approach is used to collect primary data from the ERP vendors, and a SWOT framework is developed to study the functionality of the ERP systems these vendors offer. This framework may guide the managers of SMEs in selecting and implementing ERP systems for their organizations.


2011 ◽  
pp. 182-208
Author(s):  
Liaquat Hossain ◽  
Mohammad A. Rashid ◽  
Jon David Patrick

Anticipating the use of ERP systems among small-to-medium enterprises (SMEs) to be the future area of growth, ERP vendors such as SAP, Oracle, PeopleSoft, J.D. Edwards, and Baan are introducing ERP software that appeals to the SME market segment. The introduction of ERP systems for SMEs includes compact packages, flexible pricing policies, new implementation methodologies, and more specialized functionalities. The strengths-weaknesses-opportunities-threats (SWOT) framework of the ERP software offered by these vendors for SMEs requires in-depth analysis based on real field data. The aim of this study is to identify the strengths, weaknesses, opportunities, and threats of the ERP systems offered by the five leading vendors for SMEs in Australia. A multiple case study design approach is used to collect primary data from the ERP vendors, and a SWOT framework is developed to study the functionality of the ERP systems these vendors offer. This framework may guide the managers of SMEs in selecting and implementing ERP systems for their organizations.


Geophysics ◽  
2020 ◽  
Vol 85 (4) ◽  
pp. KS127-KS138 ◽  
Author(s):  
Yujin Liu ◽  
Yue Ma ◽  
Yi Luo

Locating microseismic source positions using seismic energy emitted during hydraulic fracturing is essential for choosing optimal fracking parameters and maximizing the fracturing effects in hydrocarbon exploitation. Interferometric crosscorrelation migration (ICCM) and zero-lag autocorrelation of time-reversal imaging (ATRI) are two important passive seismic source locating approaches that were proposed independently and appear substantially different. We have proven that the two methods are theoretically identical and produce very similar images. Moreover, we have developed a cross-coherence approach that uses normalization by the spectral amplitude of each of the traces, rather than crosscorrelation or deconvolution, to improve the ICCM and ATRI methods. The adopted method enhances the spatial resolution of the source images and is particularly effective in the presence of highly variable and strong additive random noise. Synthetic and field data tests verify the equivalence of the conventional ICCM and ATRI and the equivalence of their improved versions. Compared with crosscorrelation- and deconvolution-based source locating methods, our approach shows higher resolution and better noise immunity in numerical tests using synthetic data with single and multiple sources, as well as field data.
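The cross-coherence normalization described above can be written down directly for a single trace pair; the sketch below (hypothetical function names, numpy only) contrasts it with plain crosscorrelation and leaves aside the ICCM/ATRI migration machinery:

```python
import numpy as np

def cross_coherence(x1, x2, eps=1e-8):
    """Cross-coherence of two traces: the crosscorrelation spectrum normalized
    by the spectral amplitude of each trace (spectrally whitened)."""
    X1, X2 = np.fft.rfft(x1), np.fft.rfft(x2)
    C = X1 * np.conj(X2) / (np.abs(X1) * np.abs(X2) + eps)
    return np.fft.irfft(C, n=len(x1))

def crosscorrelation(x1, x2):
    """Plain frequency-domain crosscorrelation, for comparison."""
    X1, X2 = np.fft.rfft(x1), np.fft.rfft(x2)
    return np.fft.irfft(X1 * np.conj(X2), n=len(x1))
```

Because each frequency component is normalized to unit amplitude, strong narrowband noise on one trace no longer dominates the correlation, which is the source of the resolution and noise-immunity gains reported above.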


Geophysics ◽  
2011 ◽  
Vol 76 (3) ◽  
pp. L1-L10 ◽  
Author(s):  
Majid Beiki ◽  
Laust B. Pedersen ◽  
Hediyeh Nazi

This study has shown that the properties of the gravity gradient tensor also hold for the pseudogravity gradient tensor derived from magnetic field data, assuming that the magnetization direction is known. Eigenvectors of the pseudogravity gradient tensor are used to estimate the depth to the center of mass of geologic bodies. The strike directions of 2D geological structures are estimated from the eigenvectors corresponding to the smallest eigenvalues. For a set of data points enclosed by a square window, a robust least-squares procedure is used to estimate the source point with the smallest sum of squared distances to the lines that pass through the measurement points parallel to the eigenvectors corresponding to the maximum eigenvalues. The dimensionality of the pseudogravity field is defined by the dimensionality indicator I, derived from the tensor components. In the case of quasi-2D sources, a rectangular window is used in the robust least-squares procedure to reduce the uncertainty of the estimates. The method was tested on synthetic models and found to be robust to random noise in the magnetic field data. It was then applied to a pseudogravity gradient tensor derived from total magnetic field data over the Särna area in west-central Sweden. Combined with Euler deconvolution, the method provides useful complementary information for the interpretation of aeromagnetic data.
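A minimal sketch of the eigenvector step at a single station, assuming a hypothetical symmetric tensor; the robust least-squares intersection of lines over a data window and the dimensionality indicator are not shown:

```python
import numpy as np

# Hypothetical symmetric pseudogravity gradient tensor at one measurement point
T = np.array([[ 1.2, -0.3,  0.5],
              [-0.3,  0.8, -0.1],
              [ 0.5, -0.1, -2.0]])

eigvals, eigvecs = np.linalg.eigh(T)   # eigenvalues returned in ascending order
strike_dir = eigvecs[:, 0]             # eigenvector of the smallest eigenvalue: ~strike of a 2D source
source_dir = eigvecs[:, -1]            # eigenvector of the largest eigenvalue: line toward the source
```

For a window of stations, the lines through each station along its `source_dir` are intersected in a least-squares sense to estimate the source point, which is the procedure the abstract describes.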

