Downhole Sensors
Recently Published Documents

TOTAL DOCUMENTS: 24 (FIVE YEARS: 7)
H-INDEX: 4 (FIVE YEARS: 1)

2021 ◽  
Author(s):  
Bogi Haryo Nugroho ◽  
Brahmantiyo Aji Sumarto ◽  
Muhammad Arief Joenaedy ◽  
Huda Jassim Al-Aradi ◽  
Pajar Rahman Achmad

Abstract Objective/scope Analyzing and estimating a reliable water cut has been a challenge. Current well test data do not provide enough information to predict rate and water cut behavior, and water cut levels can be estimated appropriately only for wells with stable, well-behaved performance. Wells equipped with Electrical Submersible Pump (ESP) sensors, whose readings are acquired in real time, help to fill this gap. The data are stored and available in KOC data repositories such as the Corporate Database, the Well Surveillance Management System (WSMS), and the Artificial Lift Management System (ALMS). Engineers currently perform this work in spreadsheets while juggling multiple data repositories. Spreadsheets are adequate for combining the data into a simple data set and presenting it, but they do not address a number of important tasks in a typical analyst's pipeline, and their design frequently complicates the analysis. A single-well analysis may take hours and a multi-well analysis days, which can be too late to plan and take preventive action. Given this situation, a collaboration was established between NFD-North Kuwait and the Information Management Team. In this first phase, the initiative designs a conceptual integrated preventive system that provides an easy and quick tool to estimate water cut from well tests and downhole sensor data using a data science approach. Method, procedure, process Five steps were applied in this initial work, including but not limited to user interviews, exercises, and data dissemination, together with gathering full domain knowledge and defining the goal. Pain points were mapped to solutions to identify the technical challenges and find ways to overcome them. At the end of this stage, a data and process review was conducted and applied to a simple example to understand the requirements, demonstrate technical functionality, and verify technical feasibility. A conceptual design was then built from the requirements, features, and solutions gathered. The recommended integrated system includes an intermediate layer for integration, data retrieval, running calculation-heavy processes in the background, model optimization, visual analytics, decision-making, and automation. A roadmap with complete planning of the different phases is then provided to achieve the objective. Results, observations, conclusions The process, functionalities, requirements, and findings have been examined and elaborated. The conceptual design confirmed that ESP sensor data can help estimate continuous well water cut behavior, and the next implementation phase of data science is expected to raise the confidence level of the results further. The design is promising for providing seamless, scalable, and easy-to-deploy automation for the data analytics workflow, with several major business benefits. The proposed solution combines technologies, implementation services, and project management, with the technology components distributed across three layers: source data, a data science layer, and a visual analytics layer. Furthermore, a project roadmap with recommendations for each phase is also included. Novel/additive information Data science for exploration and production is a new area in which research and development will be required.
A data-science-driven approach and the application of digital transformation enable an integrated preventive system that computes water cut estimates from well tests and downhole sensor data. In the next, larger-scale implementation, this system is expected to provide an automated workflow supporting engineers in their daily tasks through a Data to Decision (D2D) approach. Machine learning is a data analytics technique that teaches computers to do what comes naturally to humans: learn from experience. Machine learning algorithms use computational methods to learn information from the data without relying on a predetermined equation as a model. Adding artificial intelligence and machine learning capability to the process requires knowledge of the input data, the impact of the data on the output, an understanding of the machine learning algorithm, and the building of the model required to meet the expected output.
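As a loose illustration of the kind of data-science workflow this abstract describes, the sketch below trains a regression model that maps real-time ESP sensor readings to the water cut measured at periodic well tests, then applies it to the continuous sensor stream. The column names, data layout, and choice of a random-forest regressor are assumptions for illustration only, not the authors' implementation.

```python
# Minimal sketch (not the authors' system): estimate continuous water cut
# from ESP downhole sensor data, calibrated against periodic well tests.
# Feature names and the model choice are hypothetical.
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import train_test_split

FEATURES = ["intake_pressure", "discharge_pressure", "motor_temp",
            "motor_current", "vibration", "frequency"]

def train_water_cut_model(well_tests: pd.DataFrame) -> RandomForestRegressor:
    """Fit sensor readings taken at well-test times to the measured water cut."""
    X, y = well_tests[FEATURES], well_tests["water_cut"]
    X_train, X_val, y_train, y_val = train_test_split(
        X, y, test_size=0.2, random_state=0)
    model = RandomForestRegressor(n_estimators=200, random_state=0)
    model.fit(X_train, y_train)
    print("validation MAE:", mean_absolute_error(y_val, model.predict(X_val)))
    return model

def estimate_continuous_water_cut(model, esp_stream: pd.DataFrame) -> pd.Series:
    """Apply the calibrated model to the full real-time ESP sensor stream."""
    return pd.Series(model.predict(esp_stream[FEATURES]),
                     index=esp_stream.index, name="water_cut_estimate")
```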


2021 ◽  
Author(s):  
Livia Zihlmann ◽  
Mike Parker ◽  
Luke Malsam

Abstract Downhole sensors gather data vital to the health of an ESP system. The sensor readings not only help indicate the flow pattern; they also help reveal issues such as plugging and degradation of the ESP system. Once a system has grounded out on a single phase, the sensor readings are lost, and operators must rely on current and frequency to keep the system operating efficiently. In unconventional ESP applications, operators see only a small difference between no-load, no-flow, and gas-locking conditions. This small difference is due to the de-rating of the motors used to lift the fluid to surface in these severe applications. When the sensor readings are lost, operators are no longer able to accurately diagnose the reason for a shutdown. Adding Tubing Temperature Transducers (TTTs) restores an indication of motor temperature along with the load on the system. A drop in tubing temperature indicates that the system is not lifting as much fluid to surface, pointing to either gas locking or a no-load condition, both of which heat the downhole system, particularly the motor. All of these scenarios degrade the ESP equipment and can cause premature failure. If the system is equipped with TTTs, operators can shut in the well to avoid extended periods of excessive heating caused by either gas locking or no-flow conditions. Single-phase-to-ground conditions occur frequently; however, this paper does not address their root cause, but rather what the operator can do to operate efficiently once a unit has grounded out on a single phase.
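As a rough sketch of the kind of protective logic this abstract implies (not the authors' controller), the snippet below flags a sustained drop in tubing temperature, which the paper associates with gas locking or no-flow, and recommends a shut-in. The threshold, window length, and data interface are assumed values chosen only for illustration.

```python
# Minimal sketch (assumed thresholds and interface): recommend a shut-in when
# tubing temperature drops persistently, a sign of gas locking or no-flow.
from collections import deque
from statistics import mean

class TubingTempMonitor:
    def __init__(self, baseline_window=60, drop_threshold_degF=15.0):
        self.history = deque(maxlen=baseline_window)  # recent TTT readings
        self.drop_threshold = drop_threshold_degF

    def update(self, tubing_temp_degF: float) -> bool:
        """Return True if a shut-in should be recommended for this reading."""
        if len(self.history) == self.history.maxlen:
            baseline = mean(self.history)
            if baseline - tubing_temp_degF > self.drop_threshold:
                return True  # temperature fell well below the recent baseline
        self.history.append(tubing_temp_degF)
        return False

# Example usage with a synthetic stream of readings:
monitor = TubingTempMonitor()
for reading in [180.0] * 60 + [160.0]:
    if monitor.update(reading):
        print("Sustained tubing-temperature drop: consider shutting in the well.")
```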


2021 ◽  
Author(s):  
Jimmy Price ◽  
Chris Jones ◽  
Bin Dai ◽  
Darren Gascooke ◽  
Michael Myrick

Abstract Digital fluid sampling is a technique that uses downhole sensors to measure formation fluid properties without collecting a physical sample. Unfortunately, the sensors are prone to drift over time due to the harsh downhole environmental conditions, so constant sensor evaluation and calibration are required to ensure the quality of the analysis. A new technique uses a virtual sensor as a digital twin, providing a calibration that can be applied to the physical twin. Digital twin technology enables the end user to operate and collaborate remotely, rapidly simulate different scenarios, and improve accuracy via enhanced, up-to-date calibrations. With respect to downhole fluid identification, the contribution of harsh environmental conditions and sensor drift can also be mitigated by realizing a virtual implementation of the fluid behavior and the individual sensor components. Historically, the virtual behavior of a digital twin has been constructed by a combination of complex multi-physics and empirical modeling. More recently, access to large datasets and historical results has enabled the use of machine learning neural networks to successfully create digital twin sensors. In this paper, we explore the efficacy of constructing a digital twin for a single downhole optical fluid identification sensor using both a nonlinear machine learning neural network and a complex multi-physics-based modeling approach. The advantages of, and lessons learned from, each method are discussed in detail. We have found a hybrid approach to be most effective in constraining the problem and preventing over-fitting while also yielding a more accurate calibration. In addition, the new hybrid digital twin evaluation and calibration method is extended to encompass an entire fleet of similar downhole sensors simultaneously. Digital twin technology is not new to the petroleum industry, yet there is significant room for improvement in identifying how it can best be implemented to decrease costs and improve reliability. This paper examines the two methods that scientists and engineers employ to enable digital twin technology and ultimately finds that a hybrid of machine learning and empirical physics-based modeling prevails.
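To illustrate one common way of building such a hybrid digital twin, offered here as an assumption rather than the authors' method, the sketch below corrects a placeholder physics-based forward model of the optical sensor response with a small neural network trained on the residual between the physics prediction and measured data. The physics function, feature set, and synthetic data are illustrative only.

```python
# Minimal hybrid-digital-twin sketch (not the authors' implementation):
# a placeholder physics model predicts the sensor response, and a small
# neural network learns the residual (drift / unmodeled effects).
import numpy as np
from sklearn.neural_network import MLPRegressor

def physics_model(pressure_psi, temperature_degF):
    """Placeholder multi-physics response; a real twin would be far richer."""
    return 0.8 + 1e-5 * pressure_psi - 2e-4 * temperature_degF

rng = np.random.default_rng(0)
X = np.column_stack([rng.uniform(2000, 15000, 500),    # pressure, psi
                     rng.uniform(150, 350, 500)])      # temperature, degF
measured = physics_model(X[:, 0], X[:, 1]) + 0.01 * np.sin(X[:, 1] / 40)  # synthetic data

# Train the corrector on (measurement - physics prediction).
residual = measured - physics_model(X[:, 0], X[:, 1])
corrector = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000,
                         random_state=0).fit(X, residual)

def hybrid_twin(pressure_psi, temperature_degF):
    """Physics baseline plus learned correction = calibrated virtual sensor."""
    phys = physics_model(pressure_psi, temperature_degF)
    corr = corrector.predict([[pressure_psi, temperature_degF]])[0]
    return phys + corr

print(hybrid_twin(8000.0, 250.0))
```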


2021 ◽  
Author(s):  
Kelly Scott Sims ◽  
John Abhishek Bomidi ◽  
William Anthony Moss ◽  
Thomas Andrew Wilson

Abstract With ever-increasing pressure to drill wells efficiently at lower cost, the use of downhole sensors in the Bottom Hole Assembly (BHA) that reveal true downhole dynamics has become scarce. Surface sensors are notoriously inaccurate when their readings are translated into a representation of downhole dynamics. This lack of a one-to-one mapping between surface and downhole dynamics affects all sensors and perpetuates inefficient drilling practices and decision making. Intelligent Mapping of Downhole Dynamics (IMoDD) is an analytical suite that addresses these inefficiencies and maximizes the use of surface sensors, thus doing more with less. IMoDD features a new zeroing procedure that goes beyond the traditional workflow of zeroing the surface weight and torque sensors at the connection. A new method, the Second-order Identifier of Maximum Stand-pipe pressure (SIMS), is introduced. The method examines changes in stand-pipe pressure and, using a set of conditions, identifies the point just before bit-wellbore contact. The resulting weight and torque calculations are verified against measured downhole weight and torque for multiple stands drilled in vertical and curve-lateral sections. After the new zero is applied, deviations in the torque-weight correlation are further examined to reveal downhole weight changes, which are also confirmed by the downhole sensor data. It is demonstrated that an intelligent mapping system that improves downhole characterization would improve decision making, facilitate smoother energy transfer, reduce Non-Productive Time (NPT), and increase BHA life span.
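The abstract only outlines SIMS, so the snippet below is a generic illustration of the underlying idea as described: watch the stand-pipe pressure trend while the bit is lowered and pick the last sample before its second difference (curvature) signals the pressure rise associated with bit-wellbore contact. The threshold and data handling are assumptions, not the published SIMS conditions.

```python
# Generic second-difference pick on stand-pipe pressure (illustrative only;
# the published SIMS conditions are not reproduced here).
import numpy as np

def pick_pre_contact_index(spp_psi: np.ndarray, curvature_threshold: float = 5.0) -> int:
    """Return the index of the last sample before the second difference of
    stand-pipe pressure first exceeds the threshold, i.e. just before the
    pressure response associated with bit-wellbore contact."""
    d2 = np.diff(spp_psi, n=2)                  # discrete second derivative
    exceed = np.flatnonzero(d2 > curvature_threshold)
    if exceed.size == 0:
        raise ValueError("no contact signature found in this stand")
    return int(exceed[0])                       # sample preceding the rise

# Synthetic stand: flat off-bottom pressure, then a ramp after contact.
spp = np.concatenate([np.full(50, 3000.0), 3000.0 + 20.0 * np.arange(30) ** 1.5])
i0 = pick_pre_contact_index(spp)
print("zero weight/torque at sample", i0, "SPP =", spp[i0], "psi")
```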


2020 ◽  
pp. 1-10
Author(s):  
Emmanuel Akita ◽  
Forrest Dyer ◽  
Savanna Drummond ◽  
Monica Elkins ◽  
Payton Duggan ◽  
...  

Summary The use of drilling automation is accelerating, mostly in the area of rate of penetration (ROP) enhancement. Autonomous directional drilling is now a high-focus area for automating drilling operations. The potential impact is immense because 93% of the active rigs in the US are drilling directional or horizontal wells. The 2018-2019 Drilling Systems Automation Technical Section (DSATS)-led international Drillbotics® Student Competition included automated directional drilling. In this paper, we discuss the detailed design of the winning team. We present the surface equipment, downhole tools, data and control systems, and lessons learned. SPE DSATS organizes the annual Drillbotics competition for university teams to design and develop laboratory-scale drilling rigs. The competition requires each team to create unique downhole sensors to allow automated navigation while drilling a directional hole. Student teams have developed new rig configurations to enable several steering methods, including a rotary steering system and small-scale downhole motors with a bent sub. The most significant challenge was creating a functional downhole motor to fit within a 1.25-in. (3.18-cm) diameter wellbore. Besides the technical issues, teams must demonstrate what they have learned about bit-rock interaction and the physics of steering, and must also deal with budgets and funding, procurement and delivery delays, and overall project management. This required an integrated multidisciplinary approach and a major redesign of the rig components. The University of Oklahoma (OU) team made significant changes to its existing rig to drill directional holes. The design change was introduced to optimize the performance of the bottomhole assembly (BHA) and allow directional drilling. The criteria for selecting the BHA were hole size, BHA dynamics, favorable conditions for downhole sensors, precise control of drilling parameters, rig mobility, safety, time constraints, and economic practicality. The result is an autonomous drilling rig that drills a deviated hole toward a defined target through a 2 × 2 × 1-ft (60.96 × 60.96 × 30.48-cm) sandstone block (i.e., rock sample) without human intervention. The rig currently uses a combination of discrete and dynamic modeling from experimentally determined control parameters and closed-loop feedback for well-trajectory control. The novelty of our winning design is the use of a small-scale cable-driven downhole motor with a bent sub and a quick-connect-type swivel system, intended to replicate the action of a mud motor within the limits of the borehole diameter. In this paper, we present details of the rig components, their specifications, and the problems faced during design, development, and testing. We demonstrate how a laboratory-scale rig can be used to study drilling dysfunctions and challenges. Building a downhole tool to withstand vibrations, water intrusion, magnetic interference, and electromagnetic noise involves difficulties commonly faced by major equipment manufacturers.
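The summary reports closed-loop feedback for well-trajectory control without giving the control law, so the following is only a generic sketch of such a loop: compare the surveyed hole direction with the planned direction and issue a proportional steering correction through the bent-sub toolface. The gains, data structures, and survey interface are assumptions, not the OU team's controller.

```python
# Generic closed-loop trajectory-correction sketch (assumed gains and
# interfaces; not the competition rig's actual controller).
from dataclasses import dataclass

@dataclass
class Survey:
    inclination_deg: float   # measured hole inclination
    azimuth_deg: float       # measured hole azimuth

@dataclass
class Plan:
    inclination_deg: float   # planned inclination at this depth
    azimuth_deg: float       # planned azimuth at this depth

def steering_correction(survey: Survey, plan: Plan,
                        k_inc: float = 0.5, k_azi: float = 0.5):
    """Proportional corrections (deg) to apply via the bent-sub toolface."""
    d_inc = plan.inclination_deg - survey.inclination_deg
    d_azi = plan.azimuth_deg - survey.azimuth_deg
    return k_inc * d_inc, k_azi * d_azi

# One pass of the loop: the rig would apply these corrections, drill ahead,
# re-survey, and repeat until the target is reached.
print(steering_correction(Survey(8.0, 92.0), Plan(10.0, 90.0)))
```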


2019 ◽  
Vol 35 (4) ◽  
pp. 2003-2015 ◽  
Author(s):  
Kioumars Afshari ◽  
Jonathan P. Stewart ◽  
Jamison H. Steidl

We present a data set of ground motion recordings and site information from vertical array sites in California. The recordings include two horizontal components of ground shaking at the ground surface and from downhole sensors. The availability of both surface and downhole recordings at the same site facilitates direct observation of site response. The site data include measured shear- and compression-wave velocities and, where available, geotechnical boring logs. We considered 39 vertical array sites in California and chose 21 for inclusion in the database on the basis of having at least four pairs of surface/downhole recordings. The recordings and site data are presented in a data repository accessible at the DesignSafe platform (DOI: 10.17603/146DS2N680). The original digital accelerograms are processed in a manner consistent with NGA-West2 protocols. In this paper, this data set is compared with a similar but larger data set from Japanese vertical arrays compiled by others.
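Since paired surface and downhole records allow direct observation of site response, the sketch below computes the simplest such observation: the smoothed ratio of surface to downhole Fourier amplitude spectra for one horizontal component. The smoothing and sampling parameters are illustrative and are not part of the published processing.

```python
# Empirical surface/downhole spectral ratio for one horizontal component
# (illustrative; smoothing and windowing choices are assumptions).
import numpy as np

def spectral_ratio(surface_acc, downhole_acc, dt, smooth_pts=11):
    """Ratio of surface to downhole Fourier amplitude spectra."""
    n = min(len(surface_acc), len(downhole_acc))
    freqs = np.fft.rfftfreq(n, d=dt)
    amp_surf = np.abs(np.fft.rfft(surface_acc[:n]))
    amp_down = np.abs(np.fft.rfft(downhole_acc[:n]))
    kernel = np.ones(smooth_pts) / smooth_pts          # simple moving average
    amp_surf = np.convolve(amp_surf, kernel, mode="same")
    amp_down = np.convolve(amp_down, kernel, mode="same")
    return freqs, amp_surf / np.maximum(amp_down, 1e-12)

# Synthetic example: the "surface" record is an amplified, delayed copy
# of the downhole record plus noise.
dt = 0.01
t = np.arange(0, 40, dt)
down = np.exp(-0.1 * t) * np.sin(2 * np.pi * 2.0 * t)
surf = 2.5 * np.roll(down, 20) + 0.01 * np.random.default_rng(0).normal(size=t.size)
freqs, ratio = spectral_ratio(surf, down, dt)
print("ratio near 2 Hz:", ratio[np.argmin(np.abs(freqs - 2.0))])
```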


2019 ◽  
Author(s):  
Molly Giltner ◽  
Linsay Earle ◽  
John Willis ◽  
Diego Tellez ◽  
Randall Neel

2016 ◽  
Vol 9 (1) ◽  
pp. 55-71
Author(s):  
Jean-Pierre Deflandre

This paper presents what induced microseismicity is and how it is useful in producing source rock and tight formations. We draw on our 30 years of experience in the field to discuss data acquisition, processing, and interpretation issues. In particular, we establish the difference between hydraulic fracture mapping and long-term monitoring of reservoir mechanical behavior. We comment on the advantages and drawbacks of the different monitoring scenarios, from surface acquisition to the use of downhole sensors, while discussing event location issues. We illustrate the value of working on the raw data in order to benefit from the information contained in the signal signature. We also refer to examples from the literature to discuss induced seismicity associated with shale play production, showing the loading of conjugate fracture networks or the reactivation of faults. Drawing on the North American experience, we introduce the recent debate on anthropogenic seismicity, referring to what is currently observed in Oklahoma (US) and western Canada.
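As a small, generic example of working directly with raw microseismic signals (not a method taken from the paper), the sketch below applies a classical STA/LTA ratio to flag candidate event onsets in a raw trace. Window lengths and the trigger threshold are arbitrary illustrative values.

```python
# Classical STA/LTA event detector on a raw microseismic trace
# (generic illustration; window lengths and threshold are arbitrary).
import numpy as np

def sta_lta_trigger(trace, fs, sta_s=0.02, lta_s=0.5, threshold=4.0):
    """Return sample indices where the STA/LTA ratio first exceeds threshold."""
    sta_n, lta_n = int(sta_s * fs), int(lta_s * fs)
    energy = trace ** 2
    sta = np.convolve(energy, np.ones(sta_n) / sta_n, mode="same")
    lta = np.convolve(energy, np.ones(lta_n) / lta_n, mode="same")
    ratio = sta / np.maximum(lta, 1e-20)
    above = ratio > threshold
    onsets = np.flatnonzero(above & ~np.roll(above, 1))
    return onsets

# Synthetic trace: background noise with one short high-frequency burst.
fs = 2000.0
rng = np.random.default_rng(1)
trace = 0.01 * rng.normal(size=8000)
trace[5000:5100] += np.sin(2 * np.pi * 300 * np.arange(100) / fs)
print("candidate onsets (samples):", sta_lta_trigger(trace, fs))
```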


2016 ◽  
Author(s):  
Theresa Baumgartner ◽  
Yang Zhou ◽  
Eric van Oort
