Drilling Performance and Data Quality Control with Live Digital Twin

2021 ◽  
Author(s):  
Roman Borisovich Karpov ◽  
Denis Yurjevich Zubkov ◽  
Aleksandr Vitalyevich Murlaev ◽  
Khaydar Bulatovich Valiullin

Abstract The paper presents a solution to the problem of reliably determining actual downhole loads and optimizing drilling parameters using a dynamic digital well model. The problem of surface and downhole sensor data quality is examined, and a solution for aggregated data QAQC, together with the results achieved, is presented. The implementation of the digital platform and the functionality of the dynamic digital twin improved compliance with the desired regimes, ensured the safety of technological operations, and accelerated decision-making during drilling, well completion, and commissioning into production.

The digital ecosystem allows operational parameters to be monitored and controlled in a timely manner, and ROP to be improved and accurately controlled while minimizing the risks of drilling hazards and premature drill bit wear. The incorporated dynamic digital twin assures data quality in real time, analyzes the efficiency of activities, and defines the optimal drilling parameters. The selection of optimal drilling parameters and the increase in ROP are carried out in real time, based on the analysis of specific mechanical energy. Quality control of sensors plays a key role in evaluating the effective weight on bit and associated loads, and in identifying the current friction factor values exhibited downhole. Trend analysis of the friction factors and the corresponding changes in key drilling parameters then makes it possible to track and prevent critical overloads of the drill string, to determine the risks of downhole hazards, and to evaluate the efficiency of well circulation and conditioning activities in a given interval, thereby reducing invisible NPT and the risks of downhole complications.

The introduction of a digital ecosystem and a dynamic digital twin brought the well construction management process to the next level. Operational response and the decision-making process were drastically accelerated and improved. Uncertainties associated with an expert's interpretation of drilling states, and subjectivity in opinions on the effectiveness of processes, were eliminated. The negative effect of the human factor and the resulting invisible nonproductive time was minimized. In a short period, the drilling contractor was able to integrate a single digital platform, improve key performance indicators, and involve the field personnel in the full cycle of the technological process of well construction. Field and office personnel, including the driller, work in a single digital platform and, regardless of the current operation, always know the true downhole loads and see the allowable operating envelope and optimal values of the hook load, surface torque, SPP, flow rate, RPM, weight and torque on the bit, ROP, and tripping speeds.

The presented method of assessing the quality of sensor readings and determining the true WOB makes it possible to optimize the technological parameters during actual drilling. The specific mechanical energy is calculated from the effective downhole loads transferred to the drill bit. An abnormal increase in the specific mechanical energy prompts the driller to correct the parameters and restore an efficient drilling process. The friction factors are determined automatically during off-bottom rotation and tripping operations. Safe corridors and the operational roadmap are re-evaluated every second and are dynamically updated according to the current state of the wellbore and depth.
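The abstract does not give the exact formulation used; a common definition of specific mechanical energy is Teale's, which combines the axial (WOB) and rotary contributions at the bit. The sketch below is a minimal illustration under that assumption, with hypothetical parameter names and a simple alert rule of the kind described; it is not the authors' implementation.

```python
import math

def specific_mechanical_energy(wob_lbf, rpm, torque_ftlbf, rop_ft_per_hr, bit_diameter_in):
    """Teale's mechanical specific energy (psi), a standard formulation.

    wob_lbf         -- effective weight on bit (lbf), ideally the downhole value
    rpm             -- bit rotary speed (rev/min)
    torque_ftlbf    -- torque at the bit (ft-lbf)
    rop_ft_per_hr   -- rate of penetration (ft/hr)
    bit_diameter_in -- bit diameter (in)
    """
    bit_area = math.pi * bit_diameter_in ** 2 / 4.0   # bit cross-section, in^2
    axial = wob_lbf / bit_area                        # axial contribution
    rotary = 120.0 * math.pi * rpm * torque_ftlbf / (bit_area * rop_ft_per_hr)
    return axial + rotary

def mse_alert(current_mse, baseline_mse, threshold=1.5):
    """Flag an abnormal MSE rise relative to a recent baseline so the
    driller can correct parameters (threshold is an illustrative value)."""
    return current_mse > threshold * baseline_mse
```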

2014 ◽  
Vol 668-669 ◽  
pp. 1374-1377 ◽  
Author(s):  
Wei Jun Wen

ETL refers to the process of extracting, transforming and loading data and is a critical step in ensuring the quality, specification and standardization of marine environmental data. Marine data, owing to their complexity, field diversity and huge volume, remain decentralized, multi-sourced and heterogeneous, with differing semantics, and are therefore far from able to provide effective data sources for decision making. ETL enables the construction of a marine environmental data warehouse through the cleaning, transformation, integration, loading and periodic updating of basic marine data. The paper presents research on rules for the cleaning, transformation and integration of marine data, on the basis of which an ETL system for a marine environmental data warehouse is designed and developed. The system further guarantees data quality and correctness for future analysis and decision-making based on marine environmental data.
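As a rough illustration of the extract-clean-load cycle described above (not the authors' system), here is a minimal sketch; the field names, units rule and value ranges are hypothetical assumptions for a single marine observation record:

```python
import csv

def clean(record):
    """Hypothetical cleaning rules for one marine observation record."""
    # Drop records with missing mandatory fields.
    if not record.get("station_id") or not record.get("timestamp"):
        return None
    # Standardize units: assume temperature may arrive in Fahrenheit.
    t = float(record["sea_temp"])
    if record.get("temp_unit") == "F":
        t = (t - 32.0) * 5.0 / 9.0
    # Range check against a plausible physical interval for sea temperature.
    if not -2.0 <= t <= 40.0:
        return None
    return {"station_id": record["station_id"],
            "timestamp": record["timestamp"],
            "sea_temp_c": round(t, 2)}

def etl(source_csv, target_rows):
    """Extract from a CSV source, transform via clean(), load into a list
    standing in for the warehouse's staging table."""
    with open(source_csv, newline="") as f:
        for record in csv.DictReader(f):
            row = clean(record)
            if row is not None:
                target_rows.append(row)
```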


2021 ◽  
Author(s):  
Temirlan Zhekenov ◽  
Artem Nechaev ◽  
Kamilla Chettykbayeva ◽  
Alexey Zinovyev ◽  
German Sardarov ◽  
...  

SUMMARY Researchers base their analyses on basic drilling parameters obtained during mud logging and demonstrate impressive results. However, because of the data-quality limitations often present during drilling, those solutions tend to lose their stability and predictive power. In this work, the concept of hybrid modeling is introduced, which makes it possible to integrate analytical correlations with machine learning algorithms to obtain stable solutions that remain consistent from one data set to another.
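The abstract does not specify the form of hybridization; one common pattern is residual modeling, where a machine learning model learns only the mismatch between an analytical correlation and the measurements. A minimal sketch under that assumption, with synthetic data and a stand-in correlation, neither of which comes from the paper:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

def analytical_rop(wob, rpm):
    """Hypothetical analytical correlation, e.g. a simplified ROP model
    proportional to WOB and the square root of RPM."""
    return 0.05 * wob * np.sqrt(rpm)

# Synthetic "measured" data standing in for mud-logging records.
rng = np.random.default_rng(0)
wob = rng.uniform(5, 25, 500)            # weight on bit, tonnes
rpm = rng.uniform(60, 180, 500)
measured_rop = analytical_rop(wob, rpm) * rng.normal(1.0, 0.1, 500) + 2.0

# Hybrid step: fit the ML model on the residual between physics and data.
X = np.column_stack([wob, rpm])
residual = measured_rop - analytical_rop(wob, rpm)
ml = RandomForestRegressor(n_estimators=100, random_state=0).fit(X, residual)

def hybrid_rop(wob, rpm):
    """Analytical baseline plus the learned correction."""
    feats = np.column_stack([np.atleast_1d(wob), np.atleast_1d(rpm)])
    return analytical_rop(np.atleast_1d(wob), np.atleast_1d(rpm)) + ml.predict(feats)
```

Because the physics term carries the overall trend, the learned correction stays small, which is one way such hybrids remain stable from one data set to another.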


Author(s):  
Suranga C. H. Geekiyanage ◽  
Dan Sui ◽  
Bernt S. Aadnoy

Drilling industry operations depend heavily on digital information. Data analysis is the process of acquiring, transforming, interpreting, modelling, displaying and storing data with the aim of extracting useful information, so that a system's decision-making, action execution, event detection and incident management can be handled efficiently and reliably. This paper provides an approach to understanding, cleansing, improving and interpreting post-well or real-time data in order to preserve or enhance data features such as accuracy, consistency, reliability and validity. Data quality management is a process with three major phases. Phase I is a pre-evaluation of data quality to identify issues such as missing or incomplete data, non-standard or invalid data, and redundant data. Phase II is the implementation of data quality management practices, such as filtering, data assimilation and data reconciliation, to improve data accuracy and discover useful information. The third and final phase is a post-evaluation of data quality, conducted to assure data quality and enhance system performance. In this study, a laboratory-scale drilling rig with a control system capable of drilling is used for data acquisition and quality improvement. Safe and efficient performance of such a control system relies heavily on the quality and sufficient availability of the data obtained while drilling. Pump pressure, top-drive rotational speed, weight on bit, drill string torque and bit depth are the available measurements. The data analysis is challenged by issues such as data corruption due to noise, time delays, missing or incomplete data, and external disturbances. To solve such issues, different data quality improvement practices are applied and tested. These techniques help the intelligent system achieve better decision-making and quicker fault detection. The laboratory-scale study clearly demonstrates the need for a proper data quality management process and a clear understanding of signal processing methods in order to carry out intelligent digitalization in the oil and gas industry.
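As a rough sketch of the Phase II practices named above (filtering plus a simple reconciliation of missing samples), applied to a noisy weight-on-bit channel; the signal values and window size are hypothetical, not the rig's actual interface:

```python
import numpy as np

def median_filter(signal, window=5):
    """Simple sliding-median filter to suppress spikes (noise) in a channel."""
    pad = window // 2
    padded = np.pad(signal, pad, mode="edge")
    return np.array([np.median(padded[i:i + window]) for i in range(len(signal))])

def fill_missing(signal):
    """Reconcile missing samples (NaN) by linear interpolation."""
    idx = np.arange(len(signal))
    good = ~np.isnan(signal)
    return np.interp(idx, idx[good], signal[good])

# Example: a WOB trace with one spike and one dropout.
wob = np.array([10.1, 10.3, 55.0, 10.2, np.nan, 10.4, 10.2])
cleaned = median_filter(fill_missing(wob))   # dropout filled, spike removed
```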


2021 ◽  
Author(s):  
Yuhan Liu ◽  
Ke Chen ◽  
Ling Ma ◽  
Shu Tang ◽  
Tan Tan

Author(s):  
Tom August ◽  
J Terry ◽  
David Roy

The rapid rise of Artificial Intelligence (AI) methods has presented new opportunities for those who work with biodiversity data. Computer vision, in particular where computers can be trained to identify species in digital photographs, has significant potential to address a number of existing challenges in citizen science. The Biological Records Centre (www.brc.ac.uk) has been a central focus for terrestrial and freshwater citizen science in the United Kingdom for over 50 years. We will present our research on how computer vision can be embedded in citizen science, addressing three important questions. How can contextual information, such as time of year, be included in computer vision? A naturalist will use a wealth of ecological knowledge about species in combination with information about where and when the image was taken to augment their decision making; we should emulate this in our AI. How can citizen scientists be best supported by computer vision? Our ambition is not to replace identification skills with AI but to use AI to support the learning process. How can computer vision support our limited resource of expert verifiers as data volumes increase? We receive more and more data each year, which puts a greater demand on our expert verifiers, who review all records to ensure data quality. We have been exploring how computer vision can lighten this workload. We will present work that addresses these questions, including: developing machine learning techniques that incorporate ecological information as well as images to arrive at a species classification; co-designing an identification tool to help farmers identify flowers beneficial to wildlife; and assessing the optimal combination of computer vision and expert verification to improve our verification systems.
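The abstract does not describe the model architecture; one straightforward way to fold contextual information such as time of year into a vision classifier is late fusion, concatenating a cyclic day-of-year encoding with precomputed image features. A minimal sketch under that assumption, with random stand-ins for the image embeddings and species labels:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def encode_day_of_year(day):
    """Cyclic encoding so 31 December sits next to 1 January."""
    angle = 2 * np.pi * day / 365.25
    return np.array([np.sin(angle), np.cos(angle)])

# Hypothetical inputs: image embeddings from any pretrained CNN
# (random stand-ins here) plus the capture date of each photo.
rng = np.random.default_rng(0)
image_features = rng.normal(size=(200, 128))
days = rng.integers(1, 366, size=200)
labels = rng.integers(0, 5, size=200)        # 5 stand-in species

# Late fusion: concatenate context with image features, then classify.
context = np.array([encode_day_of_year(d) for d in days])
X = np.hstack([image_features, context])
clf = LogisticRegression(max_iter=1000).fit(X, labels)
```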


Author(s):  
Vasileios Orfanakis ◽  
Stamatios Papadakis ◽  
Michail Kalogiannakis ◽  
Maria Ampartzaki ◽  
Kostas Vassilakis

Today, during the ‘fourth industrial revolution’ led by the Internet and the digital ecosystem it creates, schools are expected to develop not only the functional skills of literacy and numeracy but also general knowledge. The apparent inadequacy of the standardized education system in responding to the needs and interests of 21st-century students urges researchers to adopt new forms of teaching, as meaningful and high-quality teaching requires a more active use of innovative educational methods and tools. With the rapid development of IT globally, there is a tendency to utilize the capabilities of e-learning as a mode of distance learning, since it can function both independently of and in conjunction with conventional teaching. The varied applications of Web 2.0 tools create new possibilities in the educational sector. They provide the ability to develop innovative educational methods that transform students from passive recipients of information into knowledge creators through active involvement in the learning process, often within a modern interactive environment. This study presents the results of a teaching intervention using a flexible, student-centered web system developed and used as a complement to the ‘Research Project’ course during the first term of the 2015-2016 school year. The ultimate goal of this effort was to highlight, and consequently incorporate, the use of a digital platform for student conferences, implemented in schools as a means of research, learning and skill development. The students had the opportunity to participate in a digital community that employed distance learning tools for communication, cooperation and learning during a digital conference in which they had leading roles as writers and reviewers. The initial results of the pilot study indicated that the use of the digital platform increased student interest, supported the development of various skills and contributed to the overall improvement of the teaching and learning process.


IoT ◽  
2021 ◽  
Vol 2 (4) ◽  
pp. 717-740
Author(s):  
Ljiljana Stojanovic ◽  
Thomas Usländer ◽  
Friedrich Volz ◽  
Christian Weißenbacher ◽  
Jens Müller ◽  
...  

The concept of digital twins (DT) was already being discussed decades ago. Digital representations of physical assets are key components in industrial applications, as they form the basis for decision making. What is new is the conceptual approach of considering DTs as well-defined software entities in their own right that follow the whole lifecycle of their physical counterparts, from engineering and operation up to discharge, and hence have their own type description, identity, and lifecycle. This paper elaborates on this idea and argues the need for systematic DT engineering and management. After a conceptual description of DTs, the paper proposes a DT lifecycle model and presents methodologies and tools for DT management, also in the context of Industrie 4.0 concepts such as the asset administration shell (AAS), the international data spaces (IDS), and IEC standards (such as OPC UA and AML). As an example of tool support for DT engineering and management, the Fraunhofer-advanced AAS tools for digital twins (FA3ST) are presented in more detail.
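To make the idea of a DT as a software entity with its own identity, type description and lifecycle concrete, here is a minimal, generic sketch; the class and phase names are illustrative assumptions, not the AAS metamodel or the FA3ST API:

```python
from dataclasses import dataclass, field
from enum import Enum, auto
import uuid

class LifecyclePhase(Enum):
    """Illustrative phases mirroring the physical asset's own lifecycle."""
    ENGINEERING = auto()
    OPERATION = auto()
    DISCHARGE = auto()

@dataclass
class DigitalTwin:
    asset_type: str                      # type description of the physical counterpart
    twin_id: str = field(default_factory=lambda: str(uuid.uuid4()))  # own identity
    phase: LifecyclePhase = LifecyclePhase.ENGINEERING
    properties: dict = field(default_factory=dict)   # current asset state

    def advance(self, new_phase: LifecyclePhase):
        """Move the twin through its lifecycle alongside the asset."""
        self.phase = new_phase

# Example: a twin created during engineering, later put into operation.
pump_twin = DigitalTwin(asset_type="CentrifugalPump")
pump_twin.properties["rated_flow_m3h"] = 120
pump_twin.advance(LifecyclePhase.OPERATION)
```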

