New technologies for well testing and hydrodynamic well study were applied for the first time

2021 ◽  
Vol 6 (1) ◽  
pp. 15-22
Author(s):  
A. M. Gryzlov ◽  
S. A. Barylnik ◽  
V. V. Litvin ◽  
K. L. Ponitevskiy ◽  
S. V. Esipov

This article is concerned with the application experience of cutting-edge surface and downhole equipment used in well test operations and hydrodynamic well studies. The results of the operations and the equipment performance are presented, along with recommendations for applying the technology in offshore and onshore projects.

2014 ◽  
Vol 17 (04) ◽  
pp. 449-456 ◽  
Author(s):  
Pierre-David Maizeret ◽  
David Reid ◽  
Bertrand Theuveny

Summary Separators have a proven track record and are widely used in well-testing operations. However, their range of applications is relatively narrow, and they can encounter limitations with fluids for which separation quality is an issue. This paper describes a well test in a deepwater well offshore Brazil. To be able to accommodate different production scenarios, two separators and a multiphase meter were used to measure the flow rates. During the test, a comparison of the flow rates from the multiphase flowmeter and those reported by the separator made possible the identification of some carry-over in the separator at a high choke setting. When the choke size was increased, the separator gas rate increased, whereas the liquid rate dropped at the separator, resulting in a very low condensate/gas ratio (CGR). The multiphase meter, on the other hand, reported a constant CGR, allowing for the real-time diagnostic that carry-over was occurring at the separator. At the end of the test, the same increased choke setting was used for a short period of time to confirm the behavior. The same rate results were observed for each phase, and a small amount of liquid could be seen in the gas flare. The final proof came from the analysis of the downhole samples, which confirmed the CGR measured with the multiphase meter. The consistent results from the multiphase meter make the meter ideal to validate the flow-rate measurements from the reservoir and to improve the burning efficiency. By allowing early identification of imperfect separation, the multiphase meter reduces the health, safety, and environmental (HSE) risks associated with well-test operations.
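As a rough illustration of the diagnostic described in this abstract, the separator-derived condensate/gas ratio (CGR) can be compared against the multiphase-meter CGR: a separator CGR falling well below the MPFM reference is consistent with liquid carry-over into the separator gas leg. The rates, units, and threshold below are illustrative assumptions, not values from the paper:

```python
def cgr(liquid_rate_stb_d, gas_rate_mmscf_d):
    """Condensate/gas ratio in STB/MMscf."""
    return liquid_rate_stb_d / gas_rate_mmscf_d

def carry_over_suspected(sep_liquid, sep_gas, mpfm_liquid, mpfm_gas,
                         rel_tol=0.15):
    """Return True when the separator CGR falls well below the MPFM CGR,
    consistent with carry-over at the separator (threshold is illustrative)."""
    return cgr(sep_liquid, sep_gas) < (1.0 - rel_tol) * cgr(mpfm_liquid, mpfm_gas)

# Example: at a larger choke the separator liquid rate drops while the
# MPFM still reports a constant CGR.
print(carry_over_suspected(sep_liquid=400, sep_gas=10,
                           mpfm_liquid=550, mpfm_gas=10))  # True
```

In the well described in the paper, the MPFM's constant CGR was the real-time evidence that the drop in separator liquid rate was a measurement artifact rather than a reservoir response.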


2019 ◽  
Vol 36 (2) ◽  
pp. 21-22
Author(s):  
Ray Harper

Purpose The purpose of this paper is to summarise a number of presentations at Day 1 of the Internet Librarian International conference, London, UK (16 October 2018). This was the 20th conference in the series, and the three key themes included were the next-gen library and librarian; understanding users, usage and user experience; and inclusion and inspiration: libraries making a difference. Design/methodology/approach This paper reports from the viewpoint of a first-time attendee of the conference. This summarises the main issues raised by each presentation and draws out the key learning points for practical situations. Findings The conference covered a variety of practical ways in which libraries can use technology to support users and make decisions about services. These include developing interactive physical spaces which include augmented reality; introducing “chat-bots” to support users; using new techniques to analyse data; and piloting new ways to engage users (such as coding clubs). A key theme was how we use and harness data in a way that is ethical, effective and relevant to library services. Originality/value This conference focussed on practical examples of how library and information services across sectors and countries are innovating in a period of huge change. The conference gave delegates numerous useful ideas and examples of best practice and demonstrated the strength of the profession in adapting to new technologies and developments.


Pharmaceutics ◽  
2021 ◽  
Vol 13 (7) ◽  
pp. 1051
Author(s):  
Jonattan Gallegos-Catalán ◽  
Zachary Warnken ◽  
Tania F. Bahamondez-Canas ◽  
Daniel Moraga-Espinoza

Orally inhaled drug products (OIDPs) are an important group of medicines traditionally used to treat pulmonary diseases. Over the past decade, this trend has broadened, increasing their use in other conditions such as diabetes and expanding the interest in this administration route. Thus, the bioequivalence (BE) of OIDPs is more important than ever, aiming to increase access to affordable, safe and effective medicines, which translates into better public health policies. However, the regulatory agencies leading the bioequivalence process are still deciding on the best approach for ensuring that a proposed inhalable product is bioequivalent. This lack of agreement translates into less cost-effective strategies for determining bioequivalence, discouraging innovation in this field. The Next-Generation Impactor (NGI) is an example of the slow pace at which the inhalation field evolves. The NGI was officially implemented in 2003 and remains the last equipment innovation for OIDP characterization. Even though it was a breakthrough in the field, it did not solve other deficiencies of the BE process, such as dissolution-rate analysis under physiologically relevant conditions. This review aims to reveal the steps required for innovation in the regulations defining the bioequivalence of OIDPs, elucidating the pitfalls of implementing new technologies in the current standards. To do so, we collected the opinions of experts from the literature to explain these trends, showing, for the first time, the stakeholders of the OIDP market. This review analyzes the stakeholders involved in the development, improvement and implementation of methodologies that can help assess bioequivalence between OIDPs. Additionally, it presents a list of methods potentially useful for overcoming some of the current limitations of the standard bioequivalence methodologies. Finally, we review one of the most revolutionary approaches, the inhaled Biopharmaceutical Classification System (IBCs), which can help establish priorities and order in both the innovation process and the regulations for OIDPs.


2021 ◽  
Vol 91 (13-14) ◽  
pp. 1609-1626
Author(s):  
Yuran Jin ◽  
Xiangye Song ◽  
Jinhuan Tang ◽  
Xiaodong Dong ◽  
Huisheng Ji

The research on the business model of garment enterprises (BMGE) has expanded rapidly in the last decade. However, there is still a lack of comprehensive reviews of it, let alone visual research. Based on scientometrics, in this paper 118 papers and their 4803 references from Science Citation Index Expanded, Social Sciences Citation Index, Conference Proceedings Citation Index—Science, and Conference Proceedings Citation Index—Social Science & Humanities for the period 2010–2020 about the BMGE were analyzed by visualizing the co-cited references, co-occurrence keywords, burst references, dual-map overlays, and more with CiteSpace, Google Maps, and VOSviewer. The research revealed the intellectual landscapes of the BMGE for the first time and mapped the landmark papers, hotspots and trends, national or regional distributions and their cooperation networks, highly cited authors, and prestigious journals and disciplines related to the BMGE. The results show that the biggest hotspot is the fast fashion business model; social responsibility, smart fashion, Internet of Things, and sharing fashion are the main emerging hotspots; and the research focus has evolved from traditional business models to business models driven by new technologies, and then to new issues such as circular economy models. The institutions are mainly distributed in China, the United States, and Western Europe, and there is cooperation among more than 11 countries. The most popular disciplines are economics and politics, while psychology, education, and social science are the essential basic disciplines. The Journal of Cleaner Production and Journal of Fashion Marketing and Management, among others, actively promoted the research.


2010 ◽  
Vol 97-101 ◽  
pp. 64-68
Author(s):  
Jian Chen ◽  
Jin Wang ◽  
Guo Dong Lu ◽  
Zheng Qi Ling

High precision and large scale are the development trends for injection molding machine clamping systems. This paper compares the characteristics of three-platen toggle and dual-platen hydraulic clamping systems. The key factors through which the clamping system affects the precision of plastic parts are discussed systematically for the first time. Based on these analyses, a new clamping system has been proposed and manufactured to improve part precision, incorporating three new technologies: a new type of dual-platen structure, parallelism adaptive correction technology, and numerically controlled hydraulic servo system technology. It has been applied successfully in a practical machine, and experimental results prove that it satisfies the high-precision molding of large plastic parts.


2021 ◽  
Author(s):  
Gabriela Chaves ◽  
Danielle Monteiro ◽  
Virgilio José Martins Ferreira

Abstract Commingled production nodes are standard practice in the industry to combine multiple segments into one. This practice is adopted at the subsurface or surface to reduce costs, elements (e.g. pipes), and space. However, it leads to one problem: determining the rates of the individual elements. In the platform scenario, this problem is recurrently solved using the back allocation approach, in which the total platform flowrate is used to obtain the individual wells' flowrates. The wells' flowrates are crucial to monitor, manage, and make operational decisions in order to optimize field production. This work combined outflow (well and flowline) simulation, reservoir inflow, algorithms, and an optimization problem to calculate the wells' flowrates and report a status for the current well state. A well stated as unsuited indicates that the input data, the well model, or the well itself is not behaving as expected. The well status is valuable operational information that can be interpreted, for instance, as an indication that a new well test is needed, or as a reliability rate for the simulation runs. The well flowrates are calculated for three scenarios: probable, minimum, and maximum. Real-time data are used as input, and production well tests are used to tune and update the well model and its parameters routinely. The methodology was applied to a representative offshore oil field with 14 producing wells over a two-year production period. The back allocation methodology showed robustness in all cases, labeling the wells properly, calculating the flowrates, and honoring the platform flowrate.
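A minimal sketch of the back-allocation idea: split the measured platform rate across wells in proportion to their last well-test rates. This is only the simplest possible allocation rule, under assumptions of my own; the paper's method additionally uses outflow/inflow simulation, well models, and an optimization problem, and the well names and rates below are hypothetical:

```python
def back_allocate(platform_rate, test_rates):
    """Proportionally allocate the measured platform rate to wells based on
    their last well-test rates. The result honors the platform total by
    construction (a key requirement noted in the abstract)."""
    total = sum(test_rates.values())
    return {well: platform_rate * q / total for well, q in test_rates.items()}

# Hypothetical last-test rates for three wells; the platform meter reads
# less than their sum, so each well is scaled down proportionally.
tests = {"W1": 1200.0, "W2": 800.0, "W3": 2000.0}
alloc = back_allocate(3800.0, tests)
print(alloc)  # allocations sum to the platform rate of 3800
```

A proportional split like this is the usual fallback when no model-based estimate is available; the status flags described in the abstract are what tell the operator when the proportions themselves are no longer trustworthy.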


2021 ◽  
Author(s):  
Khaled M. Mazen Al Khoujah ◽  
Antonio - Medina ◽  
Juma Rashid Al Qaydi ◽  
Jawwad Kaleem ◽  
Fatima Hassan Al Mansoori ◽  
...  

Abstract An innovative design was implemented as a solution for the repetitive weld-cracking failures of a plate heat exchanger installed at a gas processing facility. The new design was introduced for the first time in the facility, demonstrating the value of utilizing new technologies and enhanced designs in heat exchangers used for gas processing. The main challenges were accommodating the various operating modes and ensuring that the failures did not recur. Success was achieved through collaboration between the operating company and industry experts in heat transfer equipment to replace the existing design at the gas processing facility with no change to the piping layouts, hence performing the replacement at optimal cost and maximum benefit.


2021 ◽  
Author(s):  
Nagaraju Reddicharla ◽  
Subba Ramarao Rachapudi ◽  
Indra Utama ◽  
Furqan Ahmed Khan ◽  
Prabhker Reddy Vanam ◽  
...  

Abstract Well testing is one of the vital processes in reservoir performance monitoring. As a field matures and the well stock grows, testing becomes a tedious job in terms of resources (MPFMs and test separators), and this affects production quota delivery. In addition, test data validation and approval follow a business process that needs up to 10 days to accept or reject a well test. Almost 10,000 well tests were conducted, and around 10 to 15% of them were rejected per year. The objective of this paper is to develop a methodology that reduces well test rejections and raises a timely flag for operator intervention to recommence a well test. The case study was applied in a mature field that has been producing for 40 years and has a good volume of historical well test data available. The paper discusses the development of a data-driven well test data analyzer and optimizer, supported by artificial intelligence (AI), for wells tested with an MPFM, using a two-stage approach. The motivating idea is to ingest historical data, real-time data, and the well model performance curve, and to score the quality of the well test data so that the operator is flagged in real time. The ML prediction results help testing operations and can reduce the test acceptance turnaround drastically, from 10 days to hours. In the second stage, an unsupervised model built on historical data helps identify the parameters that drive well test rejection, for example test duration, choke size, and GOR. The outcome of the modeling will be incorporated into updates of the well test procedure and testing philosophy. The approach is under evaluation in one of the assets of ADNOC Onshore. The results are expected to reduce well test rejections by at least 5%, which further optimizes the required resources and improves the back allocation process.
Furthermore, real-time flagging of test quality will help reduce the validation cycle from 10 days to hours and improve the well testing cycle. This methodology improves integrated reservoir management compliance with well testing requirements in assets where resources are limited, and it is envisioned to be integrated with a full-field digital oilfield implementation. This is a novel application of machine learning and artificial intelligence to well testing: it maximizes the use of real-time data to create an advisory system that improves test data quality monitoring and enables timely decision-making to reduce well test rejections.
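The real-time quality flag can be caricatured with a simple rule-based check on test duration and rate stability. The actual work uses ML models on far richer inputs; the thresholds and the coefficient-of-variation criterion below are purely illustrative assumptions:

```python
import statistics

def flag_well_test(rates, duration_hr, min_duration_hr=6.0, max_cv=0.05):
    """Rule-based caricature of a well-test quality flag. A test is sent
    for 'review' when it is too short or the flowrate is unstable
    (coefficient of variation above max_cv); thresholds are illustrative."""
    cv = statistics.pstdev(rates) / statistics.mean(rates)
    if duration_hr < min_duration_hr or cv > max_cv:
        return "review"
    return "accept"

print(flag_well_test([1000, 1010, 995, 1005], duration_hr=8))  # accept
print(flag_well_test([1000, 1400, 700, 1200], duration_hr=8))  # review
```

The point of such a flag, rule-based or learned, is the one the abstract makes: catching a bad test while the separator or MPFM is still rigged up, rather than 10 days later in the approval workflow.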


2021 ◽  
Vol 134 (3) ◽  
pp. 35-38
Author(s):  
A. M. Svalov

Horner’s traditional method of processing well test data can be improved by a special transformation of the pressure curves, which reduces the time for the converted curves to reach the asymptotic regimes necessary for processing these data. In this case, to take into account the action of the skin factor and the effect of the wellbore, it is necessary to use a more complete asymptotic expansion of the exact solution of the conductivity equation at large values of time. At the same time, this method does not completely eliminate the influence of the wellbore, since the asymptotic expansion of the solution used for small values of time is limited by the existence of a singular point, in the vicinity of which the expansion ceases to be valid. To solve this problem, a new method of processing well test data is proposed that completely eliminates the influence of the wellbore. The method is based on the introduction of a modified well-inflow function that includes a component of the boundary condition corresponding to the influence of the wellbore.
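For context, the classical Horner processing that this article improves on fits the shut-in pressure against the logarithm of Horner time (tp+Δt)/Δt and extrapolates to Horner time 1. A minimal sketch with synthetic (noise-free) buildup data; the producing time, shut-in times, and pressures are invented for illustration:

```python
import math

def horner_analysis(tp_hr, dt_hr, p_ws_psi):
    """Least-squares fit of shut-in pressure vs log10 of Horner time
    (tp + dt)/dt. Returns the slope m (psi per log cycle) and the
    extrapolated pressure p* at Horner time 1 (log10 = 0)."""
    x = [math.log10((tp_hr + dt) / dt) for dt in dt_hr]
    n = len(x)
    xbar, ybar = sum(x) / n, sum(p_ws_psi) / n
    m = (sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, p_ws_psi))
         / sum((xi - xbar) ** 2 for xi in x))
    p_star = ybar - m * xbar  # intercept at log10(Horner time) = 0
    return m, p_star

# Synthetic buildup that lies exactly on a Horner straight line
# p = 3000 - 100 * log10((tp + dt)/dt), with tp = 100 hr.
dts = [1.0, 10.0, 100.0]
pws = [3000 - 100 * math.log10((100 + dt) / dt) for dt in dts]
m, p_star = horner_analysis(100.0, dts, pws)
print(round(m, 3), round(p_star, 1))  # recovers slope -100 and p* = 3000
```

The article's point is that real early-time data deviate from this straight line because of wellbore storage and skin; the proposed modified inflow function removes the wellbore contribution instead of waiting for the curve to reach the asymptotic regime.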


2021 ◽  
Vol 11 (21) ◽  
pp. 10080
Author(s):  
Haifeng Zhang ◽  
Mingliang Long ◽  
Huarong Deng ◽  
Shaoyu Cheng ◽  
Zhibo Wu ◽  
...  

Debris laser ranging (DLR) is receiving considerable attention as an accurate and effective method of determining and predicting the orbits of space debris. This paper reports on several DLR technologies, such as high pulse repetition frequency (PRF) lasers, large-aperture telescopes, telescope arrays, and signal reception at multi-static stations. DLR with a picosecond laser at the Shanghai Astronomical Observatory is also presented, where a few hundred passes of space debris laser-ranging measurements have been made. A double-pulse picosecond laser with an average power of 4.2 W, a PRF of 1 kHz, and a wavelength of 532 nm has been implemented successfully in DLR; this is the first time that DLR technology has reached a ranging precision at the sub-decimeter level. In addition, the transmission characteristics of the picosecond-pulse-width laser and its advantages for laser ranging were analyzed. Operating the pulse-burst picosecond laser at high average power, the DLR system has tracked small debris with a radar cross-section (RCS) of 0.91 m2 at a ranging distance of up to 1726.8 km, corresponding to an RCS of 0.1 m2 at a distance of 1000 km. These works are expected to provide new technologies to further improve the performance of DLR.
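The stated equivalence between 0.91 m2 at 1726.8 km and 0.1 m2 at 1000 km is consistent with the 1/R^4 range dependence of the laser-ranging link budget (received signal scales as RCS/R^4). The check below uses only that scaling and ignores atmospheric and system terms, so it is a sanity check, not the paper's full link budget:

```python
def equivalent_rcs(rcs_m2, range_km, ref_range_km=1000.0):
    """Scale an RCS detected at range_km to an equivalent RCS at a
    reference range, using the 1/R^4 dependence of the laser-ranging
    link budget (simplified: no atmospheric or system terms)."""
    return rcs_m2 * (ref_range_km / range_km) ** 4

# 0.91 m^2 at 1726.8 km scales to roughly 0.1 m^2 at 1000 km,
# matching the equivalence quoted in the abstract.
print(round(equivalent_rcs(0.91, 1726.8), 2))
```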

