SUPDUT: An Environment to Implement Logic Control Applications for Pipelines

Author(s):  
Luiz F. de J. Bernardo ◽  
Eliane A. Cid ◽  
Paulo de T. A. Correia ◽  
Ruy L. Milidiú ◽  
Frederico dos S. Liporace

The reliable operation of product transfers in pipelines is essential to the economic results of a pipeline company. This operation depends heavily on calculations performed over real-time raw and historical data to assure the expected level of confidence in the operational results. This paper describes the development of a software environment, SUPDUT (an abbreviation of the Portuguese term Supervisor de Oleodutos, or Pipeline Supervisor), to be used in the development, organization, execution and maintenance of operational applications and to support their communication with other corporate and basic real-time systems (SCADA). Application in this context means any operational or corporate calculation that requires information from SCADA. The main advantage of the SUPDUT architecture is that it simplifies the application development and maintenance process by providing a server that deals with all the complexity related to SCADA communication and application scheduling. The application developer therefore does not need to be concerned with those issues. It also makes application development independent of the SCADA that collects the real-time data. The environment is designed to facilitate simple and rapid implementation of new applications with minimal impact on the system. Other important SUPDUT environment features are: a complete object-oriented design, planned support for distributed applications and reliable application scheduling, support for a wide range of application scheduling options, support for multiple SCADA systems, support for multiple application development languages (FORTRAN, C, C++ and Java) and robustness to the addition of new applications. The SUPDUT environment requirements definition and design are complete, and the system is in its coding phase as this paper is being written. The first production version of the software is expected to be delivered by the end of 2002.
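The server-centred pattern the abstract describes, where applications declare their SCADA inputs and scheduling needs once and the server handles data retrieval and dispatch, can be sketched as follows. This is a minimal illustration, not SUPDUT's actual API; all class and function names are hypothetical.

```python
from dataclasses import dataclass
from typing import Callable, Dict, List

@dataclass
class Application:
    name: str
    tags: List[str]                  # SCADA points this application reads
    period_s: int                    # requested scheduling period, seconds
    run: Callable[[Dict[str, float]], float]

class SupervisorServer:
    """Central server: owns SCADA access and application scheduling, so
    application code never talks to the SCADA directly."""
    def __init__(self, scada_reader: Callable[[List[str]], Dict[str, float]]):
        self._read = scada_reader    # abstracts the concrete SCADA away
        self._apps: List[Application] = []

    def register(self, app: Application) -> None:
        self._apps.append(app)

    def tick(self) -> Dict[str, float]:
        # A real server would honour each app's period and handle failures;
        # here every registered application simply runs once.
        results = {}
        for app in self._apps:
            data = self._read(app.tags)
            results[app.name] = app.run(data)
        return results

# Usage with a stubbed SCADA reader:
def fake_scada(tags):
    return {t: 10.0 for t in tags}

server = SupervisorServer(fake_scada)
server.register(Application("leak_check", ["flow_in", "flow_out"], 60,
                            lambda d: d["flow_in"] - d["flow_out"]))
print(server.tick())   # {'leak_check': 0.0}
```

Swapping `fake_scada` for a driver against a different SCADA changes nothing in the applications, which is the independence the abstract claims.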

2021 ◽  
Author(s):  
Christopher White ◽  
Joanne Robbins ◽  
Daniela Domeisen ◽  
Andrew Robertson

Subseasonal-to-seasonal (S2S) forecasts are bridging the gap between weather forecasts and long-range predictions. Decisions in various sectors are made on this forecast timescale, so there is strong demand for this new generation of predictions. While much of the focus in recent years has been on improving forecast skill, if S2S predictions are to be used effectively, it is important that, alongside scientific advances, we also learn how best to develop, communicate and apply these forecasts.

In this paper, we present recent progress in the applications of S2S forecasts and provide an overview of ongoing and emerging activities and initiatives from across the wider weather and climate applications and user communities, as follows:

- To support an increased focus on applications, an additional science sub-project focused on S2S applications has been launched under the World Meteorological Organization WWRP-WCRP S2S Prediction Project: http://s2sprediction.net/. This sub-project will provide a focal point for research directed towards S2S applications by exploring the value of applications-relevant S2S forecasts and highlighting the opportunities and challenges facing their uptake.
- Also supported by the S2S Prediction Project, the ongoing Real-Time Pilot initiative (http://s2sprediction.net/file/documents_reports/16Projects.pdf) is making S2S forecasts available to 15 selected projects that are addressing user needs over a two-year period (November 2019 through November 2021). By making this real-time data available, the initiative is drawing on the collective experiences of the researcher and user communities across the projects. The Real-Time Pilot will develop best-practice guidelines for producing useful and usable, application-orientated forecasts and tools that can guide future S2S application development. We will present an update on the initiative, including results from an initial set of questionnaires that focused on engagement strategies and practices, supporting a review of how projects were designed, the roles and responsibilities of different project participants, and the methods used to determine project success.
- To increase the uptake and use of S2S forecasts more widely across the research and user communities, we present a new initiative: a global network of researchers, modellers and practitioners focused on S2S applications, called S2Sapp.net – a community with a shared aim of exploring and promoting cross-sectoral services and applications of this new generation of predictions.
- Finally, we will provide an update on a recently submitted applications community review paper covering sectoral applications of S2S predictions, including public health, disaster preparedness, water management, energy and agriculture. Drawing on the experience of researchers and users working with S2S forecasts, we explore the value of applications-relevant S2S predictions through a series of sectoral cases where uptake is starting to occur.


2002 ◽  
Vol 36 (1) ◽  
pp. 29-38 ◽  
Author(s):  
Ray Berkelmans ◽  
Jim C. Hendee ◽  
Paul A. Marshall ◽  
Peter V. Ridd ◽  
Alan R. Orpin ◽  
...  

With recent technological advances and a reduction in the cost of automatic weather stations and data buoys, the potential exists for significant advancement in science and environmental management using near real-time, high-resolution data to predict biological and/or physical events. However, real-world examples of how this potential wealth of data has been used in environmental management are few and far between. We describe in detail two examples where near real-time data are being used for the benefit of science and management. These include a prediction of coral bleaching events using temperature, light and wind as primary predictor variables, and the management of a coastal development where dynamic discharge quality limits are maintained with the aid of wind data as a proxy for turbidity in receiving waters. We argue that the limiting factors for the use of near real-time environmental data in management are frequently not the availability of the data, but the lack of knowledge of the quantitative relationships between biological/physical processes or events and environmental variables. We advocate renewed research into this area and an integrated approach to the use of a wide range of data types to deal with management issues in an innovative, cost-effective manner.
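The bleaching prediction the abstract mentions, driven by temperature, light and wind, can be illustrated with a simple threshold rule: warm water plus high light plus low wind (little mixing or evaporative cooling) triggers an alert. The thresholds below are illustrative placeholders, not the authors' calibrated values.

```python
def bleaching_alert(temp_c: float, light_w_m2: float, wind_m_s: float,
                    temp_thresh: float = 30.0,
                    light_thresh: float = 600.0,
                    wind_thresh: float = 3.0) -> bool:
    """Flag conditions conducive to coral bleaching: all three predictor
    conditions must hold simultaneously."""
    return (temp_c >= temp_thresh
            and light_w_m2 >= light_thresh
            and wind_m_s <= wind_thresh)

print(bleaching_alert(31.2, 750.0, 1.5))   # True: hot, bright, calm
print(bleaching_alert(28.0, 750.0, 1.5))   # False: water below threshold
```

Fed with near real-time station data, a rule of this shape is what turns raw observations into an operational early-warning product.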


2000 ◽  
Vol 47 (2) ◽  
pp. 132-135 ◽  
Author(s):  
A.L. Veiga ◽  
M.A. Mayosky ◽  
N. Martinez

2021 ◽  
Author(s):  
John McIntosh ◽  
Renata Martin ◽  
Pedro Alcala ◽  
Stian Skjævesland ◽  
John Rigg

Abstract The paper describes a project known internally as "InWell" to address multiple requirements in Repsol Drilling & Completions. InWell is defined by a new operating model comprising Governance, People, Process, Functions and Technology. This paper addresses changes to the Technology element, often referred to as "digitalization". The paper includes a discussion of the business transformation strategy and case studies addressing three of the 18 functionalities identified in the first round of development. The InWell development strategy followed four steps: identification of performance issues, envisioning of a future operating model, identification of the functionalities required to support this operating model, and matching to digital solutions. Our case studies focus on three functionalities provided by three separate companies: unification of planning and compliance, real-time data aggregation, and key performance indicators. Each functionality was addressed with an existing commercial application customized to meet specific requirements. A corporate web-based Well Construction Process (WCP) was initially piloted and then extended to include all well projects. The WCP identifies the key tasks that must be completed per project, and these are all tracked. Data from this application is used by a third-party business analytics application via an API. Real-time data from many sites and a wide range of sources was aggregated, standardized, quality controlled and stored within a private secure cloud. The data collation service is an essential building block for current third-party applications such as the operating centre and is a prerequisite for the goal of increased automation. A suite of operator-specific key performance indicators (KPIs) and data analytics services was developed for drilling and completions. Homogenized KPIs for all business units provide data for objective performance management and apples-to-apples comparison.
Results are presented via custom dashboards, reports, and integrations with third-party applications to meet a wide range of requirements. During a four-month pilot phase the InWell project delivered €2.5 million in tangible savings through improvements in operational performance. In the first 12 months, €16 million in savings were attributed to InWell. By 2022, forecast savings are expected to exceed €60 million (Figures 1 and 2). The value of intangible benefits is thought to exceed these objective savings. Figure 1: The Business Case for InWell – Actual and Projected Savings and Costs. Figure 2: InWell Services addressing Value Levers and quantified potential impact. A multi-sourced digital strategy can produce quick gains, is easily adapted, and provides high value at low risk. The full benefit of digital transformation can only be realised when supported by an effective business operating model.
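The homogenized-KPI idea, computing the same metric the same way for every business unit so comparisons are apples-to-apples, can be sketched as below. The KPI chosen (metres drilled per rig-day) and all names are illustrative assumptions, not InWell's actual definitions.

```python
def drilling_kpi(metres_drilled: float, rig_days: float) -> float:
    """One shared KPI definition applied identically to every unit."""
    return metres_drilled / rig_days

def rank_units(records):
    """records: list of (unit, metres_drilled, rig_days).
    Returns (unit, kpi) pairs, best performer first."""
    scored = [(unit, drilling_kpi(m, d)) for unit, m, d in records]
    return sorted(scored, key=lambda pair: pair[1], reverse=True)

units = [("BU-North", 4200.0, 30.0),   # 140 m per rig-day
         ("BU-South", 5100.0, 45.0)]   # ~113 m per rig-day
print(rank_units(units))
```

The point is organisational rather than computational: once every unit reports through the same formula, a dashboard ranking like this becomes meaningful.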


2020 ◽  
Author(s):  
Terry Hock ◽  
Tammy Weckwerth ◽  
Steve Oncley ◽  
William Brown ◽  
Vanda Grubišić ◽  
...  

The National Center for Atmospheric Research Earth Observing Laboratory (EOL) proposes to develop the LOwer Troposphere Observing System (LOTOS), a new integrated sensor network that offers the potential for transformative understanding of the lower atmosphere and its coupling to the Earth's surface.

The LOTOS sensor network is designed to allow simultaneous and coordinated sampling both vertically, through the atmospheric planetary boundary layer, and horizontally, across the surrounding landscape, focusing on the land-atmosphere interface and its coupling with the overlying free troposphere. The core of LOTOS will be a portable integrated network of up to five nodes, each consisting of a profiling suite of instruments surrounded by up to fifteen flux-measuring towers. LOTOS will provide an integrated set of measurements needed to address outstanding scientific challenges related to processes within the atmospheric surface layer, boundary layer, and lower troposphere. LOTOS will also enable novel quantification of exchanges of biogeochemical and climate-relevant gases from the microscale up to the regional scale.

LOTOS' uniqueness lies not only in its ability to simultaneously sample both horizontally and vertically as an integrated system, but also in its flexibility to be easily relocated as a portable, field-deployable system suitable for addressing a wide range of research needs. LOTOS will provide real-time data quality control, combine measurements from a variety of sensors into integrated data products, and provide real-time data displays. It is envisioned that LOTOS will become part of the deployable NSF Lower Atmosphere Observing Facilities (LAOF) and thus be available to a broad base of NSF users from both the observational and modeling communities. LOTOS offers the potential for transformative understanding of the Earth and its atmosphere as a coupled system. This presentation will describe the background, motivation, plan, and timeline for LOTOS' proposed development.
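The real-time data quality control the abstract promises typically starts with per-sample checks of the kind sketched below: a physical-range test plus a spike test against the last accepted value. The limits shown (for an air-temperature channel) are illustrative assumptions, not LOTOS specifications.

```python
def qc_flag(value, prev, lo=-40.0, hi=60.0, max_step=5.0):
    """Return 'good', 'range' or 'spike' for an air-temperature sample (°C).
    'range' = outside physically plausible limits;
    'spike' = implausible jump from the previous accepted sample."""
    if not (lo <= value <= hi):
        return "range"
    if prev is not None and abs(value - prev) > max_step:
        return "spike"
    return "good"

stream = [21.3, 21.5, 80.0, 21.6, 30.2]   # raw samples, two of them bad
prev, flags = None, []
for v in stream:
    f = qc_flag(v, prev)
    flags.append(f)
    if f == "good":
        prev = v        # only accepted samples update the reference
print(flags)   # ['good', 'good', 'range', 'good', 'spike']
```

Flagging rather than discarding keeps the raw record intact while letting integrated data products and real-time displays use only accepted samples.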


Author(s):  
Tommi Inkinen ◽  
Reima Helminen ◽  
Janne Saarikoski

Digitalization is frequently addressed in recent economic and social scientific literature. This paper draws a distinction between digital data (raw data) and digital technologies (including both software platforms and hardware solutions). Open data is defined here as publicly available, free-of-charge data (information content) that is machine readable. Open data enables software and application development by external partners and users. A common feature of open-data applications is location-based identification (e.g., real-time traffic monitoring). These include spatial map visualizations and monitoring of traffic and modes of transport. As these study results indicate, this visualized information provides additional support for data-based decision-making and management. This information is particularly valuable in decisions concerning unconventional and sudden events. This research indicates that the most suitable data resources for opening include information related to port transport infrastructure. In terms of temporal monitoring, static road and rail data is currently the most promising candidate for open data in ports. The main reasons are that these data sources are already at least partly published, although not always in open-data formats. Static data is also a sound starting point because its technical requirements are much less demanding than those of real-time data processing and management.


2012 ◽  
Vol 546-547 ◽  
pp. 481-485
Author(s):  
Ye Hui Liu

The common data acquisition system proposed in this paper implements both a monitoring layer and a data collection layer: the ARM9 system board hardware and software environment is constructed, the STM8L acquisition board is designed, and application modules are implemented for MODBUS master, MODBUS slave, parameter setting, real-time data communication, real-time graphical display, local data storage, ADC, digital filtering and so on. System tests show that the system achieves the desired effect; in particular, the STM8L ADC data acquisition board is highly precise and consumes little power.
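One of the listed firmware modules, digital filtering, is commonly a moving average that smooths noisy raw ADC counts. The paper does not specify which filter the STM8L board uses; the sketch below assumes a simple windowed mean for illustration.

```python
from collections import deque

class MovingAverage:
    """Fixed-window moving-average filter for raw ADC samples."""
    def __init__(self, n: int):
        self._buf = deque(maxlen=n)   # oldest sample drops out automatically

    def update(self, sample: float) -> float:
        self._buf.append(sample)
        return sum(self._buf) / len(self._buf)

f = MovingAverage(4)
readings = [100, 104, 96, 100, 120]          # raw ADC counts with a spike
smoothed = [f.update(r) for r in readings]
print(smoothed)   # [100.0, 102.0, 100.0, 100.0, 105.0]
```

On a microcontroller the same logic would use a fixed-size ring buffer and integer arithmetic, but the behaviour, damping the 120-count spike to 105, is the same.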


2013 ◽  
Vol 711 ◽  
pp. 629-635 ◽  
Author(s):  
Basem Almadani ◽  
Anas Al-Roubaiey ◽  
Rashad Ahmed

In recent years, there has been growth in the amount of data used to improve production processes. As a result, cross-communication and interaction between components have increased and become a key property of modern production systems. To reduce the cost of communication and to increase the efficiency of these systems, middleware technologies such as CORBA, COM+, Java RMI, and Web services are being used. Middleware is increasingly the essential factor in improving current manufacturing process automation and control systems, and demand for it has grown in real-time distributed manufacturing applications. In this paper, we propose a publish-subscribe middleware architecture based on the Data Distribution Service (DDS) open standard. The proposed architecture aims to seamlessly integrate existing heterogeneous manufacturing systems, such as SCADA systems, DCSs, PLCs, and databases; improve communication and real-time data delivery among system components; and provide a QoS support layer between the communicating components. Furthermore, the proposed architecture is implemented and evaluated through extensive experiments.
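The decoupling that makes DDS attractive here is the publish-subscribe pattern: producers publish samples on named topics and consumers subscribe to topics, with neither side knowing the other. The toy broker below illustrates only that pattern; real DDS adds peer discovery, typed topics and configurable QoS policies, none of which are modelled here.

```python
from collections import defaultdict
from typing import Callable, Dict, List

class Broker:
    """Minimal topic-based publish-subscribe broker (illustrative only)."""
    def __init__(self):
        self._subs: Dict[str, List[Callable]] = defaultdict(list)

    def subscribe(self, topic: str, callback: Callable) -> None:
        self._subs[topic].append(callback)

    def publish(self, topic: str, sample) -> None:
        # Deliver the sample to every subscriber of this topic.
        for cb in self._subs[topic]:
            cb(sample)

broker = Broker()
received = []
broker.subscribe("plc/temperature", received.append)   # e.g. a SCADA consumer
broker.publish("plc/temperature", {"sensor": "T-101", "value": 71.5})
print(received)   # [{'sensor': 'T-101', 'value': 71.5}]
```

Because a PLC gateway publishing on "plc/temperature" never references its consumers, new systems (historians, dashboards, databases) can be integrated by subscribing, without touching the producers.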


2007 ◽  
Vol 10 (03) ◽  
pp. 241-250 ◽  
Author(s):  
Hani Elshahawi ◽  
Mohamed Naguib Hashem ◽  
Daniel McKinney ◽  
Mario Ardila ◽  
Cosan Ayan

Summary Modern wireline formation testers (WFTs) are able to collect a massive amount of data at multiple depths, thus helping to quantify changes in rock and fluid properties along the wellbore, to define hydraulic flow units, and to understand the reservoir architecture. They are being used routinely in a wide range of applications spanning pressure and mobility profiling vs. depth, fluid sampling, downhole fluid analysis (DFA), interval pressure-transient testing (IPTT), and microfracturing. Because of the complex tool strings and the elaborate operational aspects involved in wireline formation testing, success requires detailed upfront planning and procedural design as well as real-time operational and interpretational support. It is becoming increasingly critical for operating and service company experts to remotely monitor and interpret WFT surveys in real time through Web-based systems. The importance of meeting all rock and fluid data-acquisition objectives cannot be overstated, given the high cost of offshore operations and the implications of obtaining false or misleading information. The main objective of real-time monitoring remains to assure that the planned data are acquired according to pre-established procedures and contingency plans. However, even in developed reservoirs, unexpected circumstances arise, requiring immediate response and modifications to the preplanned job procedures. Unexpectedly low or high mobilities, probe plugging, unanticipated fluid types, the presence of multiple phases, and excessive fluid contamination are but a few examples of such circumstances that would require real-time decision making and procedural modifications. 
Real-time decisions may include acquiring more pressure data points, extending sampling depths to several zones, extending or shortening sampling times, and repeating microhydraulic fracture reopening/closure cycles, as well as real-time permeability, composition, or anisotropy interpretation to determine optimum transient durations. This paper describes several examples of formation tester surveys that have been remotely monitored in real time to ensure that all WFT evaluation objectives are met. The power of real-time monitoring and interpretation will be illustrated through these case studies. Introduction WFT has become a standard part of the evaluation program of most newly drilled wells, but the objectives vary from offshore deepwater exploration and appraisal wells to old cased-hole and development wells later in the life of a field. Given the wide range of applications and combinations, each WFT evaluation program is unique. Some may include only a pressure-gradient survey for reservoir depletion and communication information, whereas others may seek information on the precise nature of the hydrocarbon fluids and water in terms of chemical and physical properties, phase behavior, and commingling tendencies. Cased-hole surveys might look for bypassed hydrocarbon zones or have objectives that could not be achieved during the openhole phase. Regardless of the type of survey performed, understanding the exploration and appraisal or field-development objectives and translating these into acquisition objectives is essential for success. Figs. 1 and 2 schematically illustrate the real-time monitoring concept. Real-time data are viewable by authorized personnel anywhere around the world, thus allowing virtual collaboration between field staff and off-site service- and operating-company experts throughout the operation. This paper includes several examples of WFT surveys that were monitored and supervised in real time. 
The cases presented span the entire spectrum of WFT applications, including pressures, gradients, sampling, DFA, IPTT, and microfracturing. The power of real-time monitoring and interpretation is clearly illustrated by these examples.
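The pressure-gradient surveys mentioned above rest on a simple calculation: fit a straight line to pretest pressure versus depth, and read the slope as an equivalent fluid gradient (fresh water is about 0.433 psi/ft per g/cm³). The sketch below shows that step; the fluid-labelling cut-offs are illustrative rules of thumb, not values from this paper.

```python
def fit_gradient(depths_ft, pressures_psi):
    """Least-squares slope of pressure vs depth, in psi/ft."""
    n = len(depths_ft)
    mx = sum(depths_ft) / n
    my = sum(pressures_psi) / n
    num = sum((x - mx) * (y - my) for x, y in zip(depths_ft, pressures_psi))
    den = sum((x - mx) ** 2 for x in depths_ft)
    return num / den

def fluid_label(grad_psi_ft):
    """Rough fluid identification from the gradient (illustrative cut-offs)."""
    if grad_psi_ft < 0.20:
        return "gas"
    if grad_psi_ft < 0.40:
        return "oil"
    return "water"

depths = [10000, 10050, 10100, 10150]             # ft, pretest stations
pressures = [5000.0, 5017.5, 5035.0, 5052.5]      # psi, a 0.35 psi/ft trend
g = fit_gradient(depths, pressures)
print(round(g, 3), fluid_label(g))   # 0.35 oil
```

Running this fit as each pretest arrives is exactly the kind of real-time interpretation that lets the monitoring team add stations or move sampling depths while the tool is still in the hole.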

