How modern data management software and state-of-the-art inspection gauges improve the efficiency of the pipeline coating inspection process

2019 · Vol. 121 · pp. 05001
Author(s):  
David Barnes

The inspection of pipeline coating is crucial to the lifetime performance of the pipeline. Inspection during installation of the pipeline and as part of the routine maintenance programme is essential. It is often said that inspection processes save money by ensuring that relevant specifications are achieved, but that writing reports for the inspection process costs money. One way to reduce the cost of inspection reporting and to speed up the inspection process is to use a data management system to present the inspection data in a consistent and organised manner. Automating the reporting process is an important cost saving that allows more time to be allocated to the important tasks of inspection and achieving the coating specification. Recent developments in both the design of reporting software and inspection gauges together make a paperless quality assurance system a reality for all protective coating applications. This paper describes the latest design and operational features of coating thickness gauges, dewpoint meters, surface profile gauges and other related gauges, and describes how data can be easily transferred from the memory of these gauges to personal computers and mobile devices by running a dedicated software program for coating inspection data management. The creation of reports combining test results from a broad range of both digital and non-digital test methods is discussed, with particular emphasis on the use of standard reports and the preparation of pre-formatted report forms.
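As a hedged illustration of the gauge-to-report automation this abstract describes, the sketch below ingests a hypothetical CSV export from a coating thickness gauge and summarizes it against a specification limit. The file layout, column names, and the 250 µm threshold are assumptions for illustration, not the actual software discussed in the paper.

```python
import csv
from statistics import mean

# Hypothetical CSV export from a coating thickness gauge:
# columns "location" and "thickness_um" are assumed for illustration.
SPEC_MIN_UM = 250.0  # assumed minimum dry-film thickness


def summarize(path: str) -> dict:
    """Summarize gauge readings against the specification limit."""
    with open(path, newline="") as f:
        rows = list(csv.DictReader(f))
    readings = [float(r["thickness_um"]) for r in rows]
    failures = [r["location"] for r in rows
                if float(r["thickness_um"]) < SPEC_MIN_UM]
    return {
        "count": len(readings),
        "mean_um": round(mean(readings), 1),
        "min_um": min(readings),
        "out_of_spec": failures,  # locations needing re-coating
    }


if __name__ == "__main__":
    print(summarize("gauge_export.csv"))
```

A dedicated data management program would then merge such summaries with dewpoint, surface profile and other readings into a pre-formatted report form.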

2021 · Vol. 20 (10) · pp. 1861–1873
Author(s):  
Vasilii S. DOSIKOV, Viktor A. KALMYKOV

Subject. The article addresses the prospective development of the civil shipbuilding sector in Russia. It considers digitalization of design, construction and production as a key driver of increasing labor productivity at enterprises and organizations of the industry. Objectives. The aim is to present a conceptual model of a digital network-centric data management system for calculating the cost of works (services) performed (provided) by design organizations in the national civil shipbuilding industry, as a promising tool to improve the efficiency of cost estimation in the industry. Methods. We employ generally accepted methods of scientific inquiry, like analysis, synthesis, formalization and concretization, composition and decomposition, verification and expert evaluation. Results. We formulated and described a model of a digital network-centric data management system. It consists of two elements, i.e. a centralized server management system for a cluster database and a client complex. The latter contains a personal calculation module, which is customizable to the needs of the organization, and a local (personal) database. Conclusions. If implemented, the presented conceptual model will enable a new level of quality, speed, and accuracy in the cost estimation of designed objects, as well as more effective accumulation and processing of relevant data for management decisions.
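A minimal sketch of how the article's two-element model might be laid out in code, assuming invented class names, fields and rates; the actual system is only described conceptually in the abstract.

```python
from dataclasses import dataclass, field

# Hypothetical structures mirroring the two-element model: a centralized
# cluster-database server plus a client complex holding a customizable
# personal calculation module and a local (personal) database.


@dataclass
class CentralServer:
    """Centralized management system for the cluster database."""
    rates: dict = field(default_factory=dict)  # shared cost rates per work type

    def publish(self, work_type: str, rate: float) -> None:
        self.rates[work_type] = rate


@dataclass
class ClientComplex:
    """Client complex: personal calculation module + local database."""
    server: CentralServer
    org_multiplier: float = 1.0               # customization per organization
    local_db: list = field(default_factory=list)

    def estimate(self, work_type: str, hours: float) -> float:
        cost = self.server.rates[work_type] * hours * self.org_multiplier
        self.local_db.append({"work": work_type, "hours": hours, "cost": cost})
        return cost


server = CentralServer()
server.publish("hull_design", 120.0)          # assumed rate, for illustration
client = ClientComplex(server, org_multiplier=1.15)
print(client.estimate("hull_design", 400))    # 55200.0
```

The split keeps shared reference rates on the server while each design organization tunes and stores its own estimates locally, which is the network-centric aspect the model emphasizes.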


Author(s):  
Bruce Dupuis, Jason Humber

For the majority of pipeline operators struggling to establish the business case for data management, records management, or geographic information systems, a step beyond the traditional information technology approach of return on investment (ROI) must be taken. Traditional information technology value propositions are founded on information efficiencies that are, for the most part, extremely difficult to quantify, since the processes are either not presently performed or the effort associated with the existing process has not been measured. Without a baseline for the existing process, a comparative analysis using improved efficiencies cannot be quantified to substantiate a return on investment. Justifying a data management system by comparing its cost to the cost of the data it manages (e.g. ILI, excavation, CIS, etc.) is compelling, since the system typically costs only on the order of 2–10% of that data, but even this metric is usually too general an argument for most pipeline integrity managers to feel comfortable defending. This paper explores the process required to unearth the value of data management to support pipeline integrity. Many examples and cases are discussed to support this approach to establishing the value of data management for pipeline integrity.
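To make the 2–10% metric concrete, here is a small worked example; the program costs are invented for illustration and are not figures from the paper.

```python
# Illustrative only: invented annual integrity-program data costs (USD).
data_costs = {
    "in-line inspection (ILI)": 1_500_000,
    "excavation digs": 800_000,
    "close-interval survey (CIS)": 200_000,
}
total_data_cost = sum(data_costs.values())   # 2,500,000
dm_system_cost = 125_000                     # assumed data management system cost

ratio = dm_system_cost / total_data_cost
print(f"DM system cost is {ratio:.1%} of the data it manages")  # 5.0%
```

The point of the paper is that even when this ratio lands squarely in the 2–10% band, it remains too coarse an argument on its own, which is why the authors dig into process-level value instead.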


1999 · Vol. 1999 (1) · pp. 943–945
Author(s):  
Alain Lamarche, Jack Ion, Edward H. Owens, Peter Rubec

ABSTRACT Shoreline Cleanup Assessment Teams (SCAT) are now used worldwide to assess oiled shorelines as part of response cleanup activities. The amount of SCAT information gathered during surveys can be very large, with the potential to overwhelm decision makers. New tools are now available to automate the processing of SCAT information. For example, dedicated computerized SCAT data management systems were used during the Iron Baron (Tasmania) and Kure (California) incidents. More recently, a prototype system was developed by the State of Florida to electronically support all the steps involved in the cleanup phase of an oil spill response. Given this, when should computerized SCAT data management be used, and at what level? An analysis of the work performed during recent spills involving SCAT activity provided answers to these questions. The main findings include the following: (1) computerized systems can decrease the time needed to gather data and increase the accuracy of the captured data; (2) computerized systems decrease data turnaround time and speed up the decision-making cycle; (3) an all-electronic computerized system can become essential when the length of oiled shoreline is very large relative to the number of SCAT survey teams; and (4) for large spills, the increased cost of an all-electronic system may be outweighed by the cost of not being prepared.


2019 · Vol. 4 (Suppl 3) · pp. A46.1–A46
Author(s):  
Harry Van Loen, Mary Thiongo, Yven Van Herrewege

Background: Awareness of data management (DM) is often restricted to 'the cost of computers' or 'the need for a database'. Recently, 'data sharing' can be added to this shortlist. Indeed, in recent years data sharing has so often been required or so strongly promoted that the importance of all other aspects of DM and data handling in clinical trials has tended to be overlooked. However, the development of data-sharing guidelines and associated privacy regulations (e.g. the EU General Data Protection Regulation) has created new momentum for highlighting the importance of qualitative data management.
Methods: An overview of DM processes is given, within the framework and challenges of conducting non-commercial clinical trials in North–South partnerships.
Results: The DM workflow of a clinical trial is presented, highlighting essential DM tasks, deliverables and milestones. Pre-study tasks and deliverables are addressed: SOPs, a data management plan, the implementation of a GCP-compliant validated data management system, and compliance with data quality, privacy, security and standards (e.g. MedDRA, CDISC). Subsequent study-specific processes, including the collection, entry, querying and cleaning of the data, are discussed. In addition, DM metrics important for guiding quality, productivity and timelines are reviewed, along with their impact on post-study activities such as data sharing.
Conclusion: Data sharing is only one of many DM tasks, coming at the end of the DM workflow. Focusing too much on data sharing while neglecting other DM aspects may lead to underestimating the workload, resources, quality assurance and time needed for data management, and by extension for the trial itself. Integrating data sharing into a holistic vision of data management is paramount for clinical research.
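A hedged sketch of the kind of DM workflow the abstract outlines: the stage names follow the text, while the tracking structure and the DATABASE_LOCK milestone are assumptions added for illustration.

```python
from enum import Enum, auto


# Stage names follow the abstract; ordering and tracking are illustrative.
class Stage(Enum):
    SOPS = auto()
    DATA_MANAGEMENT_PLAN = auto()
    VALIDATED_DM_SYSTEM = auto()   # GCP-compliant, validated
    DATA_COLLECTION = auto()
    DATA_ENTRY = auto()
    QUERY_MANAGEMENT = auto()
    DATA_CLEANING = auto()
    DATABASE_LOCK = auto()         # assumed pre-sharing milestone
    DATA_SHARING = auto()          # last of many DM tasks


def remaining_before_sharing(done: set) -> list:
    """List the DM stages still open before data sharing can start."""
    return [s for s in Stage
            if s is not Stage.DATA_SHARING and s not in done]


done = {Stage.SOPS, Stage.DATA_MANAGEMENT_PLAN}
print(remaining_before_sharing(done))
```

Enumerating the stages this way makes the abstract's point visible: data sharing sits behind a long queue of prerequisite work that is easy to underestimate.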


2020 · Vol. 3 (2) · p. 26
Author(s):  
Yinan Xia, Geyi Wen, Fan Zhou, Gang Chen, Peng Wang

In order to ensure the safety of buildings, equipment, personnel, property and production, to prevent or mitigate disasters caused by lightning and static electricity, and to avoid serious lightning accidents, lightning protection inspection (acceptance) is of great significance. At present, the inspection (acceptance) of lightning protection devices faces the following problems: the professional and technical equipment used for inspection is varied and differs in performance; the market demand for lightning protection safety inspection is large and the number of entrusting units keeps growing; and the progress of an inspection is difficult to obtain accurately at any time and place. We therefore propose to use mobile application technology, wireless network transmission and computer software development to design and develop an Android-based data management system for lightning protection inspection, realizing functions such as real-time, comprehensive information management of professional lightning protection inspection devices and entrusting units, and the establishment of electronic files for the lightning protection inspection of all projects.
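A hedged sketch of the mobile-to-server data flow such a system implies. Field names, record contents and the serialization format are all invented; the paper targets Android, but Python is used here for a compact illustration.

```python
import json
from dataclasses import dataclass, asdict
from datetime import datetime, timezone


@dataclass
class InspectionRecord:
    """One field measurement captured on the mobile device (fields assumed)."""
    project_id: str
    device_type: str               # e.g. surge arrester, grounding grid
    grounding_resistance_ohm: float
    inspector: str
    timestamp: str


def build_payload(record: InspectionRecord) -> bytes:
    """Serialize a record for wireless upload to the central database."""
    return json.dumps(asdict(record)).encode("utf-8")


rec = InspectionRecord(
    project_id="LP-2020-0042",
    device_type="grounding grid",
    grounding_resistance_ohm=3.8,
    inspector="inspector-07",
    timestamp=datetime.now(timezone.utc).isoformat(),
)
payload = build_payload(rec)
# An HTTP POST of `payload` to the server's ingest endpoint would follow;
# the endpoint and authentication scheme are deployment-specific assumptions.
print(payload)
```

Capturing records in this structured form at the point of inspection is what makes the real-time progress tracking and per-project electronic files described above feasible.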


Author(s):  
Helmut Nopper, Roland Rößner, André Zander

Within the scope of PLEX, a systematic and efficient ageing and plant life management system is becoming more and more important to ensure safe and economical power plant operation in spite of continuous plant ageing. For the methodical implementation of PLIM & PLEX strategies, AREVA NP has developed the software tool COMSY. This knowledge-based program integrates degradation analysis tools with an inspection data management system. COMSY provides the capability to establish program-guided technical documentation by utilizing a virtual plant model that includes information on thermal-hydraulic operation, water chemistry conditions and the materials used in mechanical components. It provides the option to perform a plant-wide screening to identify system areas that are sensitive to degradation mechanisms typically experienced in nuclear power plants (FAC, corrosion fatigue, IGSCC, pitting, etc.). If a system area is identified as susceptible to degradation, a detailed analysis function enables a condition-oriented service life evaluation of vessels and piping systems in order to localize and conservatively quantify the effect of degradation. Based on these COMSY forecasts, specific strategies can be developed to mitigate the effect of degradation, and inspection activities can be focused on degradation-sensitive areas. In addition, a risk-informed assessment tool serves to optimize inspection activities with respect to degradation potential and the associated damage consequence. After an in-service inspection is performed at a given location, the inspection data is evaluated according to generally accepted procedures. For this purpose, an integrated inspection data management module provides standardized, interactively operated evaluation functions. The key inspection results are fed back to reflect the as-is condition of the component, and all further life evaluations of that component are calibrated against the inspection results. The compiled condition-oriented knowledge provides the basis for continuous optimization, resulting in tailored inspection and maintenance programs geared to the specific plant. This systematic closed-loop process ensures the generation of up-to-date plant documentation reflecting the technical as-is status of the plant, as all the data involved in the process are compiled in a "living" documentation structure. The implementation of COMSY in various nuclear power plants has confirmed that systematic plant life management makes good economic sense, as cost reductions can be achieved while increasing plant availability.
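A hedged sketch of the plant-wide screening step described above: components are filtered by conditions under which a degradation mechanism is plausible. The attribute names and thresholds are invented for illustration; COMSY's actual screening models are not published in this abstract.

```python
from dataclasses import dataclass


@dataclass
class Component:
    name: str
    material: str             # e.g. "carbon steel", "stainless"
    temperature_c: float      # operating temperature
    flow_velocity_ms: float   # local flow velocity


def susceptible_to_fac(c: Component) -> bool:
    """Rough screening rule for flow-accelerated corrosion (FAC):
    illustrative thresholds only, not COMSY's actual criteria."""
    return (c.material == "carbon steel"
            and 100.0 <= c.temperature_c <= 250.0
            and c.flow_velocity_ms > 2.0)


plant = [
    Component("feedwater elbow", "carbon steel", 190.0, 4.5),
    Component("auxiliary line", "stainless", 150.0, 3.0),
]
flagged = [c.name for c in plant if susceptible_to_fac(c)]
print(flagged)  # candidates for detailed service-life evaluation
```

In the closed-loop process the abstract describes, components flagged by such a screen proceed to detailed analysis, and later in-service inspection results recalibrate the evaluation for each component.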


2017 · Vol. 4 (1) · pp. 62–66
Author(s):  
Luyen Ha Nam

From long ago until today, information has held a serious position in all aspects of life, from the individual to the organization. In the ABC company, information is very sensitive and very important. But how do we keep our information safe? There are many ways to do so: on hard drives, removable discs, etc.; other organizations even maintain data centres to store their information. The objective of information security is to keep information safe from unwanted access. We applied the Risk Mitigation Action framework to our data management system, and after several months the results were far better than before: information is more secure, incidents are detected more quickly, and internal and external collaboration has improved.
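The abstract only names the Risk Mitigation Action framework, so the sketch below shows a generic risk-register step that such frameworks typically involve: scoring risks and selecting those that need mitigation. All risk names, scales and thresholds are assumptions, not details from the paper.

```python
# Generic risk-register scoring step; not the paper's actual framework.
risks = [
    {"name": "unauthorized access", "likelihood": 4, "impact": 5},
    {"name": "removable-media loss", "likelihood": 3, "impact": 4},
    {"name": "backup failure", "likelihood": 2, "impact": 3},
]
THRESHOLD = 12  # assumed score above which mitigation is mandatory

for r in risks:
    score = r["likelihood"] * r["impact"]
    action = "mitigate now" if score >= THRESHOLD else "monitor"
    print(f"{r['name']}: score={score} -> {action}")
```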


2014 · Vol. 36 (7) · pp. 1485–1499
Author(s):  
Jie SONG, Tian-Tian LI, Zhi-Liang ZHU, Yu-Bin BAO, Ge YU
