Developments in Pipeline Instrumentation

1987 ◽  
Vol 20 (1) ◽  
pp. 7-17 ◽  
Author(s):  
R A Furness

Pipelines are an integral part of the world's economy and literally billions of pounds' worth of fluids are moved each year in pipelines of varying lengths and diameters. As the cost of some of these fluids and the price of moving them has increased, so the need to measure the flows more accurately and control and operate the line more effectively has arisen. Instrumentation and control equipment has developed steadily in the past decade but not as fast as the computers and microprocessors that are now a part of most large scale pipeline systems. It is the interfacing of the new generation of digital and sometimes ‘intelligent’ instrumentation with smaller and more powerful computers that has led to a quiet but rapid revolution in pipeline monitoring and control. This paper looks at the more significant developments from the many that have appeared in the past few years and attempts to project future trends in the industry for the next decade.

Author(s):  
David C. Joy

Personal computers (PCs) are a powerful resource in the EM Laboratory, both as a means of automating the monitoring and control of microscopes, and as a tool for quantifying the interpretation of data. Not only is a PC more versatile than a piece of dedicated data logging equipment, but it is also substantially cheaper. In this tutorial the practical principles of using a PC for these types of activities will be discussed. The PC can form the basis of a system to measure, display, record and store the many parameters which characterize the operational conditions of the EM. In this mode it is operating as a data logger. The necessary first step is to find a suitable source from which to measure each of the items of interest. It is usually possible to do this without having to make permanent connections or modifications to the EM.
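A minimal sketch of this data-logging mode, in Python. The channel names, the polling interval and the read_channel stub are hypothetical stand-ins for whatever acquisition interface the microscope actually exposes; the tutorial itself does not prescribe one.

    import csv, time
    from datetime import datetime

    def read_channel(name):
        # Placeholder for the actual acquisition call (e.g. an ADC or GPIB read).
        # Returns a dummy value here; a real logger would query the instrument.
        return 0.0

    CHANNELS = ["gun_voltage", "lens_current", "vacuum_pressure"]  # hypothetical items of interest

    with open("em_log.csv", "a", newline="") as f:
        writer = csv.writer(f)
        while True:
            row = [datetime.now().isoformat()] + [read_channel(c) for c in CHANNELS]
            writer.writerow(row)   # record and store each set of readings
            f.flush()
            time.sleep(5.0)        # poll every 5 s (arbitrary interval)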


2020 ◽  
Author(s):  
Lungwani Muungo

The purpose of this review is to evaluate progress in molecular epidemiology over the past 24 years in cancer etiology and prevention to draw lessons for future research incorporating the new generation of biomarkers. Molecular epidemiology was introduced in the study of cancer in the early 1980s, with the expectation that it would help overcome some major limitations of epidemiology and facilitate cancer prevention. The expectation was that biomarkers would improve exposure assessment, document early changes preceding disease, and identify subgroups in the population with greater susceptibility to cancer, thereby increasing the ability of epidemiologic studies to identify causes and elucidate mechanisms in carcinogenesis. The first generation of biomarkers has indeed contributed to our understanding of risk and susceptibility related largely to genotoxic carcinogens. Consequently, interventions and policy changes have been mounted to reduce risk from several important environmental carcinogens. Several new and promising biomarkers are now becoming available for epidemiologic studies, thanks to the development of high-throughput technologies and theoretical advances in biology. These include toxicogenomics, alterations in gene methylation and gene expression, proteomics, and metabonomics, which allow large-scale studies, including discovery-oriented as well as hypothesis-testing investigations. However, most of these newer biomarkers have not been adequately validated, and their role in the causal paradigm is not clear. There is a need for their systematic validation using principles and criteria established over the past several decades in molecular cancer epidemiology.


Sensors ◽  
2021 ◽  
Vol 21 (10) ◽  
pp. 3515
Author(s):  
Sung-Ho Sim ◽  
Yoon-Su Jeong

As IoT technologies have developed rapidly in recent years, most IoT data have been used for monitoring and control, but the cost of collecting and linking diverse IoT data keeps increasing, so cloud servers (data centers) need the ability to proactively integrate and analyze the collected data. In this paper, we propose a blockchain-based IoT big data integrity verification technique to ensure the safety of the Third Party Auditor (TPA), which audits the integrity of AIoT data. The proposed technique aims to minimize IoT information loss by grouping information and signature keys from IoT devices into multiple blockchains. It effectively guarantees the integrity of AIoT data by linking hash values, assigned to blocks of arbitrary but constant size, with previous blocks in hierarchical chains. The technique synchronizes location information between the central server and the IoT devices so that the integrity of IoT information can be managed at low cost. To control the locations of a large number of IoT devices easily, cross-distributed and blockchain linkage processing is performed under fixed rules to reduce the load and improve the throughput generated by IoT devices.
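As a rough illustration of the hash-chain linkage described above, a short Python sketch. The block layout and field names are assumptions for illustration, not the authors' exact scheme.

    import hashlib, json, time

    def block_hash(block):
        # Hash of the block's canonical JSON serialization.
        return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

    def append_block(chain, device_readings):
        # Each new block carries the previous block's hash, so any tampering
        # with earlier IoT data breaks every later link in the chain.
        prev = block_hash(chain[-1]) if chain else "0" * 64
        block = {"timestamp": time.time(), "data": device_readings, "prev_hash": prev}
        chain.append(block)
        return block

    def verify(chain):
        # Integrity check that a TPA-style auditor could run over the whole chain.
        return all(chain[i]["prev_hash"] == block_hash(chain[i - 1]) for i in range(1, len(chain)))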


1982 ◽  
Vol 1 (18) ◽  
pp. 97
Author(s):  
J. Zacks

The cost of many coastal projects is often increased by the expensive beach repair and maintenance required to remedy the destabilising effects of structures on the adjoining coastline. Physical and/or mathematical models have been developed for use in planning these projects in order to predict and quantify the effects of marine sediment transport on the coastal topography. Such models need to be calibrated against prototype data, and one method of gauging volumetric sediment movement is by successive bathymetric/topographic profiling surveys performed seasonally and annually. Since large quantities of sediment are related to small changes in bed elevation, it is clear that this profiling needs to be done with the utmost precision. The areas most affected extend from the beach through the surf zone to water depths of about 25 metres. The surf zone in particular is a dynamic and hostile area which falls outside the traditional activities of both the hydrographic and land surveyors. Consequently, innovative methods, deficient in sound survey principle and practice, have often been pursued in this area without any attempt being made to assess the tolerance on the data. This paper attempts to show that it is possible to produce reliable and verifiable results to the required accuracy by using conventional survey equipment and techniques, and by taking the necessary precautions against the many possible sources of survey error. The procedures and techniques described have evolved from NRIO's involvement over the past decade in major projects at Richards Bay, Durban, Koeberg and in False Bay. The results of a recent verification investigation are fully reported in this paper.
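A simple worked example of why small elevation changes demand high precision: over a 1 km by 1 km survey area, a uniform bed-level change of only 5 cm corresponds to 50,000 m3 of sediment. The grid spacing and values in this Python sketch are hypothetical, not taken from the surveys reported in the paper.

    import numpy as np

    # Two successive bathymetric grids of bed elevation (m) on the same survey grid.
    dx = dy = 25.0                       # grid cell size in metres (assumed)
    z_before = np.zeros((40, 40))        # earlier survey
    z_after = np.full((40, 40), 0.05)    # later survey, bed 5 cm higher everywhere

    volume_change = np.sum(z_after - z_before) * dx * dy
    print(f"Net accretion: {volume_change:.0f} m^3")   # 5 cm over 1 km^2 -> 50,000 m^3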


2019 ◽  
Vol 5 (15) ◽  
pp. 1448-1455
Author(s):  
Venelin Terziev ◽  
Teodora Petrova

Non-motorized air systems for intelligence, monitoring and control of the earth's surface have gained currency and are used for a variety of tactical flight tasks and missions. The non-motorized aircraft (NMA) and the air-monitoring systems, comprising an on-board segment and a ground segment, are the key elements of these systems. World experience with NMA shows that they are most suitable where operating conditions are extreme and the risk to piloted aviation is unacceptable, such as reconnaissance and observation of strictly guarded sites, zones where military operations are conducted, and regions affected by large-scale fires and floods. Using people in these conditions poses a real threat to their lives, so in practice NMA are irreplaceable as a tool for collecting and processing information. Keywords: registration of images, methods, information systems, non-motorized aircrafts.


Author(s):  
Ігор Бережний ◽  
Адріан Наконечний

Based on research and a comparative analysis of existing systems, an algorithm for remote monitoring and control of a technological process using IoT technologies is proposed and developed. We consider a system with flexible algorithms that combines different data protocols over Wi-Fi, which allows this type of system to be used safely in any industry with high speed and energy efficiency and without the cost of dedicated communication lines.
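One possible concrete form of such a Wi-Fi telemetry link, sketched in Python with standard-library UDP only for illustration; the server address, port and payload fields are hypothetical, and the paper does not fix a specific transport protocol.

    import json, socket, time, random

    SERVER = ("192.168.1.10", 5005)   # hypothetical monitoring server on the Wi-Fi network
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

    while True:
        # Stand-in sensor reading from one IoT node on the process line.
        reading = {"node": "sensor-01", "temperature": 20 + random.random(), "ts": time.time()}
        sock.sendto(json.dumps(reading).encode(), SERVER)   # one datagram per measurement
        time.sleep(10)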


1995 ◽  
Vol 387 ◽  
Author(s):  
Chi Yung Fu ◽  
Loren Petrich ◽  
Benjamin Law

The cost of a fabrication line, such as one in a semiconductor house, has increased dramatically over the years, and it is possibly already past the point at which a new start-up company can have sufficient capital to build a new fabrication line. Such capital-intensive manufacturing needs better utilization of resources and management of equipment to maximize its productivity. In order to maximize the return from such a capital-intensive manufacturing line, we need to work on the following: 1) increasing the yield, 2) enhancing the flexibility of the fabrication line, 3) improving quality, and finally 4) minimizing the down time of the processing equipment. Because of the significant advances now made in the fields of artificial neural networks, fuzzy logic, machine learning and genetic algorithms, we advocate the use of these new tools in manufacturing. We term the applications to manufacturing of these and other such tools that mimic human intelligence neural manufacturing. This paper describes the effort at the Lawrence Livermore National Laboratory (LLNL) [1] to use artificial neural networks to address certain semiconductor process modeling, monitoring and control questions.
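To make the process-modeling idea concrete, a toy Python sketch of fitting a small feed-forward network to stand-in process data; the inputs, outputs, network size and data are assumptions for illustration, not the models used at LLNL. Requires numpy and scikit-learn.

    import numpy as np
    from sklearn.neural_network import MLPRegressor

    rng = np.random.default_rng(0)
    # Stand-in process data: [gas flow, RF power, pressure] -> etch rate.
    X = rng.uniform(0.0, 1.0, size=(200, 3))
    y = 2.0 * X[:, 0] + 0.5 * X[:, 1] ** 2 - X[:, 2] + rng.normal(0, 0.02, 200)

    # Small feed-forward network fitted as a process model.
    model = MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0)
    model.fit(X, y)
    print(model.predict(X[:3]))   # predicted etch rates for the first three recipes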


2013 ◽  
Vol 791-793 ◽  
pp. 857-860
Author(s):  
Hong Chun Yao ◽  
Liang Yin ◽  
Ming Yu Huang

According to the development status of large-scale PV power stations, this paper describes the topology and functions of the monitoring and control system of a large-scale PV power station. The system consists of a monitoring center, a fiber-optic ring network and a control subnet, and its main functions are SCADA (supervisory control and data acquisition), AGC (automatic generation control), AVC (automatic voltage control), system state assessment, equipment maintenance, and power generation benefit analysis. The system is flexible and reliable, and will improve the efficiency of operation and maintenance of large-scale PV power stations.
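A minimal sketch of one of the functions named above, AGC, in which the monitoring center shares a dispatch target among inverters in proportion to rated capacity; the device names, ratings and dispatch value in this Python example are hypothetical.

    # Rated inverter capacities in kW (hypothetical plant).
    inverters = {"INV-01": 500.0, "INV-02": 500.0, "INV-03": 250.0}

    def agc_setpoints(dispatch_target_kw):
        total = sum(inverters.values())
        # Each inverter receives a share proportional to its rating, capped at its rating.
        return {name: min(rated, dispatch_target_kw * rated / total)
                for name, rated in inverters.items()}

    print(agc_setpoints(900.0))   # {'INV-01': 360.0, 'INV-02': 360.0, 'INV-03': 180.0}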


Author(s):  
Yan Xunshi ◽  
Zhou Yan ◽  
Zhao Jingjing ◽  
Sun Zhe ◽  
Shi Zhengang

In order to cut costs and simplify redevelopment, this paper proposes a new design for the monitoring and control system of an active magnetic bearing. The data sampling and processing module is separated from the data display module. A single-chip processor samples data from the DSP controller and processes it into the required form. A touch screen used only for display presents the processed data to users. The design makes the system easier to develop and reduces its cost to one tenth of that of the previous design.
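A rough Python sketch of the proposed split: one loop samples raw frames from the DSP controller, converts them to engineering units and forwards them to the display-only touch screen. The serial ports, frame format and scaling factors are assumptions, not details given in the paper. Requires pyserial.

    import struct
    import serial

    dsp = serial.Serial("/dev/ttyUSB0", 115200, timeout=1)      # link to the DSP controller
    display = serial.Serial("/dev/ttyUSB1", 115200, timeout=1)  # link to the touch screen

    while True:
        frame = dsp.read(8)                       # hypothetical 8-byte frame: two raw 32-bit ints
        if len(frame) < 8:
            continue
        raw_pos, raw_cur = struct.unpack("<ii", frame)
        position_um = raw_pos * 0.01              # convert to engineering units (assumed scaling)
        current_a = raw_cur * 0.001
        display.write(f"{position_um:.2f},{current_a:.3f}\n".encode())  # processed data for display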


Author(s):  
Bernhard Rieder ◽  
Òscar Coromina ◽  
Ariadna Matamoros-Fernández

Over the past 15 years, YouTube has emerged as a large and dominant social media service, giving rise to a ‘platformed media system’ within its technical and regulatory infrastructures. This paper relies on a large-scale sample of channels (n=36M+) to explore this media system along three main lines. First, we investigate stratification and hierarchization in broadly quantitative terms, connecting to well-known tropes on structural hierarchies emerging in networked systems, where a small number of elite actors often dominate visibility. Second, we inquire into YouTube’s channel categories, their relationships, and their proportions as a means to better understand the topics on offer and their relative importance. Third, we analyze channels according to country affiliation to gain insights into the dynamics and fault lines that align with country and language. Throughout the paper, we emphasize the inductive character of this research, by highlighting the many follow-up questions that emerge from our findings.
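A toy illustration of the kind of concentration measure implied by the first line of analysis: the share of total views held by the top 1% of channels, computed here in Python from a heavy-tailed synthetic distribution. The data is purely hypothetical and not drawn from the 36M-channel sample.

    import numpy as np

    rng = np.random.default_rng(1)
    views = rng.pareto(1.2, size=100_000)          # heavy-tailed stand-in for per-channel views
    views.sort()
    top_1pct_share = views[-len(views) // 100:].sum() / views.sum()
    print(f"Top 1% of channels account for {top_1pct_share:.0%} of all views")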

