Large Standoff Magnetometry As a Practical Screening and Monitoring Tool for Pipelines Subjected to Geohazards

Author(s):  
Tianzong (David) Xu

Abstract Large standoff magnetometry (LSM), a non-intrusive NDE technology, has been used for many years in commercial trials for above-ground detection of underground pipeline anomalies associated with stress concentration zones (SCZs). As a passive geo-magnetization flux leakage measurement method, it has mainly targeted common anomalies such as corrosion, gouges/dents and cracks, which are often highly localized and small in scale. Insufficient consistency and reliability remain the major concern, owing to the technical challenges of achieving high resolution and adequate signal strength at large standoff distances. In comparison, geohazard-related external forces induce much larger-scale elevated stresses/strains with stronger stress-magnetization signals. The lack of economically viable solutions for pipeline screening and monitoring under geohazard conditions therefore offers a good opportunity for LSM to establish its market position. This work is part of Pacific Gas and Electric Company's effort to gain a better fundamental understanding of state-of-the-art LSM technology and its potential to enhance current industry practices for pipeline assessment under geohazard conditions, specifically: 3D mapping of pipelines including depth of cover (DOC) measurement, locating girth welds and peak stresses/strains with risk rating, verification of strain-relief operations, and continuous monitoring afterwards. Inline inspection (ILI) and geotechnical analysis data, together with field excavation and strain-gauge data, are used as references to cross-check the LSM results. The outcomes indicate that geohazard assessment is very likely to become a viable application for LSM technology in the near future.
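A minimal illustration of the signal-strength challenge noted above: the field of a point magnetic dipole, used as a crude stand-in for a localized stress-concentration anomaly, falls off roughly with the cube of the standoff distance. This sketch is not part of the PG&E study; the dipole moment and standoff values are arbitrary assumptions.

```python
import math

# Sketch: on-axis field of a point magnetic dipole measured from directly above.
# The equivalent dipole moment and the standoff distances are made-up values.
MU0 = 4e-7 * math.pi      # vacuum permeability, T*m/A
MOMENT = 1.0              # assumed equivalent dipole moment, A*m^2

def on_axis_field_nT(standoff_m, m=MOMENT):
    """B = mu0/(4*pi) * 2*m / r^3, converted to nanotesla."""
    return MU0 / (4.0 * math.pi) * 2.0 * m / standoff_m**3 * 1e9

for r in (0.5, 1.0, 2.0, 3.0):   # standoff in metres (depth of cover + sensor height)
    print(f"standoff {r:3.1f} m -> {on_axis_field_nT(r):9.2f} nT")
# The ~1/r^3 decay shows why small, localized defects are hard to resolve from
# the surface, whereas broad geohazard-induced stress zones retain usable signal.
```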

Author(s):  
Anjan Pakhira ◽  
Peter Andras

Testing is a critical phase in the software life-cycle. While small-scale, component-wise testing is done routinely as part of the development and maintenance of large-scale software, system-level testing of the whole software is much more problematic, owing to the low coverage of potential usage scenarios by test cases and the high costs associated with wide-scale testing of large software. Here, the authors investigate the use of cloud computing to facilitate the testing of large-scale software. They discuss aspects of cloud-based testing and provide an example application: testing the functional importance of class methods in the Google Chrome software. The tested methods are predicted to be functionally important with respect to a particular functionality of the software; the authors make these predictions by applying network analysis to dynamic analysis data generated by the software. They check the validity of the predictions by mutation testing of a large number of mutated variants of Google Chrome. The chapter provides details of how to set up the testing process on the cloud and discusses relevant technical issues.
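As a rough sketch of the prediction step described above, one can build a call graph from dynamic-analysis traces and rank methods by a centrality measure; the edge list and the choice of betweenness centrality below are illustrative assumptions, not the authors' exact procedure.

```python
import networkx as nx

# Hypothetical dynamic-analysis trace: (caller, callee) pairs recorded while
# exercising one functionality of the software under test.
call_edges = [
    ("Browser::Start", "TabManager::Open"),
    ("TabManager::Open", "Renderer::Paint"),
    ("TabManager::Open", "Network::Fetch"),
    ("Network::Fetch", "Cache::Lookup"),
    ("Renderer::Paint", "Cache::Lookup"),
]
graph = nx.DiGraph(call_edges)

# Methods with high centrality are predicted to be functionally important and
# become candidates for mutation testing on the cloud.
ranking = sorted(nx.betweenness_centrality(graph).items(),
                 key=lambda kv: kv[1], reverse=True)
for method, score in ranking:
    print(f"{score:.3f}  {method}")
```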


2022 ◽  
Vol 2161 (1) ◽  
pp. 012001
Author(s):  
Harshit Sharma ◽  
G Sumathi

Abstract Covid-19 is arguably the biggest pandemic in history, and it has created many challenges that must be dealt with. One of the biggest post-Covid-19 challenges is quality control. This research paper discusses some of these challenges and their solutions using an integrated internet of things (IoT) and internet of protocols (IoP) based approach, and further shows its implementation in industry, where it serves as a solution for damage assessment. The six-sigma rule is also analysed with the help of an IoT-enabled quality control system. After the Covid crisis, it is important for every institution to regain customer trust, so the quality of materials must be maintained, and IoT enables this. The unification of industrial IoT (IIoT) and Industry 4.0 is also discussed, since this unification represents the next evolution of smart manufacturing and digital technologies. This methodology can lead to accelerated innovation in applications for overcoming post-Covid challenges in the near future. In addition, small-scale and large-scale companies making use of the above methodology can adhere to the six-sigma criterion.
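One concrete reading of the six-sigma analysis mentioned above is a process-capability check run on measurements streamed from IoT-connected gauges; the specification limits and readings below are invented, and the sketch only assumes the standard Cp/Cpk definitions.

```python
import statistics

# Hypothetical specification limits (mm) and a batch of measurements from
# IoT-connected gauges on a production line.
LSL, USL = 9.85, 10.15
readings = [10.01, 9.98, 10.05, 9.97, 10.02, 10.00, 9.99, 10.03, 10.04, 9.96]

mean = statistics.fmean(readings)
sigma = statistics.stdev(readings)

# Standard process-capability indices used in six-sigma practice.
cp = (USL - LSL) / (6 * sigma)
cpk = min(USL - mean, mean - LSL) / (3 * sigma)

print(f"mean={mean:.3f} mm  sigma={sigma:.4f} mm  Cp={cp:.2f}  Cpk={cpk:.2f}")
# A Cpk near 2.0 corresponds to the classic six-sigma target; lower values
# would trigger an alert in the IoT-enabled quality-control system.
```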


2019 ◽  
Vol 35 (5) ◽  
pp. 627-639 ◽  
Author(s):  
S. R. Wu ◽  
T. H. Chen ◽  
H. Y. Tsai

Abstract Origami, the ancient paper-folding art, has inspired engineering equipment and design for decades. The basic concept of origami is very general, which leads to applications ranging from small scale to large scale. Recently, researchers have become interested in creating self-folding structures. Such a structure can be kinematically manipulated by external forces or moments without manual folding and/or unfolding operations, which is beneficial for many fields including aerospace systems, robots, small devices and self-assembly systems. This paper investigates and analyses the previous literature on the key driving forces of actuation structures, including heat, light, electricity, gas and other actuation methods. The aim is to provide researchers and practitioners with support to systematically understand the latest technologies in this important and evolving field, together with inspiration and direction for follow-up work.


1995 ◽  
Vol 35 (1) ◽  
pp. 436 ◽  
Author(s):  
G.T. Cooper

The Eastern Otway Basin exhibits two near-orthogonal structural grains, specifically NE-SW and WNW-ESE trending structures dominating the Otway Ranges, Colac Trough and Torquay Embayment. The relative timing of these structures is poorly constrained, but dip analysis data from offshore seismic lines in the Torquay Embayment show that two distinct structural provinces developed during two separate extensional episodes. The Snail Terrace comprises the southern structural province of the Torquay Embayment and is characterised by the WNW-ESE trending basin margin fault and a number of small-scale NW-SE trending faults. The Torquay Basin Deep makes up the northern structural province and is characterised by the large-scale, cuspate Snail Fault, which trends ENE-WSW, with a number of smaller NE-SW trending faults present. Dip analysis of basement trends shows a bimodal population in the Torquay Embayment. The Snail Terrace data show extension towards the SSW (193°), but this trend changes abruptly to the NE across a hinge zone. Dip data in the Torquay Basin Deep and regions north of the hinge zone show extension towards the SSE (150°). Overall the data show the dominance of SSE extension with a mean vector of 166°. Seismic data show significant growth of the Crayfish Group on the Snail Terrace and a lesser growth rate in the Torquay Basin Deep. Dip data from the Snail Terrace are therefore inferred to represent the direction of basement rotation during the first phase of continental extension, oriented towards the SSW during the Berriasian-Barremian? (146-125 Ma). During this phase the basin margin fault formed, as well as NE-SW trending ?transtensional structures in the Otway Ranges and Colac Trough, probably related to Palaeozoic features. Substantial growth along the Snail Fault during the Aptian-Albian? suggests that a second phase of extension affected the area. The Colac Trough, Otway Ranges, Torquay Embayment and Strzelecki Ranges were significantly influenced by this Bassian phase of SSE extension, which probably persisted during the Aptian-Albian? (125-97 Ma). This phase of extension had little effect in the western Otway Basin, west of the Sorell Fault Zone, and was largely concentrated in areas within the northern failed Bass Strait Rift. During the mid-Cretaceous, parts of the southern margin were subjected to uplift and erosion. Apatite fission track and vitrinite reflectance analyses show elevated palaeotemperatures associated with uplift east of the Sorell Fault Zone.
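The "mean vector of 166°" quoted above is a circular (vector) mean of extension azimuths rather than a simple arithmetic average; the short sketch below shows that computation on invented azimuths grouped around the SSW (~193°) and SSE (~150°) trends described in the abstract.

```python
import math

# Invented extension azimuths (degrees clockwise from north), loosely grouped
# around the SSW and SSE trends described in the text.
azimuths = [193, 190, 196, 150, 148, 152, 155, 160]

# Circular mean: sum the unit vectors, then take the direction of the resultant.
sum_cos = sum(math.cos(math.radians(a)) for a in azimuths)
sum_sin = sum(math.sin(math.radians(a)) for a in azimuths)
mean_azimuth = math.degrees(math.atan2(sum_sin, sum_cos)) % 360

print(f"mean extension azimuth: {mean_azimuth:.1f} deg")
```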


2019 ◽  
Vol 76 (6) ◽  
pp. 1601-1609 ◽  
Author(s):  
Tania Mendo ◽  
Sophie Smout ◽  
Tommaso Russo ◽  
Lorenzo D’Andrea ◽  
Mark James

Abstract Analysis of data from vessel monitoring systems and automated identification systems in large-scale fisheries is used to describe the spatial distribution of effort, the impact on habitats, and the location of fishing grounds. To identify when and where fishing activities occur, the analysis needs to take account of the different fishing practices of different fleets. Small-scale fisheries (SSFs) vessels have generally been exempted from positional reporting requirements, but recent developments of compact low-cost systems offer the potential to monitor them effectively. To characterize the spatial distribution of fishing activities in SSFs, positions should be collected with sufficient frequency to allow detection of different fishing behaviours, while minimizing demands for data transmission, storage, and analysis. This study sought to suggest optimal rates of data collection to characterize fishing activities at an appropriate spatial resolution. In a SSF case study, on-board observers collected Global Navigation Satellite System (GNSS) position and fishing activity every second during each trip. In the analysis, data were re-sampled to lower temporal resolutions to evaluate the effect on the identification of the number of hauls and the area fished. The effect of estimation at different spatial resolutions was also explored. Consistent results were found for polling intervals of <60 s in small vessels and <120 s in medium and large vessels. A grid cell size of 100 × 100 m gave the best estimates of the area fished. Remote collection and analysis of GNSS or equivalent data is thus feasible at low cost and at sufficient resolution to infer small-scale fisheries activities. This has significant implications globally for the sustainable management of these fisheries, many of which are currently unregulated.
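A minimal sketch of the re-sampling and gridding steps described above, assuming a list of timestamped GNSS fixes in metres; the synthetic track, polling intervals and grid size are illustrative and not the study's actual pipeline.

```python
from math import floor

# Synthetic 1 Hz GNSS track: (seconds since start, easting m, northing m).
track = [(t, 100.0 + 0.8 * t, 200.0 + 0.5 * t) for t in range(600)]

def resample(track, interval_s):
    """Keep one fix per polling interval, emulating a lower transmission rate."""
    kept, next_t = [], 0
    for t, x, y in track:
        if t >= next_t:
            kept.append((t, x, y))
            next_t = t + interval_s
    return kept

def area_fished_m2(fixes, cell_m=100):
    """Approximate area fished as the number of distinct grid cells visited."""
    cells = {(floor(x / cell_m), floor(y / cell_m)) for _, x, y in fixes}
    return len(cells) * cell_m * cell_m

for interval in (1, 30, 60, 120):
    fixes = resample(track, interval)
    print(f"poll every {interval:>3d} s: {len(fixes):4d} fixes, "
          f"area ~ {area_fished_m2(fixes)} m^2")
```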


Author(s):  
Gunay Vagifgiz

Oil and gas deposits differ in bed size, geological-physical development conditions, oil quality and geographic location. Bringing them into development involves different levels of investment in the main facilities; operating and routine material expenses also differ. From the point of view of economic efficiency, therefore, oil and gas deposits are not equal. The siting of the oil and gas industry raises the problem of the sequence in which various deposits are put into operation and of their development rate. The sizes of oil and gas beds and the oil and gas reserves available in them indicate which of these beds will be put into operation in the near future. The completion and development of large-scale deposits require less investment than small-scale deposits; such deposits are usually highly productive, and their expenses per production unit are small. All this determines the importance of using the reserves of large-scale deposits first.


2020 ◽  
Vol 218 ◽  
pp. 03053
Author(s):  
Shaomin Yan ◽  
Guang Wu

The current COVID-19 pandemic has created the biggest health and economic challenges to the world. However, not much knowledge is available about this coronavirus, SARS-CoV-2, because of its novelty. In particular, it is necessary to know the fate of the proteins generated by SARS-CoV-2. Before a large-scale study on proteins from SARS-CoV-2, however, it is preferable to conduct a small-scale study on a well-known protein from influenza A viruses, because both are positive-sense RNA viruses. Thus, we applied a simple method of amino-acid pair probability to analyze 94 neuraminidases of influenza A viruses for a better understanding of their fate. The results demonstrate three features of these neuraminidases: (i) the N1 neuraminidases are more susceptible to mutations, which is the current state of the neuraminidases; (ii) the N1 neuraminidases have undergone more mutations in the past, which is the history of the neuraminidases; and (iii) the N1 neuraminidases have a larger potential for future mutations, which is the future of the neuraminidases. Moreover, our study reveals two clues about the mutation tendency, i.e. the mutations represent a degeneration process, and chickens, ducks and geese are rendered more susceptible to mutation. We hope to apply this approach to study the proteins from SARS-CoV-2 in the near future.
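The amino-acid pair probability approach referenced above can be illustrated with a toy computation: compare the observed frequency of each adjacent amino-acid pair in a sequence with the frequency predicted from the individual amino-acid frequencies. The sequence and scoring below are invented for illustration and are not the authors' published procedure.

```python
from collections import Counter

# Arbitrary toy peptide; a real analysis would use full neuraminidase sequences.
seq = "MNKTVVLAGGSNKTVVMAGGLNKTVVLAGG"

aa_counts = Counter(seq)
pair_counts = Counter(seq[i:i + 2] for i in range(len(seq) - 1))
n_pairs = len(seq) - 1

def predicted(pair):
    """Probability of the pair if amino acids associated at random."""
    a, b = pair
    return (aa_counts[a] / len(seq)) * (aa_counts[b] / len(seq))

for pair, count in pair_counts.most_common(5):
    observed = count / n_pairs
    print(f"{pair}: observed {observed:.4f}  predicted {predicted(pair):.4f}")
# Pairs whose observed frequency deviates strongly from the random prediction
# are read as markers of past mutation or susceptibility to future change.
```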


1996 ◽  
Vol 14 (7) ◽  
pp. 753-766
Author(s):  
G. Cautenet ◽  
D. Gbe

Abstract. The development of cirrus clouds is governed by large-scale synoptic movements such as updraft regions in convergence zones, but also by smaller scale features, for instance microphysical phenomena, entrainment, small-scale turbulence and radiative field, fall-out of the ice phase or wind shear. For this reason, the proper handling of cirrus life cycles is not an easy task using a large-scale model alone. We present some results from a small-scale cirrus cloud model initialized by ECMWF first-guess data, which prove more convenient for this task than the analyzed ones. This model is Starr's 2-D cirrus cloud model, where the rate of ice production/destruction is parametrized from environmental data. Comparison with satellite and local observations during the ICE89 experiment (North Sea) shows that such an efficient model using large-scale data as input provides a reasonable diagnosis of cirrus occurrence in a given meteorological field. The main driving features are the updraft provided by the large-scale model, which enhances or inhibits the cloud development according to its sign, and the water vapour availability. The cloud fields retrieved are compared to satellite imagery. Finally, the use of a small-scale model in large-scale numerical studies is examined.


2021 ◽  
Author(s):  
Filipe Zimmer Dezordi ◽  
Tulio de Lima Campos ◽  
Pedro Miguel Carneiro Jeronimo ◽  
Cleber Furtado Aksenen ◽  
Suzana Porto Almeida ◽  
...  

The COVID-19 pandemic, caused by the Severe Acute Respiratory Syndrome coronavirus 2 (SARS-CoV-2), emerged in 2019 and quickly spread worldwide. Genomic surveillance has become the gold-standard methodology to monitor and study this emerging virus. The current deluge of SARS-CoV-2 genomic data being generated worldwide has put additional pressure on the urgent need for streamlined bioinformatics workflows for data analysis. Here, we describe a workflow developed by our group to process and analyze large-scale SARS-CoV-2 Illumina amplicon sequencing data. This workflow automates all the steps involved in SARS-CoV-2 genomic analysis: data processing, genome assembly, PANGO lineage assignment, mutation analysis and the screening of intrahost variants. The workflow presented here (https://github.com/dezordi/ViralFlow) is available through Docker or Singularity images, allowing implementation on laptops for small-scale analyses or on high-capacity servers or clusters. Moreover, the low memory and CPU requirements make it a versatile tool for SARS-CoV-2 genomic analysis.

