Developing a Full Risk Picture for Gas Pipeline Consequences

Author(s):  
Ken E. Oliphant ◽  
David A. Joyal ◽  
Vida Meidanshahi

Properly characterizing the consequences of pipeline incidents is a critical component of assessing pipeline risk. Previous research has shown that these consequences follow a Pareto-type distribution for gas distribution, gas transmission, and hazardous liquid pipelines, in which low-probability, high-consequence (LPHC) events dominate the risk picture. This behavior is driven by a combination of deterministic factors (e.g., pipe diameter, pressure, and location) and random factors (e.g., receptor density and environmental conditions at the time of release). This paper examines how the Pareto-type behavior of pipeline incident consequences arises and demonstrates how it can be modeled with a quantitative pipeline risk model. The result is a more complete picture of pipeline risk, including insight into LPHC events. Use of the modelling approach for integrity management is discussed.
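
As a rough illustration of how a heavy-tailed consequence distribution can emerge from this mix of deterministic and random drivers, the Monte Carlo sketch below (all distributions and parameters are hypothetical, not taken from the paper) multiplies a deterministic severity term by random exposure factors and reports how much of the total consequence sits in the top 1% of simulated events.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000  # simulated incidents

# Deterministic severity driver: release intensity grows with diameter and pressure
# (illustrative scaling only, not the authors' model).
diameter = rng.choice([6, 12, 24, 36], size=n)       # inches
pressure = rng.uniform(500, 1500, size=n)            # psi
severity = (diameter / 12) ** 2 * (pressure / 1000)

# Random factors at the time of release: receptor density and an environmental modifier.
receptors = rng.lognormal(mean=0.0, sigma=1.5, size=n)
weather = rng.uniform(0.5, 1.5, size=n)

consequence = severity * receptors * weather

# Pareto-type tail: a small fraction of events carries most of the total consequence.
order = np.sort(consequence)[::-1]
top1_share = order[: n // 100].sum() / order.sum()
print(f"share of total consequence in top 1% of events: {top1_share:.2f}")
```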

Author(s):  
Poppy M. Jeffries ◽  
Samantha C. Patrick ◽  
Jonathan R. Potts

Abstract Many animal populations include a diversity of personalities, and these personalities are often linked to foraging strategy. However, it is not always clear why populations should evolve to have this diversity. Indeed, optimal foraging theory typically seeks out a single optimal strategy for individuals in a population. So why do we, in fact, see a variety of strategies existing in a single population? Here, we aim to provide insight into this conundrum by modelling the particular case of seabirds that forage on patchy prey. These seabirds have only partial knowledge of their environment: they do not know exactly where the next patch will emerge, but they may have some understanding of which locations are more likely to lead to patch emergence than others. Many existing optimal foraging studies assume either complete knowledge (e.g. the Marginal Value Theorem) or no knowledge (e.g. the Lévy Flight Hypothesis), but here we construct a new modelling approach that incorporates partial knowledge. In our model, different foraging strategies are favoured by different birds along the bold-shy personality continuum, so we can assess the optimality of a personality type. We show that it is optimal to be shy (resp. bold) when living in a population of bold (resp. shy) birds. This observation gives a plausible mechanism behind the emergence of diverse personalities. We also show that environmental degradation is likely to favour shyer birds and cause a decrease in diversity of personality over time.
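
A toy illustration of the mechanism described above, negative frequency dependence along the bold-shy continuum: in the sketch below (hypothetical payoff functions, not the authors' model), bold birds find new patches but compete with each other, while shy birds profit from the information bold birds generate, so the best strategy flips with the population's composition.

```python
import numpy as np

def payoff(focal_boldness, pop_boldness, n=10_000, seed=0):
    """Mean foraging payoff of a focal bird in a population with a given mean boldness.

    Illustrative negative-frequency-dependence model (not the authors' equations):
    bold birds find new patches but compete with other bold searchers; shy birds
    exploit the patches bold birds reveal, which pays off when bold birds are common.
    """
    rng = np.random.default_rng(seed)
    patch_value = rng.exponential(1.0, size=n)
    bold_gain = patch_value / (1.0 + 5.0 * pop_boldness)   # diluted by competition
    shy_gain = 0.4 * patch_value * pop_boldness            # grows with available information
    return np.mean(focal_boldness * bold_gain + (1 - focal_boldness) * shy_gain)

for pop in (0.1, 0.9):  # mostly shy vs mostly bold population
    best = max(np.linspace(0, 1, 11), key=lambda b: payoff(b, pop))
    print(f"population boldness {pop}: best focal boldness = {best:.1f}")
```

Running this prints a best focal boldness of 1.0 in the mostly shy population and 0.0 in the mostly bold one, reproducing the qualitative "be shy among the bold, bold among the shy" result.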


Author(s):  
Sean Keane ◽  
Karmun Cheng ◽  
Kaitlyn Korol

In-line inspection (ILI) tools play an important role within integrity management, and substantial investment is made to continuously advance the performance of existing technologies and, where necessary, to develop new ones. Performance measurement typically focuses on understanding measured performance in relation to the ILI vendor specification and on determining the residual uncertainty regarding pipeline integrity. These performance measures may not provide the necessary insight into what type of investment in a technology is needed to further reduce residual uncertainty regarding pipeline integrity and, beyond that, what investment by an operator results in an effective and efficient reduction in uncertainty. The paper proposes a reliability-based approach for investigating uncertainty associated with ultrasonic crack ILI technology, with the aim of identifying efficient investment in the technology that yields an effective and measurable improvement. Typical performance measures and novel performance measurement methods are presented and reviewed with respect to the information they can provide to assist in investment decisions. Finally, general observations are made regarding Enbridge’s experience using ultrasonic crack ILI technology and the areas currently being investigated.
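
The sketch below illustrates, on synthetic data, the kind of performance measures the abstract refers to: probability of detection (POD) above a depth threshold and a sizing tolerance at 80% certainty, derived from matched ILI and in-the-ditch measurements. All numbers and the matching procedure are invented for illustration; they are not Enbridge data or the paper's method.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic matched dataset: field-measured crack depths (mm) and ILI calls.
field_depth = rng.uniform(0.5, 4.0, size=500)
detected = rng.random(500) < 1 / (1 + np.exp(-(field_depth - 1.0) * 3))  # deeper = likelier call
ili_depth = field_depth[detected] + rng.normal(0.0, 0.5, size=detected.sum())  # sizing scatter

# Classic vendor-style performance measures.
pod_ge_1mm = detected[field_depth >= 1.0].mean()
sizing_error = ili_depth - field_depth[detected]
tol_80 = np.quantile(np.abs(sizing_error), 0.80)

print(f"POD for depths >= 1 mm: {pod_ge_1mm:.2f}")
print(f"sizing tolerance at 80% certainty: +/- {tol_80:.2f} mm")

# A reliability view: the spread of sizing error is the residual uncertainty that
# remains about true depth after inspection, which drives remaining integrity risk.
print(f"std of sizing error (residual uncertainty): {sizing_error.std(ddof=1):.2f} mm")
```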


Author(s):  
Larry C. Decker

Recent efforts to develop a consistent approach to understanding the risk associated with operating a cross-country pipeline have focused primarily on the pipe itself. Integrity management plans often include a prioritized risk profile that all but ignores the specific risks associated with operating tank farms, terminals, pumps, and compression. This paper outlines a detailed, logical approach that can be used to evaluate the relative safety, environmental, and cost risk associated with operating diverse types of equipment within a pipeline station. Topics covered include the basic objectives of a facility risk model and the level of detail (granularity) necessary for continuous improvement. A specific methodology is suggested as a systematic way to make an “apples to apples” comparison of diverse stations, lines, and types of equipment from a risk standpoint.
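
One minimal way to realize an “apples to apples” comparison is to express every consequence dimension in a common unit and multiply by a likelihood, as in the hypothetical sketch below; the field names, equipment, and figures are illustrative only, not the paper's methodology.

```python
from dataclasses import dataclass

@dataclass
class EquipmentRisk:
    """One station asset scored in common, comparable units (illustrative fields only)."""
    name: str
    failures_per_year: float   # likelihood from history / engineering judgment
    safety_cost: float         # consequence terms expressed in equivalent dollars
    environmental_cost: float
    repair_cost: float

    def expected_annual_risk(self) -> float:
        # One currency for all consequence types enables apples-to-apples ranking.
        return self.failures_per_year * (
            self.safety_cost + self.environmental_cost + self.repair_cost
        )

assets = [
    EquipmentRisk("mainline pump P-101", 0.02, 500_000, 250_000, 80_000),
    EquipmentRisk("tank T-3 mixer", 0.10, 20_000, 150_000, 30_000),
    EquipmentRisk("compressor C-2 seal", 0.05, 300_000, 50_000, 120_000),
]
for a in sorted(assets, key=EquipmentRisk.expected_annual_risk, reverse=True):
    print(f"{a.name}: ${a.expected_annual_risk():,.0f}/yr")
```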


Author(s):  
Shahani Kariyawasam ◽  
Hong Wang

The objective of an effective corrosion management program is to identify and mitigate corrosion anomalies before they reach critical limit states. Because there are often many anomalies on a pipeline, an optimized program will mitigate the few corrosion anomalies that may grow to a critical size within the next inspection interval, without excavating the many anomalies that will not. This optimization of the inspection interval and the selection of anomalies to mitigate depend on an understanding of corrosion growth. Predicting corrosion growth is challenging because growth over time is nonlinear and highly location specific. These characteristics make simplistic approaches, such as applying maximum growth rates to all defects, impractical. Therefore, it is important to understand the salient aspects of corrosion growth so that appropriate decisions on excavation and re-inspection can be made without compromising safety or undertaking undue amounts of mitigative activity. In the pipeline industry, corrosion growth between two in-line inspections (ILIs) has been measured by comparing one ILI run to the next. However, many types of ILI comparison methodology have been used in the past. Within the last decade or two, comparison techniques have evolved from box matching of defect samples to signal matching of the total defect population. Multiple comparison analyses have been performed on the TransCanada system to establish corrosion growth rates. Comparing the results from these various analyses gives insight into the accuracy and uncertainty of each type of estimate. An effective integrity management process should use the best available corrosion growth data, and to do so it is important to understand the conservatism and the uncertainty involved in each type of estimate. When a run comparison is used to predict future growth, it is assumed that the growth observed within the last ILI interval will continue (with associated uncertainty) during the next inspection interval. The validity of these assumptions is examined in this study, and in the context of this paper they are validated for external corrosion on onshore pipelines; internal and offshore corrosion behave very differently in their variation over space and time. Correlations of external corrosion growth in onshore pipelines with defect size and location are also examined. Learning from multiple corrosion growth studies gives insight into the actual corrosion rate variation along a pipeline as well as general growth characteristics. Different types of corrosion growth modeling for use in probabilistic or deterministic integrity management programs are also discussed.
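
The sketch below illustrates, with synthetic data, one pitfall the abstract alludes to: run-to-run growth rates inherit the sizing error of both ILI tools, so the measured spread of rates is much wider than the true spread, and applying the maximum measured rate to every defect would be overly conservative. All parameters are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(7)
interval_years = 5.0
n = 1000

# True (unknown) behavior: growth is location-specific; most defects grow slowly.
true_depth_run1 = rng.uniform(10, 40, size=n)            # % wall thickness
true_rate = rng.gamma(shape=1.5, scale=0.4, size=n)      # %wt per year, skewed
true_depth_run2 = true_depth_run1 + true_rate * interval_years

# What the ILIs report: true depth plus tool sizing error on each run.
tool_sigma = 5.0  # %wt, illustrative
meas1 = true_depth_run1 + rng.normal(0, tool_sigma, n)
meas2 = true_depth_run2 + rng.normal(0, tool_sigma, n)

naive_rate = (meas2 - meas1) / interval_years
print(f"true mean rate: {true_rate.mean():.2f} %wt/yr (std {true_rate.std():.2f})")
print(f"naive ILI rate: {naive_rate.mean():.2f} %wt/yr (std {naive_rate.std():.2f})")
# The means agree, but tool error inflates the spread, which is why the conservatism
# and uncertainty of each estimation method must be understood before use.
```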


Author(s):  
Andrew Francis ◽  
Marcus McCallum ◽  
Menno T. Van Os ◽  
Piet van Mastrigt

External Corrosion Direct Assessment (ECDA) has now become acknowledged, by the Office of Pipeline Safety (OPS) in North America, as a viable alternative to both in-line inspection (ILI) and the hydrostatic pressure test for managing the integrity of high-pressure pipelines. Accordingly, an ECDA standard is now in existence. The essence of ECDA is to use indirect above-ground survey techniques to locate coating and corrosion defects and then to investigate some of the indications directly by making excavations. However, one of the problems with above-ground survey techniques is that they do not locate all defects and are susceptible to false indication. This means that defects will not be present at all indications and that some defects will be missed. In view of these limitations, the ECDA standard requires that at least two complementary survey techniques be used. The selected survey techniques will depend on the nature of a particular ‘ECDA segment’, taking account of the surface characteristics. However, in many situations the surveys will include a coating survey and a corrosion survey. In general, the outcome of these two surveys will be N_H locations at which only the coating survey gives an indication, N_C locations at which only the corrosion survey gives an indication, and N_HC locations at which both surveys give an indication. This paper presents a new probabilistic methodology for estimating the distributions of the actual numbers of coating and corrosion defects, taking account of the outcomes of the surveys and the probabilities of detection and false indication of both techniques. The method also shows how the probabilities of detection and false indication are updated according to what is found during the excavations, and how the distributions of the numbers of remaining corrosion and coating defects are subsequently modified. Based on a prescribed repair criterion, the analysis is used to determine the probability that at least one remaining corrosion defect exceeds the criterion. As excavations are performed sequentially, this probability naturally decreases, and the attainment of an acceptably low probability is used as a trigger to terminate the excavation programme. A detailed description of the development of the method is given in this paper, and its application is illustrated through a simple numerical example. How the method is used to build a Direct Assessment module for a pipeline integrity management system is described in an accompanying paper.
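
A heavily simplified, simulation-based version of such an update is sketched below: given a prior on the true defect count, a probability of detection, and a probability of false indication, it conditions on the observed number of indications to obtain a posterior defect-count distribution. This illustrates the Bayesian idea only, not the paper's methodology; all parameters are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(3)

def posterior_defect_count(n_indications, pod, p_false,
                           n_sites=200, prior_max=60, sims=200_000):
    """Posterior over the true number of corrosion defects in an ECDA segment.

    Simulation-based Bayes (illustrative only): uniform prior on the defect
    count; each real defect is detected with probability `pod`; each remaining
    survey location can give a false indication with probability `p_false`.
    Condition on the observed indication count.
    """
    true_n = rng.integers(0, prior_max + 1, size=sims)   # draw from the prior
    hits = rng.binomial(true_n, pod)                     # real defects indicated
    falses = rng.binomial(n_sites - true_n, p_false)     # spurious indications
    matched = true_n[(hits + falses) == n_indications]
    counts = np.bincount(matched, minlength=prior_max + 1)
    return counts / counts.sum()

post = posterior_defect_count(n_indications=12, pod=0.8, p_false=0.02)
mean_n = (np.arange(post.size) * post).sum()
print(f"posterior mean defect count: {mean_n:.1f}")
print(f"P(more than 15 defects remain plausible): {post[16:].sum():.3f}")
```

In the full method, each excavation outcome would further update the detection and false-indication probabilities before the posterior is recomputed.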


Hematology ◽  
2021 ◽  
Vol 2021 (1) ◽  
pp. 673-681
Author(s):  
Alissa Visram ◽  
Joselle Cook ◽  
Rahma Warsame

Abstract The adage for smoldering multiple myeloma (SMM) has been to observe without treatment until the criteria for active multiple myeloma are satisfied. Definitions and risk stratification models have become more sophisticated, with prognostication tailored to include high-risk cytogenetics as per the most recent International Myeloma Working Group 2020 risk model. Moreover, progress in defining genomic evolution and changes in the bone marrow microenvironment across the monoclonal continuum has given insight into the complexities underlying the different patterns of progression observed in SMM. Given recent data showing improved progression-free survival with early intervention in high-risk SMM, the current dilemma is how these patients should be treated. This case-based article maps the significant advancements made in the diagnosis and risk stratification of SMM. Data from landmark clinical trials are also discussed, and ongoing trials are summarized. Ultimately, we outline our approach to SMM and hope to impart to the reader a sound concept of its current clinical management.


2011 ◽  
Vol 5 (3) ◽  
pp. 383-399
Author(s):  
Leisha Jones

I appropriate Deleuze and Guattari's concept of the refrain for a feminist analysis of the girl because it offers more insight into the ways girls construct themselves as performative networks than the death-by-culture or at-risk model preferred by such feminists as Jean Kilbourne, Carol Gilligan, and even Susan Bordo. I proffer that it costs women everything to practise a politics of difference that is by definition reactionary, a reaction to the cultural refusal of leaky gendered bodies that must be overcome. Girl is mapped through such alternations as powerful aggregates of the tremulous, roaming emissions of monstrous particles most desired and desirous.


Author(s):  
Jane Dawson ◽  
Iain Colquhoun ◽  
Inessa Yablonskikh ◽  
Russell Wenz ◽  
Tuan Nguyen

Current risk assessment practice in pipeline integrity management tends to use semi-quantitative, index-based or model-based methodologies. This approach has been found to be very flexible and to provide useful results for identifying high-risk areas and for prioritizing physical integrity assessments. However, as pipeline operators progressively adopt an operating strategy of continual risk reduction, with a view to minimizing total expenditures within safety, environmental, and reliability constraints, the need for quantitative assessments of risk levels is becoming evident. Whereas reliability-based quantitative risk assessments can be and are routinely carried out on a site-specific basis, they require significant amounts of quantitative data for the results to be meaningful. This need for detailed and reliable data tends to make these methods unwieldy for system-wide risk assessment applications. This paper describes methods for estimating risk quantitatively by calibrating semi-quantitative estimates to failure rates for peer pipeline systems. By applying point-value probabilities to the failure rates, deterministic quantitative risk assessment (QRA) provides greater rigor and objectivity than can usually be achieved with semi-quantitative risk assessment results. The method permits a fully quantitative approach suited to the operator’s data availability, data quality, and analysis needs. The paper also discusses experience implementing this type of risk model in Pipeline Integrity Management System (PIMS) software and the integration of data from existing pipeline geographical information systems (GIS).
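
A minimal sketch of the calibration idea, under assumed numbers: relative index scores are rescaled so that their system-wide mean matches a peer failure rate, then combined with point-value consequences to yield deterministic risk estimates. The scores, anchor rate, and consequence values below are invented for illustration, not taken from the paper.

```python
import numpy as np

# Relative (semi-quantitative) threat scores per pipeline segment, higher = worse.
index_scores = np.array([12.0, 35.0, 8.0, 50.0, 20.0])

# Peer-system anchor: historical failure rate for comparable pipelines
# (illustrative number only).
peer_rate = 2.0e-4  # failures per km-year, system-wide average

# Calibrate: scale the scores so their mean matches the peer average frequency.
seg_freq = index_scores / index_scores.mean() * peer_rate

# Deterministic QRA: point-value consequence per segment (equivalent dollars).
consequence = np.array([1e6, 5e6, 5e5, 2e7, 3e6])
risk = seg_freq * consequence  # expected dollars per km-year

for i, r in sorted(enumerate(risk), key=lambda t: -t[1]):
    print(f"segment {i}: freq {seg_freq[i]:.2e}/km-yr, risk ${r:,.0f}/km-yr")
```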


2007 ◽  
Vol 362 (1486) ◽  
pp. 1831-1839 ◽  
Author(s):  
Christoph Flamm ◽  
Lukas Endler ◽  
Stefan Müller ◽  
Stefanie Widder ◽  
Peter Schuster

A self-consistent minimal cell model with a physically motivated schema for molecular interaction is introduced and described. The genetic and metabolic reaction network of the cell is modelled by multidimensional nonlinear ordinary differential equations, which are derived from biochemical kinetics. The strategy behind this modelling approach is to keep the model sufficiently simple in order to be able to perform studies on evolutionary optimization in populations of cells. At the same time, the model should be complex enough to handle the basic features of genetic control of metabolism and coupling to environmental factors. Thereby, the model system will provide insight into the mechanisms leading to important biological phenomena, such as homeostasis, (circadian) rhythms, robustness and adaptation to a changing environment. One example of modelling a molecular regulatory mechanism, cooperative binding of transcription factors, is discussed in detail.
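
The cooperative-binding mechanism mentioned at the end of the abstract is commonly represented with a Hill function; the sketch below integrates a single gene-expression ODE of that form (illustrative parameters; the paper's model couples many such equations into a genetic and metabolic network).

```python
import numpy as np
from scipy.integrate import solve_ivp

def rhs(t, y, n=4, K=1.0, v_max=2.0, k_deg=0.5, tf_level=1.2):
    """One gene-expression ODE with cooperative transcription-factor binding."""
    protein = y[0]
    # Hill function: cooperativity (n > 1) gives a sigmoidal, switch-like response.
    production = v_max * tf_level**n / (K**n + tf_level**n)
    return [production - k_deg * protein]

sol = solve_ivp(rhs, t_span=(0, 20), y0=[0.0], dense_output=True)
print(f"steady-state protein level ~ {sol.y[0, -1]:.2f}")
# With tf_level above K the gene runs near v_max; dropping tf_level below K
# shuts it off sharply because of the cooperative exponent n.
```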

