Modeling Production Facilities Using Conventional Process Simulators and Data Validation and Reconciliation (DVR) Methodology

2021 ◽  
Author(s):  
Tom Mooney ◽  
Kelda Bratley ◽  
Amin Amin ◽  
Timothy Jadot

Abstract The use of conventional process simulators is commonplace for system design and is growing in online monitoring and optimization applications. While these simulators are extremely useful, additional value can be extracted by combining simulator predictions with field inputs from measurement devices such as flowmeters and pressure and temperature sensors. The statistical nature of the inputs (e.g., measurement uncertainty) is typically not considered in the forward calculations performed by the simulators, which may lead to erroneous results if a raw measurement is in error or biased. A complementary modeling methodology is proposed to identify and correct measurement and process errors as an integral part of a robust simulation practice. The studied approach ensures best-quality data for direct use in process models and simulators for operations and process surveillance. From a design perspective, this approach also makes it possible to evaluate the impact of the uncertainty of measured and unmeasured variables on CAPEX spend and to optimize instrument and meter design. In this work, an extended statistical approach to process simulation is examined using Data Validation and Reconciliation (DVR). The DVR methodology is compared to conventional non-statistical, deterministic process simulators. A key difference is that DVR uses any measured variable in the modelled process (inlet, outlet, or intermediate measurements), including its uncertainty, as an input, whereas traditional simulators use only inlet measurement values to estimate the values of all other measured and unmeasured variables. The DVR calculations and applications are walked through using several comparative case studies of a typical surface process facility. Examples include the simulation of a commingled multistage oil and gas separation process, the validation of separator flowmeters and fluid samples, and the quantification of unmeasured variables along with their uncertainties. The studies demonstrate the added value of using the redundancy of all available measurements in a process model based on the DVR method. Single-point and data-streaming field cases highlight the interdependent and complementary roles of traditional simulators and the data validation provided by the DVR methodology; it is shown how robust measurement management strategies can be developed based on DVR's effective surveillance capabilities. Moreover, the cases demonstrate how DVR-based CAPEX and OPEX improvements are derived from effective hardware selection using cost-versus-measurement-precision trade-offs, from soft-measurement substitutes, and from condition-based maintenance strategies.
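The core of a DVR step can be illustrated as a weighted least-squares reconciliation of redundant measurements against a process constraint. The sketch below is a minimal, hypothetical example (a feed split into gas and liquid streams, with made-up measured values and uncertainties); it only shows the general mechanics of reconciliation, not the commercial DVR implementation discussed in the paper.

```python
import numpy as np

# Minimal data reconciliation sketch: three measured flows around a
# two-phase separator must satisfy the mass balance F_in = F_gas + F_liq.
# Measured values and standard uncertainties (1-sigma) are hypothetical.
x_meas = np.array([100.0, 61.0, 42.0])   # F_in, F_gas, F_liq (t/h)
sigma  = np.array([2.0, 1.0, 1.5])       # measurement uncertainties (t/h)

# Linear constraint A @ x = 0 encodes the mass balance F_in - F_gas - F_liq = 0.
A = np.array([[1.0, -1.0, -1.0]])
V = np.diag(sigma**2)                    # measurement covariance matrix

# Closed-form weighted least-squares reconciliation:
# minimize (x - x_meas)' V^-1 (x - x_meas) subject to A x = 0.
residual = A @ x_meas                    # constraint imbalance of the raw data
gain = V @ A.T @ np.linalg.inv(A @ V @ A.T)
x_rec = x_meas - gain @ residual

# Uncertainty of the reconciled values (reduced relative to the raw inputs).
V_rec = V - gain @ A @ V

print("raw imbalance      :", residual)
print("reconciled flows   :", x_rec)
print("reconciled 1-sigma :", np.sqrt(np.diag(V_rec)))
```

A measurement whose reconciled value shifts by much more than its stated uncertainty would be flagged for inspection, which is the kind of gross-error surveillance the abstract refers to.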


Author(s):  
Poovadol Sirirangsi ◽  
Adjo Amekudzi ◽  
Pannapa Herabat

The replacement-cost approach and the book-value method as decision support tools for selecting maintenance alternatives under budget constraints and for capturing the effects of maintenance practices on highway asset value are investigated. By using a case study based on the Thailand Pavement Management System, the replacement-cost approach and the book-value method are applied to analyze maintenance alternatives for selected highways. The versatility of these asset-valuation methods is explored for capturing trade-offs in the type and timing of maintenance and for incorporating the added value of effective maintenance practices and the impact of deferred maintenance in the overall asset value. The study demonstrated that the replacement-cost approach is a more versatile tool for considering the maintenance-related value of highways in maintenance decision making, whereas the book value may be a simpler financial accounting tool. The two approaches may be used together to clarify how maintenance expenditures are being translated into facility replacement value or how the overall value of the infrastructure is being preserved. The study results are potentially useful to agencies interested in capturing the added value of effective maintenance practices in the overall value of their asset base.
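As a rough illustration of the two valuation perspectives compared here, the toy calculation below contrasts a straight-line book value with a condition-adjusted replacement cost for a hypothetical pavement section. All figures and the depreciation scheme are assumptions for illustration only, not values from the Thailand case study.

```python
# Toy comparison of book value vs. replacement-cost value for a pavement asset.
# All numbers are hypothetical and purely illustrative.

initial_cost = 10_000_000          # historical construction cost (currency units)
service_life = 20                  # design life in years
age = 8                            # current age in years
replacement_cost_new = 14_000_000  # cost to rebuild the same asset today
condition_index = 0.75             # remaining condition from a pavement survey (0-1)

# Book value: historical cost less straight-line accumulated depreciation.
book_value = initial_cost * max(0.0, 1 - age / service_life)

# Replacement-cost approach: today's replacement cost scaled by asset condition,
# so effective maintenance (a higher condition index) shows up directly in the value.
replacement_value = replacement_cost_new * condition_index

print(f"book value        : {book_value:,.0f}")
print(f"replacement value : {replacement_value:,.0f}")
```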



2015 ◽  
Vol 15 (20) ◽  
pp. 28749-28792 ◽  
Author(s):  
A. J. Prenni ◽  
D. E. Day ◽  
A. R. Evanoski-Cole ◽  
B. C. Sive ◽  
A. Hecobian ◽  
...  

Abstract. The Bakken formation contains billions of barrels of oil and gas trapped in rock and shale. Horizontal drilling and hydraulic fracturing methods have allowed for extraction of these resources, leading to exponential growth of oil production in the region over the past decade. Along with this development has come an increase in associated emissions to the atmosphere. Concern about potential impacts of these emissions on federal lands in the region prompted the National Park Service to sponsor the Bakken Air Quality Study over two winters in 2013–2014. Here we provide an overview of the study and present some initial results aimed at better understanding the impact of local oil and gas emissions on regional air quality. Data from the study, along with long-term monitoring data, suggest that while power plants are still an important emissions source in the region, emissions from oil and gas activities are affecting ambient concentrations of nitrogen oxides and black carbon and may dominate recent observed trends in pollutant concentrations at some of the study sites. Measurements of volatile organic compounds also show definitively that oil and gas emissions were present in almost every air mass sampled over a period of more than four months.



2021 ◽  
Vol 26 (4) ◽  
pp. 15-27
Author(s):  
Alexander Miller ◽  
Maxim Miller

Issues of the scientific and technological development of the economy and of increasing its competitiveness, including various aspects of technological integration, are the subject of foreign and domestic research. Technological integration is considered a key direction of the new industrial and scientific-technical policy: a means of transition to the digital economy and to production processes with higher added value, and a means of establishing a constructive dialogue between industrial enterprises and science. At the same time, the phenomenon is relatively new, and theoretical and methodological tools for modelling the development of technological integration are lacking. The purpose of the article is to study the problems of modelling the development of technological integration in the context of the priority scientific and technological development of the Russian economy. The article uses a wide range of general scientific methods: analysis and synthesis, grouping, typology, modelling, and economic-statistical and graphical methods. The main methodological approaches are the structural-functional, instrumental and process approaches, which are reflected in the scientific and practical material of the general theory of systems and the theory of organization. The theoretical result of the study is the disclosure of an organizational model for the development of technological integration as a dynamic set of interconnected modules: management and coordination; structure; processes; and resources designated to achieve the strategic objectives of the participants in technological integration. Classification characteristics of technological integration development processes are identified and theoretically justified. The applied result is a specialized modelling tool that combines a standardized approach and improved design quality with the ability to test simulated processes and to maintain stable feedback with all participants in technological integration. A process model of technological integration development is substantiated and decomposed, which allows the main, supporting and regulatory processes of the participants to be distinguished. The combination of these models facilitates the management of these processes in order to maximize the efficiency of the modern economy. An organizational and economic mechanism for modelling the development of technological integration is proposed; its use of operational monitoring, owing to its vector orientation, makes it possible to regularly adjust the key parameters for assessing the impact of technological integration on the results of technological development in national economies.



2019 ◽  
pp. 243-258 ◽  
Author(s):  
Denys Yemshanov ◽  
Robert G. Haight ◽  
Ning Liu ◽  
Marc-André Parisien ◽  
Quinn Barber ◽  
...  

Protecting wildlife within areas of resource extraction often involves reducing habitat fragmentation. In Canada, protecting threatened woodland caribou (Rangifer tarandus caribou (Gmelin, 1788)) populations requires preserving large areas of intact forest habitat, with some restrictions on industrial forestry activities. We present a linear programming model that assesses the trade-off between achieving an objective of habitat protection for caribou populations and maintaining desired levels of harvest in forest landscapes. The habitat-protection objective maximizes the amount of connected habitat that is accessible by caribou, and the forestry objective maximizes net revenues from timber harvest subject to even harvest flow, a harvest target, and environmental sustainability constraints. We applied the model to explore habitat protection and harvesting scenarios in the Cold Lake caribou range, a 6726 km² area of prime caribou habitat in Alberta, Canada. We evaluated harvest scenarios ranging from 0.1 Mm³·year⁻¹ to maximum sustainable harvest levels over 0.7 Mm³·year⁻¹ and assessed the impact of habitat protection measures on timber supply costs. Protecting caribou habitat by deferring or reallocating harvest increases the timber unit cost by Can$1.1–2.0·m⁻³. However, this impact can be partially mediated by extending the harvest to areas of oil and gas extraction to offset forgone harvest in areas of prime caribou habitat.
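The trade-off analysed here can be sketched as a small linear program. The toy model below (built with scipy.optimize.linprog, using invented stand areas, revenues, and a habitat-protection target) only illustrates the general structure of maximizing harvest revenue subject to a minimum amount of unharvested habitat; it is not the authors' connectivity-based formulation with even-flow and sustainability constraints.

```python
import numpy as np
from scipy.optimize import linprog

# Toy harvest-vs-habitat linear program (illustrative numbers only).
# x_i = fraction of forest stand i that is harvested, 0 <= x_i <= 1.
area    = np.array([120.0, 80.0, 200.0, 150.0])   # stand area (km^2)
revenue = np.array([4.0, 6.0, 3.0, 5.0])          # net revenue if stand is fully harvested (M$)

habitat_target = 400.0   # km^2 of unharvested area that must remain as habitat

# Maximize revenue @ x  <=>  minimize -revenue @ x.
# Habitat constraint: sum(area * (1 - x)) >= habitat_target
#                <=>  area @ x <= sum(area) - habitat_target.
res = linprog(
    c=-revenue,
    A_ub=[area],
    b_ub=[area.sum() - habitat_target],
    bounds=[(0.0, 1.0)] * len(area),
    method="highs",
)

print("harvested fraction per stand:", np.round(res.x, 2))
print("total net revenue (M$)      :", round(-res.fun, 2))
print("protected habitat (km^2)    :", round(area @ (1 - res.x), 1))
```

Tightening the habitat target and re-solving traces out the cost curve that the abstract summarizes as an increase in timber unit cost.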



Author(s):  
Richta C. IJntema ◽  
Wilmar B. Schaufeli ◽  
Yvonne D. Burger

Abstract Recently, scientists have shifted their focus from studying psychological resilience as a single, isolated construct (e.g. an attribute or outcome) to studying it as a dynamic process encompassing a number of temporally related elements. Models depicting this process explain why some people adapt to stressor exposure, whereas others do not. To date, however, these process models have not sufficiently explained how people adapt differently to a stressor. To address this issue, we developed a new model of psychological resilience, called the Psychological Immunity-Psychological Elasticity (PI-PE) model. The aim of this article is to clarify this model and to discuss its added value. First, we explain how we derived the PI-PE model from the literature regarding both the crucial elements in any resilience process model and the (mal)adaptive outcomes following stressful events. Second, we describe the different elements that make up the model. Characteristic of the PI-PE model is that it distinguishes between two pathways of psychological resilience – psychological immunity and psychological elasticity – with four adaptive outcomes, namely sustainability, recovery, transformation and thriving. To explain how people arrive at these different outcomes, we argue that two consecutive mechanisms are critical in these pathways: tolerance and narrative construction. Taken as a whole, the PI-PE model presents a comprehensive framework to inspire both research and practice. It explains how the process of psychological resilience works differently for different people and how to support individuals in their process towards successfully and differently adapting to stressors.



Author(s):  
Stefan Zugal ◽  
Cornelia Haisjackl ◽  
Jakob Pinggera ◽  
Barbara Weber

Declarative approaches to process modeling are regarded as well suited for highly volatile environments, as they provide a high degree of flexibility. However, problems in understanding and maintaining declarative process models impede their usage. To compensate for these shortcomings, Test Driven Modeling (TDM) has been proposed. This paper reports on an empirical investigation in which TDM is viewed from two different angles. First, the impact of TDM on communication is explored in a case study. Results indicate that domain experts are inclined to use test cases for communicating with the model builder (system analyst) and prefer them over the process model. The second part of the investigation, a controlled experiment, investigates the impact of TDM on process model maintenance. Data gathered in this experiment indicate that the adoption of test cases significantly lowers cognitive load and increases the perceived quality of changes.
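To give a flavour of what a test case against a declarative process model can look like, the sketch below checks example execution traces against a hypothetical "response" constraint (every payment request must eventually be followed by an approval). The constraint type, activity names, and traces are invented for illustration and are not taken from the TDM tooling studied in the paper.

```python
# Minimal sketch of a test case for a declarative process model.
# A "response(a, b)" constraint requires that every occurrence of activity a
# is eventually followed by an occurrence of activity b in the same trace.

def satisfies_response(trace, a, b):
    """Return True if every occurrence of a is eventually followed by b."""
    for i, activity in enumerate(trace):
        if activity == a and b not in trace[i + 1:]:
            return False
    return True

# Hypothetical test cases a domain expert might specify (activity names invented).
valid_trace   = ["Request Payment", "Check Invoice", "Approve Payment"]
invalid_trace = ["Request Payment", "Check Invoice"]

assert satisfies_response(valid_trace, "Request Payment", "Approve Payment")
assert not satisfies_response(invalid_trace, "Request Payment", "Approve Payment")
print("all test cases passed")
```

Test cases of this kind pair a concrete trace with an expected outcome, which is why domain experts may find them easier to discuss than the constraint model itself.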



1993 ◽  
Vol 18 (1) ◽  
pp. 5-21 ◽  
Author(s):  
Norris Krueger

Shapero (1975, 1982) proposed an intentionality-based process model of the entrepreneurial event. Entrepreneurial intentions should derive from feasibility and desirability perceptions plus a propensity to act on opportunities. Prior entrepreneurship-related experiences should influence entrepreneurial intentions indirectly through these perceptions. Path analyses found that feasibility perceptions, desirability perceptions, and the propensity to act each proved to be significant antecedents of entrepreneurial intentions. Perceived feasibility was significantly associated with the breadth of prior exposure; perceived desirability was significantly associated with the positiveness of that prior exposure. Strong support was found for Shapero's model, arguing for further application of intentions-based process models of entrepreneurial activity.



10.2196/15374 ◽  
2020 ◽  
Vol 8 (1) ◽  
pp. e15374 ◽  
Author(s):  
Michael Winter ◽  
Rüdiger Pryss ◽  
Thomas Probst ◽  
Manfred Reichert

Background The management and comprehension of business process models are of utmost importance for almost any enterprise. To foster the comprehension of such models, this paper incorporates the idea of a serious game called Tales of Knightly Process. Objective This study aimed to investigate whether the serious game has a positive immediate and follow-up impact on process model comprehension. Methods Two studies, with 81 and 64 participants respectively, were conducted. Participants were assigned to a game group and a control group (ie, study 1) and to a follow-up game group and a follow-up control group (ie, study 2). Four weeks separated study 1 and study 2. In both studies, participants had to answer ten comprehension questions on five different process models. Note that, in study 1, participants in the game group played the serious game before they answered the comprehension questions, to evaluate the impact of the game on process model comprehension. Results In study 1, inferential statistics (analysis of variance) revealed that participants in the game group showed better immediate performance than control group participants (P<.001). A Hedges g of 0.77 also indicated a medium to large effect size. In study 2, follow-up game group participants showed better performance than participants from the follow-up control group (P=.01); here, a Hedges g of 0.82 implied a large effect size. Finally, in both studies, analyses indicated that complex process models are more difficult to comprehend (study 1: P<.001; study 2: P<.001). Conclusions Participants who played the serious game showed better performance in the comprehension of process models in both studies.
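For readers unfamiliar with the effect size reported here, Hedges g is a bias-corrected standardized mean difference. The snippet below computes it for two hypothetical groups; the group sizes and comprehension scores are invented for illustration and are not the study's data.

```python
import numpy as np

def hedges_g(group_a, group_b):
    """Bias-corrected standardized mean difference (Hedges g)."""
    a, b = np.asarray(group_a, float), np.asarray(group_b, float)
    n_a, n_b = len(a), len(b)
    # Pooled standard deviation from the sample variances (ddof=1).
    pooled_sd = np.sqrt(((n_a - 1) * a.var(ddof=1) + (n_b - 1) * b.var(ddof=1))
                        / (n_a + n_b - 2))
    d = (a.mean() - b.mean()) / pooled_sd          # Cohen's d
    correction = 1 - 3 / (4 * (n_a + n_b) - 9)     # small-sample bias correction
    return d * correction

# Hypothetical comprehension scores for a game group and a control group.
rng = np.random.default_rng(0)
game    = rng.normal(7.5, 1.5, 41)
control = rng.normal(6.4, 1.5, 40)
print(f"Hedges g = {hedges_g(game, control):.2f}")
```

Against the conventional benchmarks (about 0.5 for a medium and 0.8 for a large effect), the reported values of 0.77 and 0.82 correspond to the medium-to-large and large effects described in the abstract.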



2019 ◽  
Author(s):  
William Finnigan ◽  
Rhys Cutlan ◽  
Radka Snajdrova ◽  
Joseph P. Adams ◽  
Jennifer A. Littlechild ◽  
...  

Abstract Multi-step enzyme reactions offer considerable cost and productivity benefits. Process models offer a route to understanding the complexity of these reactions and allow for their optimization. Despite the increasing prevalence of multi-step biotransformations, there are few examples of process models for enzyme reactions. From a toolbox of characterized enzyme parts, we demonstrate the construction of a process model for a seven-enzyme, three-step biotransformation using isolated enzymes. Enzymes for cofactor regeneration were employed to make this in vitro reaction economical. Good modelling practice was critical in evaluating the impact of approximations and experimental error. We show that the use and validation of process models was instrumental in realizing and removing process bottlenecks, in identifying divergent behavior, and in optimizing the entire reaction using a genetic algorithm. We validated the optimized reaction to demonstrate that complex multi-step reactions with cofactor recycling involving at least seven enzymes can be reliably modelled and optimized. Significance statement This study examines the challenge of modeling and optimizing multi-enzyme cascades. We detail the development, testing and optimization of a deterministic model of a three-enzyme cascade with four cofactor regeneration enzymes. Significantly, the model could easily be used to predict the optimal concentration of each enzyme in order to obtain maximum flux through the cascade. This prediction was strongly validated experimentally. The success of our model demonstrates that robust models of systems of at least seven enzymes are readily achievable. We highlight the importance of following good modeling practice to evaluate model quality and limitations. Examining deviations from expected behavior provided additional insight into the model and enzymes. This work provides a template for developing larger deterministic models of enzyme cascades.
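A deterministic cascade model of this kind is typically a set of coupled Michaelis-Menten rate equations integrated over time. The sketch below shows a deliberately reduced two-step cascade with a single cofactor-recycling reaction, solved with scipy.integrate.solve_ivp; the rate laws and all kinetic parameters are invented placeholders, not the seven-enzyme system characterized in the paper.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Toy deterministic model of a two-step enzyme cascade (A -> B -> C) in which
# step 1 consumes NAD+ and a third "recycling" enzyme regenerates it from NADH.
# All kinetic parameters are hypothetical placeholders.
params = dict(
    vmax1=2.0, km1_a=0.5, km1_nad=0.2,   # enzyme 1: A + NAD+ -> B + NADH
    vmax2=1.5, km2_b=0.8,                # enzyme 2: B -> C
    vmax3=3.0, km3_nadh=0.1,             # recycling enzyme: NADH -> NAD+
)

def cascade(t, y, p):
    a, b, c, nad, nadh = y
    # Two-substrate Michaelis-Menten for step 1, single-substrate for the rest.
    r1 = p["vmax1"] * a / (p["km1_a"] + a) * nad / (p["km1_nad"] + nad)
    r2 = p["vmax2"] * b / (p["km2_b"] + b)
    r3 = p["vmax3"] * nadh / (p["km3_nadh"] + nadh)
    return [-r1, r1 - r2, r2, -r1 + r3, r1 - r3]

y0 = [10.0, 0.0, 0.0, 1.0, 0.0]          # mM: A, B, C, NAD+, NADH
sol = solve_ivp(cascade, (0.0, 60.0), y0, args=(params,), dense_output=True)

a, b, c, nad, nadh = sol.y[:, -1]
print(f"final product C after 60 min : {c:.2f} mM")
print(f"cofactor pool (NAD+ + NADH)  : {nad + nadh:.2f} mM")
```

Optimizing enzyme loadings, as described in the abstract, would then amount to searching over the vmax parameters (proportional to enzyme concentration) for maximum flux to product, for example with a genetic algorithm.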




