Multi-Period Dynamic Optimization for Large-Scale Differential-Algebraic Process Models under Uncertainty

Processes ◽  
2015 ◽  
Vol 3 (3) ◽  
pp. 541-567 ◽  
Author(s):  
Ian Washington ◽  
Christopher Swartz
Author(s):  
Irina Gaus ◽  
Klaus Wieczorek ◽  
Juan Carlos Mayor ◽  
Thomas Trick ◽  
José-Luis García Siñeriz ◽  
...  

The evolution of the engineered barrier system (EBS) of geological repositories for radioactive waste has been the subject of many research programmes during the last decade. The emphasis of the research activities was on elaborating a detailed understanding of the complex thermo-hydro-mechanical-chemical (THM-C) processes that are expected to evolve in the near field during the early post-closure period. It is important to understand the coupled THM-C processes and their evolution in the EBS during the early post-closure phase so that it can be confirmed that the safety functions will be fulfilled. In particular, it needs to be ensured that interactions during the resaturation phase (heat pulse, gas generation, non-uniform water uptake from the host rock) do not affect the performance of the EBS in terms of its safety-relevant parameters (e.g. swelling pressure, hydraulic conductivity, diffusivity). The 7th Framework PEBS project (Long Term Performance of Engineered Barrier Systems) aims at providing in-depth process understanding for constraining the conceptual and parametric uncertainties in the context of long-term safety assessment. As part of the PEBS project, a series of laboratory and URL experiments is envisaged to describe the EBS behaviour after repository closure, when resaturation is taking place. This paper targets the very early post-closure period, when the EBS is subjected to high temperatures and unsaturated conditions with a low but increasing moisture content. So far, the detailed thermo-hydraulic behaviour of a bentonite EBS in a clay host rock has not been evaluated at a large scale in response to temperatures of up to 140°C at the canister surface, as produced by HLW (and spent fuel) and anticipated in some of the designs considered. Furthermore, earlier THM experiments have shown that the upscaling of thermal conductivity and its dependency on water content and/or humidity from the laboratory scale to the field scale needs further attention. 
This early post-closure thermal behaviour will be elucidated by the HE-E experiment, a 1:2 scale heating experiment set up at the Mont Terri rock laboratory that started in June 2011. It will characterise in detail the thermal conductivity at a large scale in both pure bentonite and a bentonite-sand mixture, as well as in the Opalinus Clay host rock. The HE-E experiment is specifically designed as a model validation experiment at the large scale, and a modelling programme was launched in parallel to the different experimental steps. Scoping calculations were run to support the experimental design, and prediction exercises taking the final design into account are foreseen. Calibration and prediction/validation will follow, making use of the obtained THM dataset. This benchmarking of THM process models and codes should enhance confidence in the predictive capability of the recently developed numerical tools. The ultimate aim is to be able to extrapolate the key parameters that might influence the fulfilment of the safety functions defined for the long-term steady state.


2021 ◽  
Vol 9 (1) ◽  
Author(s):  
Moritz Mercker ◽  
Philipp Schwemmer ◽  
Verena Peschko ◽  
Leonie Enners ◽  
Stefan Garthe

Abstract Background New wildlife telemetry and tracking technologies have become available in the last decade, leading to a large increase in the volume and resolution of animal tracking data. These technical developments have been accompanied by various statistical tools aimed at analysing the data obtained by these methods. Methods We used simulated habitat and tracking data to compare some of the statistical methods most frequently used to infer local resource selection and large-scale attraction/avoidance from tracking data. Notably, we compared spatial logistic regression models (SLRMs), spatio-temporal point process models (ST-PPMs), step selection models (SSMs), and integrated step selection models (iSSMs) and their interplay with habitat and animal movement properties in terms of statistical hypothesis testing. Results We demonstrated that only iSSMs and ST-PPMs showed nominal type I error rates in all studied cases, whereas SSMs may slightly, and SLRMs may frequently and strongly, exceed these levels. iSSMs also appeared, on average, to have more robust and higher statistical power than ST-PPMs. Conclusions Based on our results, we recommend the use of iSSMs to infer habitat selection or large-scale attraction/avoidance from animal tracking data. Further advantages over other approaches include short computation times, predictive capacity, and the possibility of deriving mechanistic movement models.
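As a rough illustration of the used-versus-available comparison underlying step-selection approaches, the sketch below simulates strata of one used step plus random available steps and recovers the selection coefficient by maximizing the conditional (stratum-wise) logistic likelihood. The covariate, true coefficient, and crude ternary search are illustrative assumptions only, not the methods compared in the paper.

```python
import math
import random

random.seed(1)

# Simulate step-selection data: for each observed step, K "available"
# steps are sampled; a habitat covariate x drives selection with TRUE_BETA.
TRUE_BETA = 0.8
K = 10    # available (control) steps per stratum
N = 500   # number of strata (observed steps)

strata = []
for _ in range(N):
    xs = [random.gauss(0, 1) for _ in range(K + 1)]
    # pick the "used" step with probability proportional to exp(beta * x)
    w = [math.exp(TRUE_BETA * x) for x in xs]
    r, acc, used = random.uniform(0, sum(w)), 0.0, 0
    for i, wi in enumerate(w):
        acc += wi
        if r <= acc:
            used = i
            break
    strata.append((used, xs))

def neg_loglik(beta):
    # conditional logistic likelihood: used step vs. its availability set
    ll = 0.0
    for used, xs in strata:
        denom = sum(math.exp(beta * x) for x in xs)
        ll += beta * xs[used] - math.log(denom)
    return -ll

# ternary search for the MLE (the negative log-likelihood is convex in beta)
lo, hi = -5.0, 5.0
for _ in range(60):
    m1, m2 = lo + (hi - lo) / 3, hi - (hi - lo) / 3
    if neg_loglik(m1) < neg_loglik(m2):
        hi = m2
    else:
        lo = m1
beta_hat = (lo + hi) / 2
print(round(beta_hat, 2))  # close to TRUE_BETA
```

In practice this conditional-likelihood fit is what dedicated packages perform; the point of the sketch is only that conditioning on each stratum's availability set is what distinguishes SSM-type estimators from the pooled spatial logistic regression that inflated type I error rates in the comparison.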


Author(s):  
Ralf Knauss ◽  
Lukas E. Wiesegger ◽  
Rolf Marr ◽  
Jürgen J. Brandner

By arranging micro-structured equipment into plants, whole production processes can be realized with maximum efficiency in the tightest space. Unit operations are thereby represented as individual functional modules in the shape of micro devices. In a multi-unit-operation plant, a correspondingly large number of manipulable variables has to be coordinated. Due to the design of micro-scaled devices, such plants form sophisticated systems for which no commonly satisfying solutions for fully optimized control yet exist. A system of modular, discontinuous phase-contacting micro rectification consists of the unit operations heating, cooling, mixing and separating. Heat exchangers, mixers and cyclones for phase separation can be arranged into a counter-current rectification system with maximum mass-transfer efficiency in every unit. When operating an electrically heated evaporator for modular rectification purposes, a strong coupling of mass flow with the vapor fraction and the outlet temperature can be observed [4]. Operating at a predefined state for mass flow, temperature and vapor fraction may only be possible with difficulty using traditional methods of linear control technology. For dynamic optimization of the multivariable micro-structured evaporator, the principle of Nonlinear Model Predictive Control (NMPC) was generically formulated in C++ and implemented in LabVIEW 7. At every discrete time step, an objective function is generated from nonlinear process models in the form of grouped NARX polynomials, and optimal sequences of control actions for plant operation are evolved. The resulting constrained cost function is non-convex, making the detection of a local optimum a difficult task. This obstacle can be circumvented by using a heuristic optimization algorithm in combination with traditional techniques. Based on experimental results, it was demonstrated that NMPC keeps the coupled variables mass flow and temperature on their set-points in an energy-saving manner with minimal control activity over the entire two-phase region.


2016 ◽  
Vol 20 (suppl. 1) ◽  
pp. 59-67 ◽  
Author(s):  
María Erans ◽  
Dawid Hanak ◽  
Jordi Mir ◽  
Edward Anthony ◽  
Vasilije Manovic

Calcium looping (CaL) is promising for large-scale CO2 capture in the power generation and industrial sectors due to the cheap sorbent used and the relatively low energy penalties achieved with this process. Because of the high operating temperatures, heat utilisation is a major advantage of the process, since a significant amount of power can be generated from the recovered heat. However, this increases the complexity and capital costs of the system. Therefore, not only is the energy efficiency performance important for these cycles, but the capital costs must also be taken into account, i.e. techno-economic analyses are required in order to determine which parameters and configurations are optimal for enhancing technology viability in different integration scenarios. In this study, integration scenarios of CaL cycles and natural gas combined cycles (NGCC) are explored. Process models of the NGCC and the CaL capture plant are developed to identify the most promising scenarios for NGCC-CaL integration with regard to efficiency penalties. Two scenarios are analysed in detail, showing that the system with a heat recovery steam generator (HRSG) before and after the capture plant exhibited a better performance, with 49.1% efficiency compared with 45.7% when only one HRSG is located after the capture plant. However, the techno-economic analyses showed that the more energy-efficient case, with two HRSGs, implies a relatively higher cost of electricity (COE) of 44.1 €/MWh, compared to that of the reference plant system (33.1 €/MWh). The predicted cost of CO2 avoided for the case with two HRSGs is 29.3 €/t CO2.
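The reported figures fit the standard definition of cost of CO2 avoided, (COE with capture minus reference COE) divided by the avoided emission intensity. The back-calculation below is illustrative arithmetic only; the implied intensity is not a number stated in the paper.

```python
# COE figures from the study, in EUR/MWh
coe_ref, coe_cap = 33.1, 44.1   # reference plant vs. two-HRSG capture case
avoided_cost = 29.3             # reported EUR per tonne of CO2 avoided

# cost of CO2 avoided = (COE_cap - COE_ref) / (e_ref - e_cap)
# so the implied avoided emission intensity (e_ref - e_cap) is:
delta_e = (coe_cap - coe_ref) / avoided_cost   # t CO2 per MWh
print(round(delta_e, 3))  # 0.375
```

An implied avoided intensity of roughly 0.37 t CO2/MWh is in the plausible range for a natural gas combined cycle, which is a quick consistency check on the three reported numbers.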


2013 ◽  
Author(s):  
Jerald E. Jones ◽  
Valerie L. Rhoades ◽  
Mark D. Mann

A DARPA program a decade ago concentrated on flexible manufacturing, which included the development of manufacturing processes that were Direct CAD-to-Part, and of "smart" robot/process controllers with embedded process models that could make process adjustments on-the-fly and produce perfect or near-perfect parts with no human intervention. A team headed by two of the authors began the development of the LITS-Form (Laser Induced Thermal Strain Forming) process. The LITS-Form process has evolved, supported in part by both Navy and Department of Energy Small Business Innovative Research (SBIR) funding, and the addition of induction heating has made the process much faster and more efficient for thicker metal. The process is similar to thermal forming using Line Heating, which many shipyards use to make complex 3D shapes. However, there are several significant advances in the LITS-Form technology. The physics of the LITS-Form system is fundamentally different from manual Line Heating, allowing it to operate up to 100 times as fast. LITS-Form can produce parts with greater accuracy than Line Heating, and is faster than conventional metal forming operations such as the linear brake press.


Author(s):  
O. Takaki ◽  
T. Seino ◽  
N. Izumi ◽  
K. Hasida

In agile software development, it is imperative for stakeholders such as the users and developers of an information system to collaborate in designing and developing the system by sharing their knowledge. Especially in the development of a large-scale information system, such collaboration among stakeholders is important but difficult to achieve. This chapter introduces a modeling method of business processes for requirements analysis and a development framework based on Web-process architectures. The modeling method makes it easier for stakeholders to agree upon requirements, and it employs a formal method that allows business process models to satisfy both understandability and accuracy. The development framework, in turn, enables rapid spiral development in short-term cycles through the collaboration of developers and users. This chapter also introduces an example that compares the workloads of two requirements analyses in large-scale system developments for a government service and a financial accounting service, in order to evaluate the advantages of the proposed modeling method.


2014 ◽  
pp. 1014-1035
Author(s):  
O. Takaki ◽  
T. Seino ◽  
N. Izumi ◽  
K. Hasida



2020 ◽  
Vol 10 (4) ◽  
pp. 1493 ◽  
Author(s):  
Kwanghoon Pio Kim

In this paper, we propose an integrated approach for seamlessly and effectively providing mining and analysis functionalities to support the redesign of very large-scale and massively parallel process models discovered from their enactment event logs. The integrated approach aims at analyzing not only their structural complexity and correctness but also their animation-based behavioral properness, and it is concretized as a sophisticated analyzer. The core function of the analyzer is to discover a very large-scale and massively parallel process model from a process log dataset and to validate the structural complexity and the syntactical and behavioral properness of the discovered process model. Finally, this paper provides a detailed description of the system architecture and its functional integration of process mining and process analysis. More precisely, we devise a series of functional algorithms for extracting the structural constructs and for visualizing the behavioral properness of the discovered very large-scale and massively parallel process models. As experimental validation, we apply the proposed approach and analyzer to a couple of process enactment event log datasets available on the website of the 4TU.Centre for Research Data.
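As a toy illustration of the discovery step (not the authors' algorithms), the sketch below extracts the directly-follows relation, the structural construct most control-flow mining techniques start from, out of a miniature event log; the activity names are invented for the example.

```python
from collections import defaultdict

# Toy event log: one trace per case, as an ordered list of activities.
log = [
    ["register", "check", "approve", "archive"],
    ["register", "check", "reject", "archive"],
    ["register", "check", "approve", "archive"],
]

# Count how often activity a is directly followed by activity b.
dfg = defaultdict(int)
for trace in log:
    for a, b in zip(trace, trace[1:]):
        dfg[(a, b)] += 1

for (a, b), n in sorted(dfg.items()):
    print(f"{a} -> {b}: {n}")
```

Real discovery on massively parallel models layers much more on top of this relation (concurrency detection, filtering, soundness checks), which is where the structural-complexity and behavioral-properness validation described above comes in.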

