Hybrid Process Models in Electrochemical Syntheses under Deep Uncertainty

Processes, 2021, Vol. 9 (4), pp. 704
Author(s): Fenila Francis-Xavier, Fabian Kubannek, René Schenkendorf

Chemical process engineering and machine learning are merging rapidly, and hybrid process models have shown promising results in process analysis and process design. However, uncertainties in first-principles process models have an adverse effect on extrapolations and inferences based on hybrid process models. Parameter sensitivities are an essential tool for better understanding the underlying uncertainty propagation and hybrid system identification challenges. Still, standard parameter sensitivity concepts may fail to address comprehensive parameter uncertainty problems, i.e., deep uncertainty with aleatoric and epistemic contributions. This work presents a highly effective and reproducible sampling strategy for calculating simulation uncertainties and global parameter sensitivities of hybrid process models under deep uncertainty. We demonstrate the workflow with two electrochemical synthesis simulation studies: the synthesis of furfuryl alcohol and of 4-aminophenol. The proposed strategy significantly reduced CPU time compared with Monte Carlo reference simulations. The general findings of the hybrid model sensitivity studies under deep uncertainty are twofold. First, epistemic uncertainty has a significant effect on uncertainty analysis. Second, the predicted parameter sensitivities of the hybrid process models add value to the interpretation and analysis of the hybrid models themselves, but they are not suitable for predicting the sensitivities of the real process or the full first-principles process model.
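As an illustration of the two-level sampling idea the abstract describes (an outer loop over epistemic, interval-valued parameters and an inner loop over aleatoric, distributional ones), consider this minimal Python sketch with a hypothetical Arrhenius-type stand-in model; all names and numbers are illustrative assumptions, not the authors' workflow:

```python
import numpy as np

rng = np.random.default_rng(0)

def model(k, e_a, temp=300.0):
    """Hypothetical stand-in for a first-principles rate model (Arrhenius form)."""
    return k * np.exp(-e_a / (8.314 * temp))

# Aleatoric layer: the rate constant k varies naturally with a known distribution.
# Epistemic layer: the activation energy e_a is known only as an interval.
E_A_INTERVAL = (4.0e4, 5.0e4)          # J/mol
N_EPI, N_ALE = 50, 1000

means = []
for e_a in np.linspace(*E_A_INTERVAL, N_EPI):           # outer (epistemic) loop
    k = rng.lognormal(mean=0.0, sigma=0.1, size=N_ALE)  # inner (aleatoric) loop
    means.append(model(k, e_a).mean())

print(f"epistemic band of the mean response: [{min(means):.3e}, {max(means):.3e}]")
```

The outer loop yields a band of statistics rather than a single value, which is what distinguishes deep-uncertainty analysis from a plain Monte Carlo run.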

2013, Vol. 18 (7), pp. 1393–1403
Author(s): Julie Clavreul, Dominique Guyonnet, Davide Tonini, Thomas H. Christensen

2016, Vol. 8 (1), pp. 21–32
Author(s): Patrik Söderberg, Kaj Björkqvist, Karin Österman

Purpose – Recent studies indicate that exposure to physical punishment is associated with both aggressive behavior and peer victimization at school. The purpose of this paper is to explore the bidirectional relationship between aggressive behavior and peer victimization as outcomes of physical punishment, as well as the role of depressive symptoms.

Design/methodology/approach – A sample of 2,424 Finnish upper primary school pupils (1,282 girls, 1,148 boys, mean age = 14.2, SD = 1.0) completed an online survey during class. Two conditional process models were applied using a macro for SPSS developed by Hayes (2012).

Findings – Exposure to physical punishment was found to be associated with both aggressive behavior and peer victimization at school. The effect on victimization was partially mediated by aggressive behavior and depressive symptoms, whereas the effect on aggressive behavior was partially mediated by peer victimization experiences but not by depressive symptoms. The relationship between physical punishment and peer victimization was somewhat stronger for girls than for boys, but this effect was not accounted for by gender differences in depressive symptoms or aggressive behavior.

Originality/value – Few studies to date have addressed the connection between aggressive behavior and peer victimization as outcomes of physical punishment. In addition, the study expands on the concept of “victim personality” by examining the mediating role of depressive symptoms. Methodologically, the study is an example of how the statistical software SPSS can be used for multiple mediation and conditional process analysis as an alternative to SEM analyses.
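On the method: the indirect (mediated) effect in such models is the product of the X→M and M→Y path coefficients, typically tested with a percentile bootstrap. A minimal Python sketch of that logic on hypothetical simulated data (the study itself used Hayes's PROCESS macro for SPSS; variable names and effect sizes here are invented for illustration):

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 500
# Hypothetical simulated data: x = physical punishment, m = aggression, y = victimization
x = rng.normal(size=n)
m = 0.4 * x + rng.normal(size=n)
y = 0.3 * x + 0.5 * m + rng.normal(size=n)

def indirect(x, m, y):
    a = sm.OLS(m, sm.add_constant(x)).fit().params[1]                        # path a: X -> M
    b = sm.OLS(y, sm.add_constant(np.column_stack((x, m)))).fit().params[2]  # path b: M -> Y given X
    return a * b

boot = np.empty(2000)
for i in range(2000):                    # percentile bootstrap of the indirect effect
    idx = rng.integers(0, n, size=n)
    boot[i] = indirect(x[idx], m[idx], y[idx])

print("indirect effect:", round(indirect(x, m, y), 3))
print("95% bootstrap CI:", np.percentile(boot, [2.5, 97.5]).round(3))
```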


Author(s): Alessandra Cuneo, Alberto Traverso, Shahrokh Shahpar

In engineering design, uncertainty is inevitable and can cause a significant deviation in the performance of a system. Uncertainty in input parameters can be categorized into two groups: aleatory and epistemic uncertainty. The work presented here focuses on aleatory uncertainty, which causes natural, unpredictable, and uncontrollable variations in the performance of the system under study. Such uncertainty can be quantified using statistical methods, but the main obstacle is often the computational cost, because the representative model is typically highly non-linear and complex. It is therefore necessary to have a robust tool that can perform uncertainty propagation with as few evaluations as possible. In recent years, different methodologies for uncertainty propagation and quantification have been proposed. The focus of this study is to evaluate four methods and demonstrate the strengths and weaknesses of each approach. The first method considered is Monte Carlo simulation, a sampling method that can give high accuracy but requires relatively large computational effort. The second is Polynomial Chaos, an approximation method in which the probabilistic parameters of the response function are modelled with orthogonal polynomials. The third is the Mid-range Approximation Method, an approach based on assembling multiple meta-models into one model to perform optimization under uncertainty. The fourth applies the first two methods not to the model directly but to a response surface representing it, in order to decrease the computational cost. All of these methods were applied to a set of analytical test functions and engineering test cases. Relevant aspects of engineering design and analysis, such as a high number of stochastic variables and optimized design problems with and without stochastic design parameters, were assessed. Polynomial Chaos emerged as the most promising methodology and was then applied to a turbomachinery test case based on a thermal analysis of a high-pressure turbine disk.
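To make the trade-off concrete, here is a minimal one-dimensional Python sketch contrasting the first two methods on a hypothetical test function: Monte Carlo needs many model evaluations, while a least-squares polynomial chaos expansion in probabilists' Hermite polynomials recovers the mean and variance from far fewer. The function, sample sizes, and degree are illustrative assumptions:

```python
import math
import numpy as np
from numpy.polynomial.hermite_e import hermevander

rng = np.random.default_rng(2)
f = lambda x: np.exp(0.5 * x)       # hypothetical test function of x ~ N(0, 1)

# Monte Carlo: accurate but needs many model evaluations
y_mc = f(rng.normal(size=100_000))
print("MC  mean/var:", y_mc.mean(), y_mc.var())

# Polynomial Chaos: least-squares fit of probabilists' Hermite polynomials He_k
deg = 6
x_tr = rng.normal(size=50)          # only 50 model evaluations
coef, *_ = np.linalg.lstsq(hermevander(x_tr, deg), f(x_tr), rcond=None)
# Orthogonality of He_k under N(0,1): mean = c_0, variance = sum_{k>=1} k! c_k^2
var_pce = sum(math.factorial(k) * coef[k] ** 2 for k in range(1, deg + 1))
print("PCE mean/var:", coef[0], var_pce)
```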


2021, Vol. 27 (2), pp. 638–657
Author(s): Fredrik Milani, Luciano García-Bañuelos, Svitlana Filipova, Mariia Markovska

Purpose – Blockchain technology is increasingly positioned as a promising and disruptive technology. Such a promise has attracted companies to explore how blockchain technology can be used to gain significant benefits. Process models play a cardinal role when seeking to improve business processes, as they are the foundation of process analysis and redesign. This paper examines how blockchain-oriented processes can be conceptually modelled with activity-centric (BPMN) and artifact-centric (CMMN) modelling paradigms.

Design/methodology/approach – This paper discusses how commonly occurring patterns specific to blockchain-based applications can be modelled with BPMN and CMMN. Furthermore, the advantages and disadvantages of both notations for accurately representing blockchain-specific patterns are discussed.

Findings – The main finding of this paper is that neither BPMN nor CMMN can adequately and accurately represent certain patterns specific to blockchain-oriented processes. BPMN, while supporting most of the patterns, does not provide sufficient support to represent tokenization. CMMN, on the other hand, does not provide support to distinguish between activities executed and data stored on-chain versus off-chain.

Originality/value – The paper provides insight into the strengths and weaknesses of BPMN and CMMN for modelling processes to be supported by blockchain. This will aid analysts in producing better process models for communication purposes and thereby facilitate the development of blockchain-based solutions.


Author(s): Tuğba Gürgen, Ayça Tarhan, N. Alpay Karagöz

The verification of process implementations against specifications is a critical step of process management, and it must be performed according to objective criteria and evidence. This study describes an integrated infrastructure that utilizes process mining for software process verification, together with case studies carried out using this infrastructure. Software that applies process mining algorithms to software process verification was developed as a plugin to the open-source EPF Composer tool, which supports the management of software and system engineering processes. In three case studies, bug management, task management, and defect management processes were verified against defined and established process models (created in EPF Composer) by applying this plugin to real process data. Among these, the results of the case study performed at a large, leading IT solutions company in Turkey are notable for demonstrating opportunities for process improvement.
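The plugin itself is not shown here; as a rough sketch of the same verification idea using the open-source pm4py library, with a hypothetical event log path and a discovered (rather than EPF Composer-exported) reference model as stated assumptions:

```python
import pm4py

# Hypothetical event log of a bug-management process (XES format)
log = pm4py.read_xes("bug_management.xes")

# Reference model: discovered from the log here; in the study it would be the
# process defined and established in EPF Composer
net, im, fm = pm4py.discover_petri_net_inductive(log)

# Token-based replay verifies each case against the reference model
diagnostics = pm4py.conformance_diagnostics_token_based_replay(log, net, im, fm)
fitting = sum(d["trace_is_fit"] for d in diagnostics)
print(f"{fitting}/{len(diagnostics)} cases conform to the reference model")
```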


Author(s): Wil M. P. van der Aalst, Riccardo De Masellis, Chiara Di Francescomarino, Chiara Ghidini

2017, Vol. 5 (1), pp. 21–46
Author(s): Daniel E. J. Hobley, Jordan M. Adams, Sai Siddhartha Nudurupati, Eric W. H. Hutton, Nicole M. Gasparini, ...

Abstract. The ability to model surface processes and to couple them to both subsurface and atmospheric regimes has proven invaluable to research in the Earth and planetary sciences. However, creating a new model typically demands a very large investment of time, and modifying an existing model to address a new problem typically means the new work is constrained to its detriment by model adaptations for a different problem. Landlab is an open-source software framework explicitly designed to accelerate the development of new process models by providing (1) a set of tools and existing grid structures – including both regular and irregular grids – to make it faster and easier to develop new process components, or numerical implementations of physical processes; (2) a suite of stable, modular, and interoperable process components that can be combined to create an integrated model; and (3) a set of tools for data input, output, manipulation, and visualization. A set of example models built with these components is also provided. Landlab's structure makes it ideal not only for fully developed modelling applications but also for model prototyping and classroom use. Because of its modular nature, it can also act as a platform for model intercomparison and epistemic uncertainty and sensitivity analyses. Landlab exposes a standardized model interoperability interface, and is able to couple to third-party models and software. Landlab also offers tools to allow the creation of cellular automata, and allows native coupling of such models to more traditional continuous differential equation-based modules. We illustrate the principles of component coupling in Landlab using a model of landform evolution, a cellular ecohydrologic model, and a flood-wave routing model.
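For a flavor of the component-coupling pattern the abstract describes, here is a minimal sketch assuming current Landlab (2.x) names: a flow-accumulation component and a stream-power erosion component coupled through shared grid fields. Grid size, time step, and rate constants are illustrative assumptions:

```python
import numpy as np
from landlab import RasterModelGrid
from landlab.components import FlowAccumulator, FastscapeEroder

# Minimal landform-evolution coupling: flow routing feeds stream-power erosion
grid = RasterModelGrid((50, 50), xy_spacing=100.0)
z = grid.add_zeros("topographic__elevation", at="node")
z += np.random.default_rng(3).random(z.size)      # small random initial relief

fa = FlowAccumulator(grid)                        # component 1: flow accumulation
sp = FastscapeEroder(grid, K_sp=1e-5)             # component 2: stream-power erosion

dt = 1000.0                                       # years
for _ in range(500):
    z[grid.core_nodes] += 0.001 * dt              # uniform uplift of interior nodes
    fa.run_one_step()                             # components read and write shared
    sp.run_one_step(dt)                           # grid fields, which couples them
```

The coupling happens entirely through the named fields on the grid, so swapping in a different erosion or routing component requires no changes to the driver loop.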


2011, Vol. 133 (2)
Author(s): Kais Zaman, Mark McDonald, Sankaran Mahadevan

This paper develops and illustrates a probabilistic approach for uncertainty representation and propagation in system analysis, when the information on the uncertain input variables and/or their distribution parameters may be available as either probability distributions or simply intervals (single or multiple). A unique aggregation technique is used to combine multiple interval data and to compute rigorous bounds on the system response cumulative distribution function. The uncertainty described by interval data is represented through a flexible family of probability distributions. Conversion of interval data to a probabilistic format enables the use of computationally efficient methods for probabilistic uncertainty propagation. Two methods are explored for the implementation of the proposed approach, based on (1) sampling and (2) optimization. The sampling-based strategy is more expensive and tends to underestimate the output bounds. The optimization-based methodology improves both aspects. The proposed methods are used to develop new solutions to challenge problems posed by the Sandia epistemic uncertainty workshop (Oberkampf et al., 2004, “Challenge Problems: Uncertainty in System Response Given Uncertain Parameters,” Reliab. Eng. Syst. Saf., 85, pp. 11–19). Results for the challenge problems are compared with earlier solutions.
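As an illustration of the sampling-based variant, this minimal Python sketch (hypothetical response function, interval, and sample sizes) sweeps an interval-valued distribution parameter and envelopes the resulting empirical CDFs into bounds on the response CDF:

```python
import numpy as np

rng = np.random.default_rng(4)

def response(x):
    return x ** 2 + 1.0                  # hypothetical system response

# Epistemic information: the input's mean is known only as an interval
MU_INTERVAL, SIGMA = (1.0, 2.0), 0.5
y_grid = np.linspace(0.0, 12.0, 200)

cdf_lo, cdf_hi = np.ones_like(y_grid), np.zeros_like(y_grid)
for mu in np.linspace(*MU_INTERVAL, 25):             # sweep the interval (sampling-based)
    y = response(rng.normal(mu, SIGMA, size=5000))
    cdf = (y[:, None] <= y_grid).mean(axis=0)        # empirical CDF on the grid
    cdf_lo, cdf_hi = np.minimum(cdf_lo, cdf), np.maximum(cdf_hi, cdf)

# [cdf_lo, cdf_hi] now bounds the response CDF (a p-box). A coarse sweep can miss
# the extremal parameter values, which is why sampling tends to underestimate the
# bounds relative to the optimization-based variant the paper recommends.
print(cdf_lo[100], cdf_hi[100])
```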


2019
Author(s): Scott Strobel, Lucille Knowles, Nitin Nitin, Herbert Scher, Tina Jeoh

The food, chemical, and biotechnology industries offer many potential applications for calcium alginate microencapsulation, but this technique is largely confined to the laboratory bench due to scalability challenges. Scaling up the traditional external gelation method requires several costly unit operations. Alternatively, a consolidated process accomplishes alginate cross-linking in situ during spray-drying to form cross-linked alginate microcapsules (‘the CLAMs process’). This work examined the process economics of these two microencapsulation processes through technoeconomic analysis. Parallel batch process models were constructed in SuperPro Designer, initially for encapsulating emulsified fish oil. At all production scales examined, the capital investment and annual operating cost were lower for the CLAMs process. Modifying the external gelation process marginally improved the process economics, but costs remained elevated. The CLAMs process’ economic advantage stemmed from reducing the number of unit procedures, which lowered the equipment purchase cost and the dependent components of capital investment and annual operating cost. Upon modifying the models for microencapsulating hydrophilic cargo (e.g. enzymes, vitamins, microbial concentrates), the CLAMs process remained favorable at all cargo material costs and cargo loadings examined. This work demonstrates the utility of technoeconomic analysis for evaluating microencapsulation processes and may justify applying the CLAMs process at the industrial scale.
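To indicate what such a comparison reduces to at its core, here is a toy sketch with openly hypothetical cost figures; it only mirrors the qualitative mechanism the abstract identifies (fewer unit procedures lower equipment cost and the capital-dependent share of operating cost), not the SuperPro Designer models themselves:

```python
# Toy technoeconomic comparison with illustrative, hypothetical numbers; the study
# itself built full batch process models in SuperPro Designer.
LANG_FACTOR = 4.0    # total capital investment as a multiple of purchased equipment cost

def tea(equipment_costs, annual_materials, annual_labor):
    capex = LANG_FACTOR * sum(equipment_costs)
    # facility-dependent operating cost scales with capital, so fewer unit
    # procedures lower both capex and the dependent share of opex
    opex = annual_materials + annual_labor + 0.10 * capex
    return capex, opex

# External gelation: more unit procedures -> more equipment items
print("external gelation (capex, opex):", tea([1.2e6, 0.8e6, 0.6e6, 0.5e6], 2.0e6, 1.0e6))
# CLAMs: cross-linking consolidated into the spray-drying step
print("CLAMs             (capex, opex):", tea([1.2e6, 0.9e6], 2.0e6, 0.8e6))
```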


