The multi-assumption architecture and testbed (MAAT v1.0): R code for generating ensembles with dynamic model structure and analysis of epistemic uncertainty from multiple sources

2018 ◽  
Vol 11 (8) ◽  
pp. 3159-3185 ◽  
Author(s):  
Anthony P. Walker ◽  
Ming Ye ◽  
Dan Lu ◽  
Martin G. De Kauwe ◽  
Lianhong Gu ◽  
...  

Abstract. Computer models are ubiquitous tools used to represent systems across many scientific and engineering domains. For any given system, many computer models exist, each built on different assumptions and demonstrating variability in the ways in which these systems can be represented. This variability is known as epistemic uncertainty, i.e. uncertainty in our knowledge of how these systems operate. Two primary sources of epistemic uncertainty are (1) uncertain parameter values and (2) uncertain mathematical representations of the processes that comprise the system. Many formal methods exist to analyse parameter-based epistemic uncertainty, while process-representation-based epistemic uncertainty is often analysed post hoc, incompletely, informally, or is ignored. In this model description paper we present the multi-assumption architecture and testbed (MAAT v1.0) designed to formally and completely analyse process-representation-based epistemic uncertainty. MAAT is a modular modelling code that can simply and efficiently vary model structure (process representation), allowing for the generation and running of large model ensembles that vary in process representation, parameters, parameter values, and environmental conditions during a single execution of the code. MAAT v1.0 approaches epistemic uncertainty through sensitivity analysis, assigning variability in model output to processes (process representation and parameters) or to individual parameters. In this model description paper we describe MAAT and, by using a simple groundwater model example, verify that the sensitivity analysis algorithms have been correctly implemented. The main system model currently coded in MAAT is a unified, leaf-scale enzyme kinetic model of C3 photosynthesis. In the Appendix we describe the photosynthesis model and the unification of multiple representations of photosynthetic processes. The numerical solution to leaf-scale photosynthesis is verified and examples of process variability in temperature response functions are provided. For rapid application to new systems, the MAAT algorithms for efficient variation of model structure and sensitivity analysis are agnostic of the specific system model employed. Therefore MAAT provides a tool for the development of novel or toy models in many domains, i.e. not only photosynthesis, facilitating rapid informal and formal comparison of alternative modelling approaches.
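The pattern MAAT implements, i.e. pluggable process representations combined with ensemble generation and attribution of output variance to processes and parameters, can be illustrated with a minimal sketch. The example below is written in Python rather than MAAT's R, and every function and variable name in it is hypothetical; it is not MAAT's interface, only a rough illustration of varying model structure in a factorial ensemble and crudely attributing output variance to each factor.

```python
# Minimal sketch (Python, not MAAT's R interface): swap alternative process
# representations in and out of a toy system model, run the full factorial
# ensemble, and crudely attribute output variance to each factor.
# All names and values here are illustrative, not part of MAAT.

import itertools
import numpy as np

# Two alternative representations ("assumptions") for each of two processes
def decay_linear(x, k):    return -k * x
def decay_quadratic(x, k): return -k * x**2

def input_constant(t, a):  return a
def input_seasonal(t, a):  return a * (1 + 0.5 * np.sin(2 * np.pi * t / 365))

process_options = {
    "decay": [decay_linear, decay_quadratic],
    "input": [input_constant, input_seasonal],
}
param_samples = {"k": [0.01, 0.05], "a": [1.0, 2.0]}

def run_model(decay, inflow, k, a, days=365, x0=10.0):
    """Toy system model: daily storage update with pluggable process functions."""
    x = x0
    for t in range(days):
        x += inflow(t, a) + decay(x, k)
    return x

# Full factorial ensemble over process representations and parameter values
results = []
for decay, inflow, k, a in itertools.product(
        process_options["decay"], process_options["input"],
        param_samples["k"], param_samples["a"]):
    results.append({"decay": decay.__name__, "input": inflow.__name__,
                    "k": k, "a": a, "y": run_model(decay, inflow, k, a)})

# Crude process-level sensitivity: fraction of output variance explained by each factor
y = np.array([r["y"] for r in results])
for factor in ("decay", "input", "k", "a"):
    groups = {}
    for r in results:
        groups.setdefault(r[factor], []).append(r["y"])
    between = np.var([np.mean(v) for v in groups.values()])
    print(f"{factor}: between-group variance fraction ~ {between / np.var(y):.2f}")
```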



1989 ◽  
Vol 21 (4-5) ◽  
pp. 305-314
Author(s):  
J. P. Lumbers ◽  
S. C. Cook ◽  
G. A. Thomas

An application of a dynamic model of the activated sludge process is described within the context of real-time river basin management. The model has been calibrated and validated on independent data and then applied to investigate losses of nitrification at the Mogden Works. Monte Carlo simulation and generalised sensitivity analysis were found to be effective ways of identifying appropriate parameter values and their importance. The prediction of unmeasured states such as the autotroph population enabled the effects of alternative control actions to be better understood and the most suitable measures found.
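As a rough illustration of the generalised sensitivity analysis mentioned above (in the Hornberger and Spear style of regional sensitivity analysis), the sketch below samples parameters by Monte Carlo, classifies runs as behavioural or non-behavioural, and compares the two parameter distributions. It uses an invented toy substrate model in place of the authors' activated sludge model, so all values and thresholds are illustrative only.

```python
# Generic sketch of generalised (regional) sensitivity analysis: Monte Carlo
# sampling, behavioural classification, and a Kolmogorov-Smirnov comparison of
# parameter distributions. The toy model and thresholds are invented.

import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(0)

def toy_model(mu_max, K_s, S_in=30.0, S0=5.0, steps=200, dt=0.1):
    """Toy substrate balance standing in for the full activated sludge model."""
    S = S0
    for _ in range(steps):
        S += dt * (S_in * 0.01 - mu_max * S / (K_s + S))
    return S

# Monte Carlo sampling of uncertain parameters
n = 5000
mu_max = rng.uniform(0.5, 6.0, n)
K_s = rng.uniform(1.0, 100.0, n)
effluent = np.array([toy_model(m, k) for m, k in zip(mu_max, K_s)])

# Classify behavioural runs (e.g. effluent substrate below a target)
behavioural = effluent < 4.0

# A large KS distance between behavioural and non-behavioural parameter
# distributions flags an influential parameter.
for name, values in [("mu_max", mu_max), ("K_s", K_s)]:
    res = ks_2samp(values[behavioural], values[~behavioural])
    print(f"{name}: KS distance = {res.statistic:.2f}")
```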


2021 ◽  
Vol 9 (5) ◽  
pp. 467
Author(s):  
Mostafa Farrag ◽  
Gerald Corzo Perez ◽  
Dimitri Solomatine

Many grid-based spatial hydrological models suffer from the complexity of setting up a coherent spatial structure to calibrate such a complex, highly parameterized system. Essential aspects of model building, such as the spatial resolution, the limitations of the routing equation, and the calibration of spatial parameters, all influence the modeling results, yet these decisions are often made without adequate analysis. In this research, the grid discretization level, the integration of processes, and the routing concepts are analyzed experimentally. The HBV-96 model is set up for each cell, and the cells are then integrated into an interlinked modeling system (Hapi). The Jiboa River Basin in El Salvador is used as a case study. The first concept tested is the temporal response of the model structure, which is closely linked to the runoff dynamics. By changing the description of the runoff generation model, we explore the responses to events. Two routing models are considered: Muskingum, which routes the runoff from each cell along the river network, and Maxbas, which routes the runoff directly to the outlet. The second concept is the spatial representation, where the model is built and tested at different spatial resolutions (500 m, 1 km, 2 km, and 4 km). The results show that sensitivity to spatial resolution is closely linked to the routing method; the choice of routing influenced model performance more than the spatial discretization did, and coarser discretization makes the model simpler and computationally faster. A slight performance improvement is gained by using different parameter values for each cell. The 2 km cell size was found to give the lowest model error. The proposed hydrological modeling codes have been published as open source.
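The two routing concepts contrasted above can be sketched in a few lines. The snippet below is a stand-alone Python illustration, not the Hapi implementation: Muskingum routes a hydrograph through a reach using the classical coefficient form, while a Maxbas-style scheme convolves cell runoff with a triangular weighting function straight to the outlet. All parameter values and the input series are invented.

```python
# Stand-alone sketch of the two routing ideas; not the Hapi code.

import numpy as np

def muskingum(inflow, K=1.5, X=0.2, dt=1.0):
    """Classic Muskingum routing: O[t+1] = C0*I[t+1] + C1*I[t] + C2*O[t]."""
    D = 2.0 * K * (1.0 - X) + dt
    c0 = (dt - 2.0 * K * X) / D
    c1 = (dt + 2.0 * K * X) / D
    c2 = (2.0 * K * (1.0 - X) - dt) / D
    out = np.zeros_like(inflow, dtype=float)
    out[0] = inflow[0]
    for t in range(len(inflow) - 1):
        out[t + 1] = c0 * inflow[t + 1] + c1 * inflow[t] + c2 * out[t]
    return out

def maxbas(runoff, base=4):
    """Maxbas-style routing: convolve runoff with a triangular unit hydrograph."""
    w = np.minimum(np.arange(1, base + 1), np.arange(base, 0, -1)).astype(float)
    w /= w.sum()
    return np.convolve(runoff, w)[: len(runoff)]

# Hypothetical cell runoff series (mm/hr), for illustration only
runoff = np.array([0, 2, 8, 15, 10, 5, 2, 1, 0, 0], dtype=float)
print(muskingum(runoff).round(2))
print(maxbas(runoff).round(2))
```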


2018 ◽  
Vol 38 ◽  
pp. 02027
Author(s):  
Ma Lei ◽  
Zhang Nana ◽  
Zhang Zhongqiu

One of the key goals in the SEMI industry is to improve equipment throughput and to maximize equipment production efficiency. Based on SEMI standards for semiconductor equipment control, this paper defines the transition rules between different tool states and presents a TEA system model that automatically analyzes tool performance using a finite state machine. The system was applied to fab tools and its effectiveness was verified; it yielded the parameter values used to measure equipment performance, together with recommendations for improvement.
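The finite-state-machine idea can be sketched as follows. The states and allowed transitions below are illustrative, loosely following SEMI E10-style equipment states, and are not the paper's actual TEA rule set; the sketch replays a hypothetical event log and derives a simple time-in-state utilization metric.

```python
# Illustrative tool-state FSM with a time-in-state utilization metric.
# States and transition rules are assumptions, not the paper's TEA model.

from collections import defaultdict

ALLOWED = {
    "STANDBY": {"PRODUCTIVE", "ENGINEERING", "SCHEDULED_DOWN", "UNSCHEDULED_DOWN"},
    "PRODUCTIVE": {"STANDBY", "UNSCHEDULED_DOWN"},
    "ENGINEERING": {"STANDBY"},
    "SCHEDULED_DOWN": {"STANDBY"},
    "UNSCHEDULED_DOWN": {"STANDBY", "SCHEDULED_DOWN"},
}

class ToolStateMachine:
    def __init__(self, state="STANDBY", t0=0.0):
        self.state, self.t_last = state, t0
        self.time_in_state = defaultdict(float)

    def transition(self, new_state, t):
        if new_state not in ALLOWED[self.state]:
            raise ValueError(f"illegal transition {self.state} -> {new_state}")
        self.time_in_state[self.state] += t - self.t_last
        self.state, self.t_last = new_state, t

# Replay a hypothetical event log (state, timestamp in hours) and report utilization
fsm = ToolStateMachine()
for state, t in [("PRODUCTIVE", 1.0), ("STANDBY", 7.5), ("UNSCHEDULED_DOWN", 8.0),
                 ("STANDBY", 10.0), ("PRODUCTIVE", 10.5)]:
    fsm.transition(state, t)

total = sum(fsm.time_in_state.values())
print(f"productive fraction = {fsm.time_in_state['PRODUCTIVE'] / total:.2f}")
```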


2013 ◽  
Vol 117 (1) ◽  
pp. 185-204 ◽  
Author(s):  
Holger Pagel ◽  
Joachim Ingwersen ◽  
Christian Poll ◽  
Ellen Kandeler ◽  
Thilo Streck

2020 ◽  
Author(s):  
Peter D. Kvam ◽  
Jerome R Busemeyer ◽  
Timothy Joseph Pleskac

Contemporary theories of choice posit that decision making is a constructive process in which a decision maker uses information about the choice options to generate support for various decisions and judgments, then uses these decisions and judgments to reduce their uncertainty about their own preferences. Here we examine how these constructive processes unfold by tracking dynamic changes in preference strength. Across two experiments, we observed that mean preference strength oscillated over time and found that eliciting a choice strongly affected the pattern of oscillation. Preferences following choices oscillated between being stronger than those without prior choice (bolstering) and being weaker than those without choice (suppression). An open system model, merging epistemic uncertainty about how a person reacts to options and ontic uncertainty about how their preference is affected by choice, accounts for the oscillations resulting in both bolstering and suppression effects.
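As a purely illustrative sketch of what an open system model of this kind can produce, the toy below evolves a two-state preference under a Hamiltonian that drives oscillation plus a Lindblad-type dissipative term that damps it toward a steady preference. It is not the authors' fitted model, and every parameter value is invented; it only shows how coherent oscillation and damping coexist in such models.

```python
# Toy open-system (Lindblad) dynamics for a two-state preference.
# Illustrative only; all parameter values are invented.

import numpy as np

dt, steps = 0.01, 2000
H = np.array([[1.0, 0.6], [0.6, -1.0]])                 # drives coherent oscillation
L = np.sqrt(0.3) * np.array([[1.0, 0.0], [0.0, -1.0]])  # dephasing operator

rho = np.array([[0.5, 0.5], [0.5, 0.5]], dtype=complex)  # initial superposed preference

def lindblad_step(rho):
    """One Euler step of d(rho)/dt = -i[H, rho] + L rho L' - 0.5 {L'L, rho}."""
    comm = -1j * (H @ rho - rho @ H)
    diss = L @ rho @ L.conj().T - 0.5 * (L.conj().T @ L @ rho + rho @ L.conj().T @ L)
    return rho + dt * (comm + diss)

pref_A = []
for _ in range(steps):
    rho = lindblad_step(rho)
    pref_A.append(rho[0, 0].real)   # probability of preferring option A

print(f"preference for A oscillates between {min(pref_A):.2f} and {max(pref_A):.2f}, "
      f"ending near {pref_A[-1]:.2f}")
```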


Author(s):  
Z. H. Aliyu ◽  
B. Sani

In this study, we developed an inventory system model under two-level trade credit in which the supplier considers the retailer a credit risk but the retailer considers the customers creditworthy. The retailer is therefore given a trade credit period on a proportion of the goods ordered whenever he or she pays for a proportion of the goods immediately after delivery. In the same vein, the retailer passes the same grace period on to the customers, but without attaching any condition, since the customers are assumed creditworthy. This partial upstream trade credit is offered to reduce the risk of payment failure in the business transaction, especially since most retailers place bulk orders. The relevant cost functions are determined and a numerical example is given. A sensitivity analysis was carried out to examine the effect of changes in the parameters on the optimal solution of the model.
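The kind of numerical sensitivity analysis described can be sketched generically. The cost function below is a hypothetical EOQ-with-trade-credit form, not the paper's actual model; the sketch minimizes cost over the cycle length and then perturbs one parameter to show how the optimum shifts.

```python
# Hypothetical sketch: minimise a generic trade-credit inventory cost over the
# cycle length T, then perturb one parameter to observe the shift in the optimum.
# The cost structure and all values are assumptions, not the paper's model.

import numpy as np

def annual_cost(T, D=1000.0, A=50.0, h=2.0, c=10.0, Ic=0.12, Ie=0.08, M=0.1, alpha=0.6):
    """Generic cost per unit time: ordering + holding + interest charged - interest earned.
    alpha = fraction of the order covered by the credit period M (in years)."""
    ordering = A / T
    holding = h * D * T / 2.0
    # interest charged on stock financed beyond the credit period (rough form)
    charged = Ic * c * D * alpha * max(T - M, 0.0) ** 2 / (2.0 * T)
    # interest earned on revenue accumulated during the credit period (rough form)
    earned = Ie * c * D * min(M, T) ** 2 / (2.0 * T)
    return ordering + holding + charged - earned

def best_T(**kwargs):
    Ts = np.linspace(0.01, 1.0, 2000)
    costs = [annual_cost(T, **kwargs) for T in Ts]
    i = int(np.argmin(costs))
    return Ts[i], costs[i]

base = best_T()
higher_holding = best_T(h=3.0)
print(f"base:              T* = {base[0]:.3f}, cost = {base[1]:.1f}")
print(f"holding cost +50%: T* = {higher_holding[0]:.3f}, cost = {higher_holding[1]:.1f}")
```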


Author(s):  
Mathew Bussière ◽  
Mark Stephens ◽  
Marzie Derakhshesh ◽  
Yue Cheng ◽  
Lorne Daniels

Abstract A better understanding of the sensitivity threshold of external leak detection systems can assist pipeline operators in predicting detection performance for a range of possible leak scenarios, thereby helping them to make more informed decisions regarding procurement and deployment of such systems. The analysis approach described herein was developed to characterize the leak detection sensitivity of select fiber optic cable-based systems that employ Distributed Acoustic Sensing (DAS). The detection sensitivity analysis consisted of two steps. The first step involved identifying a suitable release parameter capable of providing a defensible basis for defining detection sensitivity; the second step involved the application of logistic regression analysis to characterize detection sensitivity as a function of the chosen release parameter. The detection sensitivity analysis described herein provides a means by which to quantitatively determine the leak detection sensitivity threshold for each technology and sensor deployment position evaluated in a set of full-scale tests. The chosen sensitivity threshold measure was the release parameter value associated with release events having a 90% probability of being detected. Thresholds associated with a higher probability level of 95% were also established for comparison purposes. The calculated sensitivity thresholds can be interpreted to mean that release events associated with release parameter values above the sensitivity threshold have a very high likelihood (either 90 or 95%) of being detected.
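The two-step analysis described, fitting a logistic regression of detection outcome against a release parameter and then inverting it at a target probability, can be sketched as follows. The data in the snippet are invented for illustration and are not the paper's full-scale test results.

```python
# Sketch of the two-step approach: logistic regression of detection outcome
# against a release parameter, then inversion at the 90% / 95% probability levels.
# The release values and outcomes below are invented.

import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical release parameter values (e.g. release rate) and detection outcomes
release = np.array([0.2, 0.4, 0.5, 0.8, 1.0, 1.2, 1.5, 1.8, 2.0, 2.5,
                    3.0, 3.5, 4.0, 4.5, 5.0, 5.5, 6.0, 7.0, 8.0, 9.0])
detected = np.array([0, 0, 0, 0, 0, 1, 0, 1, 0, 1,
                     1, 1, 1, 1, 1, 1, 1, 1, 1, 1])

model = LogisticRegression().fit(release.reshape(-1, 1), detected)
b0, b1 = model.intercept_[0], model.coef_[0, 0]

def threshold(p):
    """Release parameter value at which the fitted detection probability equals p."""
    return (np.log(p / (1.0 - p)) - b0) / b1

print(f"90% detection threshold: {threshold(0.90):.2f}")
print(f"95% detection threshold: {threshold(0.95):.2f}")
```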

