3D Numerical Temperature Model Development and Calibration for Lakes and Reservoirs: A Case Study

Author(s):  
Hussein A. M. Al-Zubaidi ◽  
Scott A. Wells
Sensors ◽  
2021 ◽  
Vol 21 (4) ◽  
pp. 1044


Author(s):  
Yassine Bouabdallaoui ◽  
Zoubeir Lafhaj ◽  
Pascal Yim ◽  
Laure Ducoulombier ◽  
Belkacem Bennadji

The operation and maintenance of buildings have seen several advances in recent years. Multiple information and communication technology (ICT) solutions have been introduced to better manage building maintenance. However, maintenance practices in buildings remain inefficient and lead to significant energy waste. In this paper, a predictive maintenance framework based on machine learning techniques is proposed. This framework aims to provide guidelines for implementing predictive maintenance for building installations. The framework is organised into five steps: data collection, data processing, model development, fault notification and model improvement. A sports facility was selected as a case study to demonstrate the framework. Data were collected from different heating, ventilation and air conditioning (HVAC) installations using Internet of Things (IoT) devices and a building automation system (BAS). Then, a deep learning model was used to predict failures. The case study showed the potential of this framework to predict failures. However, multiple obstacles and barriers were observed related to data availability and feedback collection. The overall results of this paper can help provide guidelines for scientists and practitioners implementing predictive maintenance approaches in buildings.
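The five-step loop described in the abstract can be sketched minimally as follows. The paper uses a deep learning model; here a simple z-score anomaly detector stands in as a placeholder failure predictor, and all sensor values, function names and thresholds are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch of the five-step framework: data collection, data
# processing, model development, fault notification, model improvement.
# A z-score detector is a hypothetical stand-in for the deep learning model.

def process(readings):
    """Step 2: clean raw sensor data (drop missing values)."""
    return [r for r in readings if r is not None]

def fit_model(readings):
    """Step 3: 'train' a trivial model -- mean and standard deviation."""
    n = len(readings)
    mean = sum(readings) / n
    var = sum((r - mean) ** 2 for r in readings) / n
    return mean, var ** 0.5

def notify_faults(readings, model, z_threshold=2.0):
    """Step 4: flag readings whose z-score exceeds the threshold."""
    mean, std = model
    return [i for i, r in enumerate(readings)
            if std > 0 and abs(r - mean) / std > z_threshold]

# Steps 1 and 5 (IoT/BAS collection, feedback-driven improvement) are
# represented here only by the input list and by re-fitting on new data.
raw = [20.1, 19.8, None, 20.3, 20.0, 35.0, 20.2]  # one faulty spike
clean = process(raw)
model = fit_model(clean)
faults = notify_faults(clean, model)
print(faults)
```

In a real deployment, step 5 would feed technician feedback on each flagged fault back into the next training cycle.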


World ◽  
2020 ◽  
Vol 1 (3) ◽  
pp. 205-215
Author(s):  
Joshua Mullenite

In this article, I review a cross-section of research in socio-hydrology from across disciplines in order to better understand the current role of historical-archival analysis in the development of socio-hydrological scholarship. I argue that despite their widespread use in environmental history, science and technology studies, anthropology, and human geography, archival methods are currently underutilized in socio-hydrological scholarship more broadly, particularly in the development of socio-hydrological models. Drawing on archival research conducted in relation to the socio-hydrology of coastal Guyana, I demonstrate the ways in which such scholarship can be readily incorporated into model development.


Author(s):  
Michael Gorelik ◽  
Jacob Obayomi ◽  
Jack Slovisky ◽  
Dan Frias ◽  
Howie Swanson ◽  
...  

While turbine engine Original Equipment Manufacturers (OEMs) have accumulated significant experience in the application of probabilistic methods (PM) and uncertainty quantification (UQ) to specific technical disciplines and engine components, experience with system-level PM applications has been limited. To demonstrate the feasibility and benefits of an integrated PM-based system, a numerical case study has been developed around the Honeywell turbine engine application. The case study uses experimental observations of engine performance, such as horsepower and fuel flow, from a population of engines. Due to manufacturing variability, there are unit-to-unit and supplier-to-supplier variations in compressor blade geometry. Blade inspection data are available for the characterization of these geometric variations, and CFD analysis can be linked to the engine performance model, so that the effect of blade geometry variation on system-level performance characteristics can be quantified. Other elements of the case study included the use of engine performance and blade geometry data to perform Bayesian updating of the model inputs, such as efficiency adders and turbine tip clearances. A probabilistic engine performance model was developed, a system-level sensitivity analysis was performed, and the predicted distribution of engine performance metrics was calibrated against the observed distributions. This paper describes the model development approach and key simulation results. The benefits of using PM and UQ methods in the system-level framework are discussed. This case study was developed under Defense Advanced Research Projects Agency (DARPA) funding, which is gratefully acknowledged.
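The Bayesian-updating step mentioned above can be illustrated with a minimal grid-based update: a scalar "efficiency adder" gets a prior, and observed horsepower data update it. The linear performance model, the prior, the noise level and the data are all invented for illustration; they are not Honeywell's actual model or data.

```python
# Grid-based Bayesian update of a single model input (an efficiency adder)
# from noisy horsepower observations: posterior ∝ prior × likelihood.
import math

def posterior_on_grid(grid, prior, observations, model, sigma):
    """Evaluate the (normalised) posterior over a parameter grid."""
    post = []
    for theta, p in zip(grid, prior):
        # Gaussian log-likelihood of all observations given theta
        loglik = sum(-0.5 * ((y - model(theta)) / sigma) ** 2
                     for y in observations)
        post.append(p * math.exp(loglik))
    z = sum(post)
    return [p / z for p in post]

# Illustrative performance model: horsepower = 100 * (0.9 + adder)
model = lambda adder: 100.0 * (0.9 + adder)
grid = [i / 100.0 for i in range(-5, 6)]      # adder in [-0.05, 0.05]
prior = [1.0 / len(grid)] * len(grid)         # flat prior
observed_hp = [92.1, 91.8, 92.4]              # made-up population data
post = posterior_on_grid(grid, prior, observed_hp, model, sigma=1.0)
best = grid[post.index(max(post))]
print(best)
```

The same machinery extends to several inputs at once (e.g. tip clearances), at the cost of a multi-dimensional grid or a sampling method such as MCMC.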


2015 ◽  
Vol 8 (10) ◽  
pp. 3441-3470 ◽  
Author(s):  
J. A. Bradley ◽  
A. M. Anesio ◽  
J. S. Singarayer ◽  
M. R. Heath ◽  
S. Arndt

Abstract. SHIMMER (Soil biogeocHemIcal Model for Microbial Ecosystem Response) is a new numerical modelling framework designed to simulate microbial dynamics and biogeochemical cycling during initial ecosystem development in glacier forefield soils. However, it is also transferable to other extreme ecosystem types (such as desert soils or the surface of glaciers). The rationale for model development arises from decades of empirical observations in glacier forefields, and enables a quantitative and process-focused approach. Here, we provide a detailed description of SHIMMER, test its performance in two case study forefields, the Damma Glacier (Switzerland) and the Athabasca Glacier (Canada), and analyse its sensitivity to identify the most sensitive and unconstrained model parameters. Results show that the accumulation of microbial biomass is highly dependent on variation in microbial growth and death rate constants, Q10 values, the active fraction of microbial biomass and the reactivity of organic matter. The model correctly predicts the rapid accumulation of microbial biomass observed during the initial stages of succession in the forefields of both case study systems. Primary production is responsible for the initial build-up of labile substrate that subsequently supports heterotrophic growth. However, allochthonous contributions of organic matter, and nitrogen fixation, are important in sustaining this productivity. The development and application of SHIMMER also highlights aspects of these systems that require further empirical research: quantifying nutrient budgets and biogeochemical rates, and exploring seasonality, microbial growth and cell death. This will lead to increased understanding of how glacier forefields contribute to global biogeochemical cycling and climate under future ice retreat.
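The sensitivity to growth and death rate constants and to Q10 can be seen in a toy biomass equation of the kind such models are built from. This is a cartoon of SHIMMER's structure under invented parameter values, not the model itself.

```python
# Toy biomass dynamics: dB/dt = (mu * f_T - d) * B, where f_T is the
# standard Q10 temperature scaling of the growth rate. All values invented.

def q10_factor(temp_c, q10, ref_temp_c=10.0):
    """Q10 scaling: the rate multiplier per 10 °C above the reference."""
    return q10 ** ((temp_c - ref_temp_c) / 10.0)

def simulate_biomass(b0, mu, d, q10, temp_c, days, dt=0.1):
    """Forward-Euler integration of dB/dt = (mu * f_T - d) * B."""
    b = b0
    for _ in range(int(days / dt)):
        b += (mu * q10_factor(temp_c, q10) - d) * b * dt
    return b

# Identical growth/death constants, two Q10 values: the higher Q10 yields
# faster accumulation at a warm site, mirroring the reported sensitivity.
b_low = simulate_biomass(b0=1.0, mu=0.1, d=0.02, q10=1.5, temp_c=15.0, days=30)
b_high = simulate_biomass(b0=1.0, mu=0.1, d=0.02, q10=3.0, temp_c=15.0, days=30)
print(b_low < b_high)
```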


Risks ◽  
2021 ◽  
Vol 9 (11) ◽  
pp. 204
Author(s):  
Chamay Kruger ◽  
Willem Daniel Schutte ◽  
Tanja Verster

This paper proposes a methodology that utilises model performance as a metric to assess the representativeness of external or pooled data when it is used by banks in regulatory model development and calibration. There is currently no formal methodology to assess representativeness. The paper provides a review of existing regulatory literature on the requirements of assessing representativeness and emphasises that both qualitative and quantitative aspects need to be considered. We present a novel methodology and apply it to two case studies, and we compare our methodology with the Multivariate Prediction Accuracy Index. The first case study investigates whether a pooled data source from Global Credit Data (GCD) is representative when considering the enrichment of internal data with pooled data in the development of a regulatory loss given default (LGD) model. The second case study differs from the first by illustrating which other countries in the pooled data set could be representative when enriching internal data during the development of an LGD model. Using these case studies as examples, our proposed methodology provides users with a generalised framework to identify subsets of the external data that are representative of their country's or bank's data, making the results general and universally applicable.
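The core idea of using model performance as a representativeness metric can be sketched as follows: if a model developed on pooled data performs about as well on the bank's internal data as on the pooled data itself, the pooled data are treated as representative. The one-feature threshold "model", the data and the tolerance are illustrative assumptions, not the paper's actual methodology.

```python
# Representativeness via performance comparison: develop on pooled data,
# compare accuracy on pooled vs. internal data; a small gap suggests the
# pooled data are representative of the internal portfolio.

def fit_threshold(data):
    """'Train' on (feature, label) pairs: use the mean feature as cutoff."""
    return sum(x for x, _ in data) / len(data)

def accuracy(data, cutoff):
    return sum((x > cutoff) == y for x, y in data) / len(data)

def is_representative(pooled, internal, tolerance=0.10):
    """Flag the pooled data as representative if the gap is within tolerance."""
    cutoff = fit_threshold(pooled)
    return abs(accuracy(pooled, cutoff) - accuracy(internal, cutoff)) <= tolerance

pooled = [(0.2, False), (0.4, False), (0.6, True), (0.8, True)]    # external
internal = [(0.3, False), (0.7, True), (0.9, True), (0.1, False)]  # bank's own
print(is_representative(pooled, internal))
```

The same comparison can be run per country subset of the pooled data, as in the second case study, to decide which subsets to include when enriching internal data.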


2010 ◽  
Vol 2 (2) ◽  
pp. 38-51 ◽  
Author(s):  
Marc Halbrügge

Keep it simple - A case study of model development in the context of the Dynamic Stocks and Flows (DSF) task

This paper describes the creation of a cognitive model submitted to the 'Dynamic Stocks and Flows' (DSF) modeling challenge. This challenge aims at comparing computational cognitive models of human behavior during an open-ended control task. Participants in the modeling competition were provided with a simulation environment and training data for benchmarking their models, while the actual specification of the competition task was withheld. To meet this challenge, the cognitive model described here was designed and optimized for generalizability. Only two simple assumptions about human problem solving were used to explain the empirical findings in the training data. In-depth analysis of the data set prior to the development of the model led to the dismissal of correlations and other parametric statistics as goodness-of-fit indicators. A new statistical measurement based on rank orders and sequence matching techniques is proposed instead. This measurement, when applied to the human sample, also identifies clusters of subjects that use different strategies for the task. The acceptability of the fits achieved by the model is verified using permutation tests.
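A rank-order fit statistic checked by a permutation test can be sketched like this. The Spearman footrule distance and the toy sequences below are illustrative stand-ins for the paper's actual measurement and data.

```python
# Rank-based goodness of fit: compare model and human output sequences by
# the ranks of their values, then test the fit against random shuffles.
import random

def ranks(seq):
    """Rank of each element within its sequence (0 = smallest)."""
    order = sorted(range(len(seq)), key=lambda i: seq[i])
    r = [0] * len(seq)
    for rank, i in enumerate(order):
        r[i] = rank
    return r

def footrule(a, b):
    """Spearman footrule: total rank displacement (0 = identical order)."""
    return sum(abs(x - y) for x, y in zip(ranks(a), ranks(b)))

def permutation_p(model_out, human_out, n_perm=2000, seed=0):
    """Fraction of shuffled model outputs that fit at least as well."""
    rng = random.Random(seed)
    observed = footrule(model_out, human_out)
    shuffled = list(model_out)
    hits = 0
    for _ in range(n_perm):
        rng.shuffle(shuffled)
        if footrule(shuffled, human_out) <= observed:
            hits += 1
    return hits / n_perm

human = [1.0, 2.0, 3.0, 5.0, 4.0, 6.0, 7.0, 8.0]
model = [1.1, 2.2, 2.9, 5.1, 4.2, 6.3, 7.2, 8.4]   # same rank order as human
print(permutation_p(model, human))
```

A small p-value here means the model's fit is far better than chance, which is the sense in which permutation tests verify the "acceptability" of a fit.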


2014 ◽  
Author(s):  
Oluwaseun Kunle Oyebode

Streamflow modelling remains crucial to decision-making, especially when it concerns the planning and management of water resources systems in water-stressed regions. This study proposes a suitable method for streamflow modelling irrespective of the limited availability of historical datasets. Two data-driven modelling techniques were applied comparatively to achieve this aim. Genetic programming (GP), an evolutionary algorithm approach, and a differential evolution (DE)-trained artificial neural network (ANN) were used for streamflow prediction in the upper Mkomazi River, South Africa. Historical records of streamflow and meteorological variables for a 19-year period (1994-2012) were used for model development and in the selection of predictor variables into the input vector space of the models. In both approaches, individual monthly predictive models were developed for each month of the year using a 1-year lead time. Two case studies were considered in the development of the ANN models. Case study 1 involved the use of correlation analysis in selecting input variables, as employed during GP model development, while the DE algorithm was used for training and optimizing the model parameters. In case study 2, however, genetic programming was incorporated as a screening tool for determining the dimensionality of the ANN models, while the learning process was further fine-tuned by subjecting the DE algorithm to sensitivity analysis. Altogether, the performance of the three sets of predictive models was evaluated comparatively using three statistical measures, namely Mean Absolute Percent Error (MAPE), Root Mean-Squared Error (RMSE) and the coefficient of determination (R2). Results showed better predictive performance by the GP models during both the training and validation phases when compared with the ANNs.
Although the ANN models developed in case study 1 gave satisfactory results during the training phase, they were unable to replicate those results during the validation phase. Results from case study 1 were considerably influenced by overfitting and memorization, which are typical of ANNs trained on small datasets. However, results from case study 2 showed great improvement across the three evaluation criteria, as the overfitting and memorization problems were significantly reduced, leading to improved accuracy in the predictions of the ANN models. It was concluded that the conjunctive use of the two evolutionary computation methods (GP and DE) can improve the performance of artificial neural network models, especially when the availability of data is limited. In addition, the GP models can be deployed as predictive tools for the planning and management of water resources within the Mkomazi region and KwaZulu-Natal province as a whole.
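The three evaluation measures named above follow directly from their standard definitions; the small streamflow series below is made up for illustration.

```python
# Standard definitions of the three evaluation measures: MAPE, RMSE and
# the coefficient of determination (R2).
import math

def mape(obs, pred):
    """Mean Absolute Percent Error (observations must be nonzero)."""
    return 100.0 * sum(abs((o - p) / o) for o, p in zip(obs, pred)) / len(obs)

def rmse(obs, pred):
    """Root Mean-Squared Error."""
    return math.sqrt(sum((o - p) ** 2 for o, p in zip(obs, pred)) / len(obs))

def r_squared(obs, pred):
    """Coefficient of determination: 1 - SS_res / SS_tot."""
    mean_obs = sum(obs) / len(obs)
    ss_res = sum((o - p) ** 2 for o, p in zip(obs, pred))
    ss_tot = sum((o - mean_obs) ** 2 for o in obs)
    return 1.0 - ss_res / ss_tot

observed = [10.0, 12.0, 8.0, 15.0]    # monthly flows (illustrative units)
predicted = [9.5, 12.5, 8.2, 14.0]
print(round(rmse(observed, predicted), 3))
```

Low MAPE and RMSE with R2 near 1 during validation, not just training, is what distinguishes the case study 2 models from the overfitted case study 1 models.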


2007 ◽  
Vol 30 (2) ◽  
pp. 115-124 ◽  
Author(s):  
Sarah J. Fielden ◽  
Melanie L. Rusch ◽  
Mambo Tabu Masinda ◽  
Jim Sands ◽  
Jim Frankish ◽  
...  
