Calibrating a watershed simulation model involving human interference: an application of multi-objective genetic algorithms

2008 ◽  
Vol 10 (1) ◽  
pp. 97-111 ◽  
Author(s):  
Mohamad I. Hejazi ◽  
Ximing Cai ◽  
Deva K. Borah

We calibrate a storm-event distributed hydrologic model to a watershed, in which runoff is significantly affected by reservoir storage and release, using a multi-objective genetic algorithm (NSGA-II). This paper addresses the following questions: What forms of the objective (fitness) function used in the optimization model will result in a better calibration? How does the error in reservoir release caused by neglected human interference or an imprecise storage–release function affect the calibration? Reservoir release is studied as a specific (and common) form of human interference. Two procedures for handling reservoir releases are tested and compared: (1) treating reservoir releases as solely determined by the hydraulic structure (predefined storage–discharge or stage–discharge relations), as if these were perfect, a procedure usually adopted in watershed model calibration; or (2) adding to the releases determined by the storage–discharge relation an error term, which encompasses time-variant human interference and discharge-function error and is determined through an optimization-based calibration procedure. It is found that the calibration procedure that accounts for human interference not only results in a better match between the modeled and observed hydrographs, but also in more reasonable model parameters in terms of their spatial distribution and the robustness of the parameter values.
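To make the two procedures concrete, here is a minimal Python sketch (all names are hypothetical; the paper publishes no code): procedure (1) trusts the predefined storage–discharge relation, while procedure (2) adds a calibrated, time-variant error term whose values become extra decision variables in the NSGA-II search.

```python
import numpy as np

def release_fixed(storage, f_release):
    """Procedure (1): release taken solely from the predefined
    storage-discharge relation f_release, assumed error-free."""
    return f_release(storage)

def release_with_error(storage, f_release, eps_t):
    """Procedure (2): the storage-discharge relation plus a time-variant
    error term eps_t lumping human interference and discharge-function
    error; the eps_t values are calibrated along with the model."""
    return f_release(storage) + eps_t

# Hypothetical usage: eps becomes extra genes on the NSGA-II chromosome
f = lambda s: 0.05 * s                 # illustrative linear rating curve
eps = np.zeros(24)                     # one error term per time step
flows = [release_with_error(1e6, f, e) for e in eps]
```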

Regression testing is one of the most critical activities in software product verification. Nevertheless, resource and time constraints can inhibit the execution of a full regression test suite, leaving uncertainty about which test cases to run to preserve the high quality of software products. Different techniques can be applied to prioritize test cases in resource-constrained environments, such as manual selection, automated selection, or hybrid approaches. Different Multi-Objective Evolutionary Algorithms (MOEAs) have been used in this domain to find an optimal solution that minimizes the cost of executing a regression test suite while obtaining maximum fault detection coverage, as if the entire test suite were executed. MOEAs achieve this by selecting a set of test cases and determining the order of their execution. In this paper, three Multi-Objective Evolutionary Algorithms, namely NSGA-II, IBEA, and MoCell, are used to solve test case prioritization problems using the fault detection rate and branch coverage of each test case. The paper investigates which algorithm is the most effective for test case prioritization problems, which is the most efficient, and whether changing the fitness function changes the results. Our experiment revealed that NSGA-II is the most effective and efficient MOEA; moreover, we found that changing the fitness function caused a significant reduction in evolution time, although it did not affect the coverage metric.
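As an illustration of the objective pair such a MOEA might evaluate, the following hedged sketch scores a test ordering by a standard fault-detection-rate measure (APFD) and total execution cost; the data layout and helper names are assumptions, not the paper's implementation.

```python
def apfd(order, faults):
    """Average Percentage of Faults Detected for a test ordering.
    faults[t] is the set of faults test t reveals (hypothetical data)."""
    n = len(order)
    m = len(set().union(*faults))
    first = {}
    for pos, t in enumerate(order, start=1):
        for f in faults[t]:
            first.setdefault(f, pos)   # position where each fault is first found
    return 1 - sum(first.values()) / (n * m) + 1 / (2 * n)

def objectives(order, faults, cost):
    """Objective pair for the MOEA: fault detection rate (negated, since
    the solver minimizes) and total execution cost; a branch coverage
    measure could be swapped in for the fault data."""
    return -apfd(order, faults), sum(cost[t] for t in order)

# Illustrative data: four tests, the faults each one reveals, and costs
faults = [{0, 1}, {1}, {2}, {0, 2}]
cost = [3.0, 1.0, 2.0, 2.5]
print(objectives([3, 0, 1, 2], faults, cost))
```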


2016 ◽  
Vol 75 (4) ◽  
pp. 823-832 ◽  
Author(s):  
Farhad Hooshyaripor ◽  
Jafar Yazdi

This research presents a simulation-optimization model for urban flood mitigation that integrates the Non-dominated Sorting Genetic Algorithm (NSGA-II) with the Storm Water Management Model (SWMM) hydraulic model and a curve-number-based hydrologic model of low impact development technologies in Gonbad-e-Kavus, a small city in the north of Iran. In the developed model, the best performance of the system relies on the optimal layout and capacity of retention ponds over the study area so as to reduce surcharge from the manholes under a set of storm event loads, while the available investment plays a restricting role. This yields a multi-objective optimization problem with two conflicting objectives, solved successfully by NSGA-II to find a set of optimal solutions known as the Pareto front. In order to analyze the results, a new factor, the investment priority index (IPI), is defined, which shows the risk of surcharging over the network and the priority of mitigation actions. The IPI is calculated using the probability of pond selection at candidate locations and the average depth of the ponds over all Pareto-front solutions. The IPI can help decision makers arrange a long-term progressive plan that prioritizes high-risk areas once an optimal solution has been selected.
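The abstract does not give the exact IPI formula; the sketch below assumes a simple product of each site's selection probability and its average optimized pond depth across the Pareto front, purely for illustration.

```python
import numpy as np

def investment_priority_index(pareto_depths):
    """Hypothetical IPI sketch: pareto_depths is an (n_solutions x
    n_candidate_sites) array of optimized pond depths, 0.0 where no
    pond was selected.  It combines how often a site is selected across
    the Pareto front with the average depth when it is selected."""
    selected = pareto_depths > 0
    p_select = selected.mean(axis=0)           # selection probability per site
    with np.errstate(invalid="ignore", divide="ignore"):
        mean_depth = np.where(selected.any(axis=0),
                              pareto_depths.sum(axis=0) / selected.sum(axis=0),
                              0.0)
    return p_select * mean_depth

depths = np.array([[1.2, 0.0, 0.8],
                   [1.0, 0.5, 0.0],
                   [1.4, 0.0, 0.9]])
print(investment_priority_index(depths))       # one IPI value per site
```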


2021 ◽  
Author(s):  
Ikechukwu Chibueze ◽  
Chizoba Obele ◽  
Chidozie Nwobi-Okoye ◽  
Clement Atuanya

Abstract. The development of mathematical models for predicting the properties of materials is often complex and cumbersome. This led to the advent of simpler, and often more accurate, computational models based on artificial intelligence for predicting material properties. The aim of this study is to predict the mechanical properties of a newly developed hybrid composite material made with sponge gourd, bagasse, and epoxy resin for golf club applications using fuzzy logic (FL), and to carry out a multi-objective optimization of the properties with a modified desirability function (DF) and the NSGA-II algorithm. The inputs were %Wt of bagasse, %Wt of sponge gourd, and fiber size (µm), while the response variables were tensile strength, hardness, flexural strength, modulus, elongation, and impact strength. The FL model was separately coupled, as the fitness function, with the modified DF algorithm and the NSGA-II algorithm. The DF was optimized with a particle swarm optimization (PSO) algorithm. The results showed that the FL model predicted the mechanical properties accurately; the minimum correlation coefficient (R) between the experimental responses and the FL predictions was 0.9529. The modified algorithms took care of certain peculiarities in the desirability of properties such as elongation, whose desirability is constant over a range. The optimized properties were found to be worse if the optimization algorithms were not modified.
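A hedged sketch of the kind of modification described, assuming a trapezoidal ("plateau") desirability for responses such as elongation, together with the usual geometric-mean aggregation handed to PSO; none of this is the authors' exact formulation.

```python
import numpy as np

def desirability_larger_is_better(y, lo, hi):
    """Standard one-sided desirability: 0 below lo, 1 above hi."""
    return np.clip((np.asarray(y, dtype=float) - lo) / (hi - lo), 0.0, 1.0)

def desirability_plateau(y, lo, plateau_lo, plateau_hi, hi):
    """Hypothetical modified desirability for responses whose desirability
    is constant over a range: ramps up to 1 on [plateau_lo, plateau_hi]
    and back down outside it."""
    y = np.asarray(y, dtype=float)
    up = np.clip((y - lo) / (plateau_lo - lo), 0.0, 1.0)
    down = np.clip((hi - y) / (hi - plateau_hi), 0.0, 1.0)
    return np.minimum(up, down)

def overall_desirability(d_values):
    """Geometric mean of the individual desirabilities, the usual
    scalar objective handed to an optimizer such as PSO."""
    d = np.asarray(d_values, dtype=float)
    return d.prod() ** (1.0 / d.size)

print(desirability_plateau([2.0, 4.0, 7.0], 1.0, 3.0, 5.0, 8.0))  # [0.5 1. 0.33]
```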


2012 ◽  
Vol 20 (4) ◽  
pp. 35-43 ◽  
Author(s):  
Peter Valent ◽  
Ján Szolgay ◽  
Carlo Riverso

Abstract. Most studies that assess the performance of various calibration techniques have to deal with a certain amount of uncertainty in the calibration data. In this study we tested HBV model calibration procedures in hypothetically ideal conditions under the assumption of no errors in the measured data. This was achieved by creating an artificial time series of flows with the HBV model, using the parameters obtained from calibrating the model against the measured flows. The artificial flows then replaced the original flows in the calibration data, which were used to test how well the calibration procedures could reproduce known model parameters. The results showed that in one hundred independent calibration runs of the HBV model, we did not manage to obtain parameters identical to those used to create the artificial flow data; a certain degree of uncertainty always remained. Although the calibration procedure of the model works properly from a practical point of view, this can be regarded as a demonstration of the equifinality principle, since several parameter sets were obtained that led to equally acceptable, or behavioural, representations of the observed flows. The study demonstrated how this concept for assessing the uncertainty of hydrological predictions can be applied, using artificially generated data, in the further development of a model or in the choice of a calibration method.
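The experimental design can be summarized in a few lines of Python. The helpers `model` and `calibrate` are hypothetical stand-ins for an HBV run and a randomly initialized calibration routine.

```python
import numpy as np

def synthetic_calibration_test(model, calibrate, q_obs, n_runs=100):
    """Sketch of the experiment: model(params) returns a simulated flow
    series, calibrate(q) fits parameters to flows q from a random
    starting point.  Error-free artificial flows replace the
    observations, and repeated calibrations test whether the known
    parameters are recovered (an equifinality check)."""
    theta_true = calibrate(q_obs)            # parameters behind the "truth"
    q_artificial = model(theta_true)         # synthetic, error-free flows
    recovered = np.array([calibrate(q_artificial) for _ in range(n_runs)])
    spread = recovered.std(axis=0)           # nonzero spread => equifinality
    return theta_true, recovered, spread
```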


2021 ◽  
Author(s):  
Jared Smith ◽  
Laurence Lin ◽  
Julianne Quinn ◽  
Lawrence Band

Urban land expansion is expected for our changing world and, if unmitigated, will result in increased flooding and nutrient exports that already wreak havoc on the wellbeing of coupled human-natural systems worldwide. Reforestation of urbanized catchments is one green infrastructure strategy to reduce stormwater volumes and nutrient exports. Reforestation designs must balance the benefits of flood flow reduction against the costs of implementation and the chance of exacerbating droughts via reduction in the recharge that supplies low flows. Optimal locations and numbers of trees depend on the spatial distribution of runoff and streamflow in a catchment; however, calibration data are often only available at the catchment outlet. Equifinal model parameterizations for the outlet can result in uncertainty in the locations and magnitudes of streamflows across the catchment, which can lead to different optimal reforestation designs for different parameterizations.

Multi-objective robust optimization (MORO) has been proposed to discover reforestation designs that are robust to such parametric model uncertainty. However, it has not been shown that this actually results in better decisions than optimizing to a single, most likely parameter set, which would be less computationally expensive. In this work, the utility of MORO is assessed by comparing reforestation designs optimized using these two approaches with reforestation designs optimized to a synthetic true set of hydrologic model parameters. The spatially distributed RHESSys ecohydrological model is employed for this study of a suburban-forested catchment in Baltimore County, Maryland, USA. Calibration of the model's critical parameters is completed using a Bayesian framework to estimate the joint posterior distribution of the parameters. The Bayesian framework estimates the probability that different parameterizations generated the synthetic streamflow data, allowing the MORO process to evaluate reforestation portfolios across a probability-weighted sample of parameter sets in search of solutions that are robust to this uncertainty.

Reforestation portfolios are designed to minimize flooding, low flow intensity, and construction costs (number of trees). Comparing the Pareto front obtained from using MORO with the Pareto fronts obtained from optimizing to the estimated maximum a posteriori (MAP) parameter set and the synthetic true parameter set, we find that MORO solutions are closer to the synthetic solutions than are MAP solutions. This illustrates the value of considering parametric uncertainty in designing robust water systems despite the additional computational cost.
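A minimal sketch of how the probability-weighted MORO evaluation might look, with `simulate` a hypothetical stand-in for a RHESSys run; the weighting scheme here is an assumption based on the description above, not the authors' code.

```python
import numpy as np

def robust_objectives(design, param_sets, weights, simulate):
    """simulate(design, params) returns (flood, low_flow_intensity) for
    one parameter set; weights are posterior probabilities from the
    Bayesian calibration.  Each hydrologic objective is a
    probability-weighted expectation over the parameter sample, plus
    the deterministic cost term."""
    w = np.asarray(weights) / np.sum(weights)
    outcomes = np.array([simulate(design, p) for p in param_sets])
    flood, low_flow = (outcomes * w[:, None]).sum(axis=0)
    cost = float(np.sum(design))            # e.g. number of trees planted
    return flood, low_flow, cost
```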


2009 ◽  
Vol 13 (11) ◽  
pp. 2137-2149 ◽  
Author(s):  
M. Shafii ◽  
F. De Smedt

Abstract. A multi-objective genetic algorithm, NSGA-II, is applied to calibrate a distributed hydrological model (WetSpa) for the prediction of river discharges. The goals of this study are (i) to analyse the applicability of a multi-objective approach to WetSpa calibration instead of the traditional approach, i.e. the Parameter ESTimator software (PEST), and (ii) to assess the identifiability of the model parameters. The objective functions considered are the model efficiency (Nash-Sutcliffe criterion), known to be biased towards high flows, and the model efficiency for logarithmically transformed discharges, which emphasizes low-flow values. For the multi-objective approach, Pareto-optimal parameter sets are derived, whereas for the single-objective formulation, PEST is applied to give optimal parameter sets. The two approaches are evaluated by applying the WetSpa model to predict daily discharges in the Hornad River (Slovakia) over a 10-year period (1991–2000). The results reveal that NSGA-II performs well in locating Pareto-optimal solutions in the parameter search space. Furthermore, an identifiability analysis of the WetSpa model parameters shows that most parameters are well identifiable. However, in order to perform an appropriate model evaluation, more effort should be focused on improving calibration concepts and on defining robust methods to quantify the different sources of uncertainty involved in the calibration procedure.
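The two objective functions are standard and easy to state in code; the following sketch computes the Nash-Sutcliffe efficiency on raw and log-transformed discharges (the small offset `eps` is a common guard against zero flows, not necessarily the authors' choice).

```python
import numpy as np

def nse(q_obs, q_sim):
    """Nash-Sutcliffe efficiency: 1 is perfect; biased toward high flows."""
    q_obs, q_sim = np.asarray(q_obs), np.asarray(q_sim)
    return 1.0 - np.sum((q_obs - q_sim) ** 2) / np.sum((q_obs - q_obs.mean()) ** 2)

def log_nse(q_obs, q_sim, eps=1e-6):
    """Same criterion on log-transformed discharges, emphasizing low flows."""
    return nse(np.log(np.asarray(q_obs) + eps), np.log(np.asarray(q_sim) + eps))

# NSGA-II would then minimize the pair (1 - NSE, 1 - logNSE)
q_obs = [5.0, 20.0, 3.0, 1.0]
q_sim = [4.5, 22.0, 2.5, 1.2]
print(nse(q_obs, q_sim), log_nse(q_obs, q_sim))
```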


Author(s):  
Nikita Rawat ◽  
Padmanabh Thakur ◽  
Utkarsh Jadli

The estimation of the electrical model parameters of solar PV, such as the light-induced current, diode dark saturation current, thermal voltage, series resistance, and shunt resistance, is indispensable for predicting the actual electrical performance of solar photovoltaic (PV) modules under changing environmental conditions. Therefore, this paper first reviews various methods for solar PV parameter estimation to highlight their shortfalls. Thereafter, a new parameter estimation method based on multi-objective optimisation, namely the Non-dominated Sorting Genetic Algorithm-II (NSGA-II), is proposed. Furthermore, to check the effectiveness and accuracy of the proposed method, conventional methods, such as 'Newton-Raphson' and 'Particle Swarm Optimisation' search algorithms, were tested on four solar PV modules of polycrystalline and monocrystalline materials. Finally, the solar PV module Photowatt PWP201 was considered and compared against six different state-of-the-art methods. Estimated performance indices such as current absolute error metrics, absolute relative power error, mean absolute error, and the P-V characteristic curve were compared. The results show the close proximity of the characteristic curve obtained with the proposed NSGA-II method to the curve obtained from the manufacturer's datasheet.
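For context, the five listed parameters enter the implicit single-diode equation, whose residuals any of the compared optimizers would minimize over the measured I-V curve; the sketch below evaluates those residuals (the parameter values are illustrative only, not fitted values).

```python
import numpy as np

def single_diode_residual(params, v, i_meas):
    """Implicit single-diode model residual for measured (V, I) pairs:
        I = Iph - I0*(exp((V + I*Rs)/(n*Vt)) - 1) - (V + I*Rs)/Rsh
    params = (Iph, I0, n*Vt, Rs, Rsh), the five quantities the abstract
    lists (thermal voltage folded together with the ideality factor).
    Error metrics built from these residuals feed the optimizer."""
    i_ph, i_0, n_vt, r_s, r_sh = params
    v, i = np.asarray(v, dtype=float), np.asarray(i_meas, dtype=float)
    i_model = (i_ph
               - i_0 * (np.exp((v + i * r_s) / n_vt) - 1.0)
               - (v + i * r_s) / r_sh)
    return i_model - i

# Illustrative guess for a PWP201-like module (values hypothetical)
params = (1.03, 3.5e-6, 1.35, 1.2, 800.0)
print(single_diode_residual(params, [12.0], [0.9]))
```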


2020 ◽  
Vol 24 (6) ◽  
pp. 3189-3209
Author(s):  
Céline Monteil ◽  
Fabrice Zaoui ◽  
Nicolas Le Moine ◽  
Frédéric Hendrickx

Abstract. Environmental modelling is complex, and models often require the calibration of several parameters that are not able to be directly evaluated from a physical quantity or field measurement. Multi-objective calibration has many advantages, such as adding constraints to a poorly constrained problem or finding a compromise between different objectives by defining a set of optimal parameters. The caRamel optimizer has been developed to meet the requirement for an automatic calibration procedure that delivers not just one but a family of parameter sets that are optimal with regard to a multi-objective target. The idea behind caRamel is to rely on stochastic rules while also allowing more "local" mechanisms, such as extrapolation along vectors in the parameter space. The caRamel algorithm is a hybrid of the multi-objective evolutionary annealing simplex (MEAS) method and the non-dominated sorting genetic algorithm II (ε-NSGA-II). It was initially developed for calibrating hydrological models but can be used for any environmental model. The caRamel algorithm is well adapted to complex modelling. The comparison with other optimizers in hydrological case studies (i.e. NSGA-II and MEAS) confirms the quality of the algorithm. An R package, caRamel, has been designed to easily implement this multi-objective optimization algorithm in the R environment.
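The "extrapolation along vectors" rule can be sketched as follows; this Python illustration is an assumption about the mechanism, not the caRamel implementation (which is an R package).

```python
import numpy as np

def extrapolate_along_vector(parent, previous, bounds, step=1.0, rng=None):
    """Sketch of a directional generation rule: a child is projected
    beyond a parent along the direction of its last improvement in
    parameter space (details here are an assumption, not caRamel's)."""
    rng = rng or np.random.default_rng()
    direction = parent - previous                  # last improvement vector
    child = parent + step * rng.uniform(0.0, 1.0) * direction
    lo, hi = bounds
    return np.clip(child, lo, hi)                  # respect parameter bounds

parent = np.array([0.6, 0.3])
previous = np.array([0.5, 0.4])
bounds = (np.array([0.0, 0.0]), np.array([1.0, 1.0]))
print(extrapolate_along_vector(parent, previous, bounds))
```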


Author(s):  
Jafar Roshanian ◽  
Ali A Bataleblu ◽  
Masoud Ebrahimi

Robustness and reliability of the designed trajectory are crucial for the flight performance of launch vehicles. In this paper, robust trajectory design optimization of a typical launch vehicle (LV) is proposed. Two formulations of the robust trajectory design optimization problem, using single-objective and multi-objective optimization concepts, are presented. Both aleatory and epistemic uncertainties, in model parameters and operational environment characteristics respectively, are incorporated in the problem. For uncertainty propagation and analysis, improved Latin hypercube sampling is utilized. A comparison between the robustness of the single-objective robust trajectory design optimization solution and that of the deterministic design optimization solution is illustrated using probability density functions. The multi-objective robust trajectory design optimization is executed through NSGA-II, and a set of feasible design points with a good spread is obtained in the form of a Pareto frontier. The final Pareto frontier presents a trade-off between two conflicting objectives, namely maximizing injection robustness and minimizing the gross lift-off mass of the launch vehicle. The resulting Pareto frontier of the multi-objective robust trajectory design optimization shows that with a 1% increase in gross mass, the robustness of the design point to the considered uncertainties can be increased by about 80%. Numerical simulation results also show that the multi-objective formulation is necessary to achieve a good trade-off between optimality and robustness.
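A basic Latin hypercube sampler and a robustness summary give the flavour of the uncertainty propagation step; the paper uses an improved LHS variant and its own robustness metric, so treat this purely as a sketch with hypothetical helper names.

```python
import numpy as np

def latin_hypercube(n_samples, bounds, rng=None):
    """Plain Latin hypercube sampler over box bounds = (lo, hi):
    one stratified sample per interval in each dimension."""
    rng = rng or np.random.default_rng()
    lo, hi = np.asarray(bounds[0], dtype=float), np.asarray(bounds[1], dtype=float)
    d = lo.size
    strata = rng.permuted(np.tile(np.arange(n_samples), (d, 1)), axis=1).T
    u = (strata + rng.uniform(size=(n_samples, d))) / n_samples
    return lo + u * (hi - lo)

def robustness(design, simulate, bounds, n=200):
    """Propagates parameter uncertainty through a hypothetical
    simulate(design, params) -> injection_error and summarizes
    robustness as mean plus spread of the error (smaller = more robust)."""
    samples = latin_hypercube(n, bounds)
    errors = np.array([simulate(design, p) for p in samples])
    return errors.mean() + 3.0 * errors.std()
```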


Author(s):  
Luis V. Santana-Quintero ◽  
Noel Ramírez-Santiago ◽  
Carlos A. Coello Coello

This chapter presents a hybrid between a particle swarm optimization (PSO) approach and scatter search. The main motivation for developing this approach is to combine the high convergence rate of the PSO algorithm with a local search approach based on scatter search, in order to obtain the main advantages of these two types of techniques. We propose a new leader selection scheme for PSO, which aims to accelerate convergence by increasing the selection pressure. However, this higher selection pressure reduces diversity. To alleviate that, scatter search is adopted after applying PSO, in order to spread the solutions previously obtained, so that a better distribution along the Pareto front is achieved. The proposed approach can produce reasonably good approximations of the Pareto fronts of high-dimensional multi-objective problems while performing only 4,000 fitness function evaluations. Test problems taken from the specialized literature are adopted to validate the proposed hybrid approach. Results are compared with respect to NSGA-II, an approach representative of the state of the art in the area.
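The leader-selection idea can be illustrated with a small, hedged sketch: a tournament over the external archive whose size acts as the selection-pressure knob. This is an illustration of the concept, not the authors' exact scheme.

```python
import numpy as np

def crowding_distance(f):
    """Crowding distance over an archive of objective vectors f (n x m)."""
    n, m = f.shape
    d = np.zeros(n)
    for j in range(m):
        order = np.argsort(f[:, j])
        d[order[0]] = d[order[-1]] = np.inf       # boundary points kept
        span = f[order[-1], j] - f[order[0], j] or 1.0
        d[order[1:-1]] += (f[order[2:], j] - f[order[:-2], j]) / span
    return d

def select_leader(archive_f, pressure=4, rng=None):
    """Tournament of size `pressure` over the non-dominated archive, won
    by the least crowded candidate.  Raising `pressure` strengthens
    convergence at the cost of diversity, which is why scatter search
    is applied afterwards to re-spread the front."""
    rng = rng or np.random.default_rng()
    d = crowding_distance(archive_f)
    candidates = rng.integers(0, len(archive_f), size=pressure)
    return candidates[np.argmax(d[candidates])]   # index of chosen leader

archive = np.array([[0.1, 0.9], [0.4, 0.5], [0.9, 0.1]])
print(select_leader(archive))
```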

