Comparative analysis of model behaviour for flood prediction purposes using Self-Organizing Maps

2009 · Vol 9 (2) · pp. 373–392
Author(s): M. Herbst, M. C. Casper, J. Grundmann, O. Buchholz

Abstract. Distributed watershed models constitute a key component in flood forecasting systems. It is widely recognized that models, because of their structural differences, vary in how well they capture different aspects of the system behaviour. This naturally also applies to the reproduction of peak discharges by a simulation model, which is of particular interest for the flood forecasting problem. In our study we use a Self-Organizing Map (SOM) in combination with index measures derived from the flow duration curve in order to examine the conditions under which three different distributed watershed models are capable of reproducing flood events present in the calibration data. These indices are specifically conceptualized to extract data on the peak discharge characteristics of model output time series obtained from Monte-Carlo simulations with the distributed watershed models NASIM, LARSIM and WaSIM-ETH. The SOM helps to analyze these data by producing a discretized mapping of their distribution in the index space onto a two-dimensional plane, such that their patterns, and consequently the patterns of model behaviour, can be conveyed in a comprehensible manner. It is demonstrated how the SOM provides useful information about details of model behaviour and also helps to identify the model parameters that are relevant for the reproduction of peak discharges and thus for flood prediction problems. It is further shown how the SOM can be used to identify those parameter sets among the Monte-Carlo data that most closely approximate the peak discharges of a measured time series. The resulting simulations represent the characteristics of the observed time series with partially superior accuracy compared to the reference simulation obtained with a simple calibration strategy using the global optimization algorithm SCE-UA. The most prominent advantage of using a SOM in the context of model analysis is that it allows the comparative evaluation of data from two or more models. Our results highlight the individuality of the model realizations in terms of the index measures and shed a critical light on the use and implementation of simple yet overly rigid calibration strategies.
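
To make the mapping step concrete, here is a minimal sketch (not the authors' code) of training a SOM on peak-discharge index vectors from Monte-Carlo runs; the toy data, index count and map size are illustrative assumptions:

```python
# A minimal sketch of SOM training on peak-discharge index vectors.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical index vectors: one row per Monte-Carlo run, columns are
# peak-discharge indices derived from the flow duration curve.
indices = rng.normal(size=(500, 3))
indices = (indices - indices.mean(0)) / indices.std(0)  # standardize

rows, cols, dim = 10, 10, indices.shape[1]
weights = rng.normal(size=(rows, cols, dim))
grid = np.stack(np.meshgrid(np.arange(rows), np.arange(cols),
                            indexing="ij"), axis=-1)

n_iter = 5000
for t in range(n_iter):
    x = indices[rng.integers(len(indices))]
    # best-matching unit (BMU): node whose weight vector is closest to x
    d = np.linalg.norm(weights - x, axis=-1)
    bmu = np.array(np.unravel_index(np.argmin(d), d.shape))
    # learning rate and neighbourhood radius decay over the iterations
    lr = 0.5 * np.exp(-t / n_iter)
    sigma = (max(rows, cols) / 2) * np.exp(-t / n_iter)
    h = np.exp(-np.sum((grid - bmu) ** 2, axis=-1) / (2 * sigma ** 2))
    weights += lr * h[..., None] * (x - weights)

# Map every run to its node: runs on nearby nodes have similar
# peak-discharge behaviour, which is what the discretized mapping conveys.
bmus = [np.unravel_index(np.argmin(np.linalg.norm(weights - v, axis=-1)),
                         (rows, cols)) for v in indices]
```

Runs that land on nearby nodes exhibit similar peak-discharge behaviour, which is what makes a side-by-side comparison of the three models on one map possible.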

2008 · Vol 5 (6) · pp. 3517–3555
Author(s): M. Herbst, H. V. Gupta, M. C. Casper

Abstract. Hydrological model evaluation and identification essentially depends on extracting and processing information from model time series. However, the type of information extracted by statistical measures has only very limited meaning because it does not relate to the hydrological context of the data. To overcome this inadequacy we exploit the diagnostic evaluation concept of Signature Indices, in which model performance is measured using theoretically relevant characteristics of system behaviour. In our study, a Self-Organizing Map (SOM) is used to process the Signatures extracted from Monte-Carlo simulations generated by a distributed conceptual watershed model. The SOM creates a hydrologically interpretable mapping of overall model behaviour, which immediately reveals deficits and trade-offs in the ability of the model to represent the different functional behaviours of the watershed. Further, it facilitates interpretation of the hydrological functions of the model parameters and provides preliminary information regarding their sensitivities. Most notably, we use this mapping to identify the set of model realizations (among the Monte-Carlo data) that most closely approximate the observed discharge time series in terms of the hydrologically relevant characteristics, and to confine the parameter space accordingly. Our results suggest that Signature-Index-based SOMs could potentially serve as tools for decision makers inasmuch as model realizations with specific Signature properties can be selected according to the purpose of the model application. Moreover, given that the approach helps to represent and analyze multi-dimensional distributions, it could be used to form the basis of an optimization framework that uses SOMs to characterize the model performance response surface. As such it provides a powerful and useful way to conduct model identification and model uncertainty analyses.
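
For concreteness, here is a minimal sketch of flow-duration-curve signatures in the diagnostic spirit described above; the thresholds, segment bounds and names are illustrative assumptions, not necessarily the paper's exact indices:

```python
# Example signature indices computed from the flow duration curve (FDC).
import numpy as np

def flow_duration_curve(q):
    """Discharge sorted in descending order plus exceedance probabilities."""
    q_sorted = np.sort(q)[::-1]
    p_exceed = np.arange(1, len(q) + 1) / (len(q) + 1)
    return q_sorted, p_exceed

def signature_vector(q_sim, q_obs, high=0.02, mid=(0.2, 0.7)):
    """Two example signatures: % bias in high-segment volume and in
    midsegment slope of the FDC (flows assumed > 0)."""
    qs, p = flow_duration_curve(q_sim)
    qo, _ = flow_duration_curve(q_obs)
    hi = p <= high  # high-flow (peak) segment
    fhv = 100.0 * (qs[hi].sum() - qo[hi].sum()) / qo[hi].sum()
    i1, i2 = (np.searchsorted(p, m) for m in mid)
    slope_sim = np.log(qs[i1]) - np.log(qs[i2])
    slope_obs = np.log(qo[i1]) - np.log(qo[i2])
    fms = 100.0 * (slope_sim - slope_obs) / slope_obs
    return np.array([fhv, fms])
```

Each Monte-Carlo run then contributes one such vector, and the collection of vectors is what the SOM organizes.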


2009 · Vol 13 (3) · pp. 395–409
Author(s): M. Herbst, H. V. Gupta, M. C. Casper

Abstract. Hydrological model evaluation and identification essentially involves extracting and processing information from model time series. However, the type of information extracted by statistical measures has only very limited meaning because it does not relate to the hydrological context of the data. To overcome this inadequacy we exploit the diagnostic evaluation concept of Signature Indices, in which model performance is measured using theoretically relevant characteristics of system behaviour. In our study, a Self-Organizing Map (SOM) is used to process the Signatures extracted from Monte-Carlo simulations generated by the distributed conceptual watershed model NASIM. The SOM creates a hydrologically interpretable mapping of overall model behaviour, which immediately reveals deficits and trade-offs in the ability of the model to represent the different functional behaviours of the watershed. Further, it facilitates interpretation of the hydrological functions of the model parameters and provides preliminary information regarding their sensitivities. Most notably, we use this mapping to identify the set of model realizations (among the Monte-Carlo data) that most closely approximate the observed discharge time series in terms of the hydrologically relevant characteristics, and to confine the parameter space accordingly. Our results suggest that Signature-Index-based SOMs could potentially serve as tools for decision makers inasmuch as model realizations with specific Signature properties can be selected according to the purpose of the model application. Moreover, given that the approach helps to represent and analyze multi-dimensional distributions, it could be used to form the basis of an optimization framework that uses SOMs to characterize the model performance response surface. As such it provides a powerful and useful way to conduct model identification and model uncertainty analyses.
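
The selection step described above could look roughly like the following (an assumed workflow, not the authors' code): locate the SOM node that best matches the observed signatures, then keep the Monte-Carlo realizations mapped to that node and use their parameter sets to confine the parameter space.

```python
# Selecting the behavioural realizations via the observation's SOM node.
import numpy as np

def bmu(weights, v):
    """Grid index of the best-matching unit on a (rows, cols, dim) SOM."""
    d = np.linalg.norm(weights - v, axis=-1)
    return np.unravel_index(np.argmin(d), d.shape)

def behavioural_parameters(weights, run_signatures, run_params, obs_signature):
    """Parameter sets of all runs whose signatures share the observation's node."""
    target = bmu(weights, obs_signature)
    keep = [i for i, s in enumerate(run_signatures) if bmu(weights, s) == target]
    return run_params[keep]
```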


1997 · Vol 36 (5) · pp. 141–148
Author(s): A. Mailhot, É. Gaume, J.-P. Villeneuve

The Storm Water Management Model's quality module is calibrated for a section of Québec City's sewer system using data collected during five rain events. It is shown that even for this simple model, calibration can fail: a similarly good fit between recorded data and simulation results can be obtained with quite different sets of model parameters, leading to great uncertainty in the calibrated parameter values. In order to further investigate the impacts of data scarcity and data uncertainty on calibration, we used a new methodology based on the Metropolis Monte Carlo algorithm. This analysis shows that, even for a large amount of calibration data generated by the model itself, small data uncertainties are necessary to significantly decrease the uncertainty of the calibrated parameters. It also confirms the usefulness of the Metropolis algorithm as a tool for uncertainty analysis in the context of model calibration.
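
A minimal sketch of the Metropolis algorithm in a calibration setting follows; the quality module is stood in for by a hypothetical simulate() toy and a Gaussian error model, both assumptions for illustration only:

```python
# Metropolis sampling of a parameter posterior under an assumed error model.
import numpy as np

rng = np.random.default_rng(1)

def simulate(theta, x):
    # placeholder for the actual model; here a toy two-parameter response
    return theta[0] * x + theta[1]

def log_likelihood(theta, x, y_obs, sigma):
    resid = y_obs - simulate(theta, x)
    return -0.5 * np.sum((resid / sigma) ** 2)

def metropolis(x, y_obs, sigma, theta0, step, n=20000):
    theta = np.asarray(theta0, dtype=float)
    ll = log_likelihood(theta, x, y_obs, sigma)
    chain = []
    for _ in range(n):
        prop = theta + rng.normal(scale=step, size=theta.size)
        ll_prop = log_likelihood(prop, x, y_obs, sigma)
        # accept with probability min(1, likelihood ratio)
        if np.log(rng.random()) < ll_prop - ll:
            theta, ll = prop, ll_prop
        chain.append(theta.copy())
    return np.array(chain)
```

The spread of the retained chain expresses the uncertainty of the calibrated parameters; shrinking sigma, the assumed data uncertainty, is what narrows that spread in the experiment described above.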


2011 · Vol 8 (2) · pp. 2423–2476
Author(s): T. Krauße, J. Cullmann

Abstract. The development of methods for estimating the parameters of hydrological models under uncertainty has attracted great interest in hydrological research in recent years. Besides the very popular Markov Chain Monte Carlo (MCMC) methods, which estimate the uncertainty of model parameters within a Bayesian framework, depth-based sampling methods, also termed robust parameter estimation (ROPE), have attracted increasing research interest. These methods treat the estimation of model parameters as a geometric search for a set of robustly performing parameter vectors, applying the concept of data depth. Recent studies showed that the parameter vectors estimated by depth-based sampling perform more robustly in validation. One major advantage of this kind of approach over the MCMC methods is that the formulation of a likelihood function within a Bayesian uncertainty framework becomes obsolete, and arbitrary purpose-oriented performance criteria defined by the user can be integrated without further complications. In this paper we present an advanced ROPE method, the Advanced Robust Parameter Estimation by Monte Carlo algorithm (AROPEMC). The AROPEMC algorithm is a modified version of the original robust parameter estimation algorithm ROPEMC developed by Bárdossy and Singh (2008). AROPEMC works by merging iterative Monte Carlo simulations, identifying well-performing parameter vectors, sampling robust parameter vectors according to the principle of data depth, and applying a well-founded stopping criterion of the kind used in supervised machine learning. The principles of the algorithm are illustrated by means of the Rosenbrock and Rastrigin functions, two well-known performance benchmarks for optimisation algorithms. Two case studies demonstrate the advantage of AROPEMC compared to state-of-the-art global optimisation algorithms. A distributed process-oriented hydrological model is calibrated and validated for flood forecasting in a small catchment characterised by extreme process dynamics.
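
A minimal sketch of the depth-based sampling idea follows (an illustration of ROPE-style selection, not AROPEMC itself); halfspace (Tukey) depth is approximated with random projections, and all names are illustrative assumptions:

```python
# One ROPE-style iteration: keep low-error runs, then retain candidates
# that lie geometrically deep inside that set.
import numpy as np

rng = np.random.default_rng(2)

def halfspace_depth(points, x, n_dirs=500):
    """Approximate Tukey depth of x w.r.t. points via random projections."""
    dirs = rng.normal(size=(n_dirs, points.shape[1]))
    dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)
    proj_pts = points @ dirs.T          # (n_points, n_dirs)
    proj_x = x @ dirs.T                 # (n_dirs,)
    below = (proj_pts <= proj_x).mean(axis=0)
    return np.minimum(below, 1.0 - below).min()

def rope_step(params, error, keep_frac=0.1, depth_min=0.1):
    """Select the best-performing (lowest-error) runs, then keep candidates
    that are deep within them: the robust parameter vectors."""
    good = params[np.argsort(error)[: int(keep_frac * len(params))]]
    depths = np.array([halfspace_depth(good, p) for p in params])
    return params[depths >= depth_min]
```

Because selection is driven by an arbitrary error column rather than a likelihood, any purpose-oriented performance criterion can be plugged in, which is the advantage over MCMC the abstract points out.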


2007 · Vol 4 (6) · pp. 3953–3978
Author(s): M. Herbst, M. C. Casper

Abstract. The reduction of information contained in model time series through the use of aggregating statistical measures is very high compared to the amount of information that one would like to draw from them for model identification and calibration purposes. Applied within a model identification context, aggregating statistical performance measures are inadequate for capturing details of time series characteristics. It has been shown that this loss of information on the residuals imposes important limitations on model identification and diagnostics and thus constitutes an element of the overall model uncertainty. In this contribution we present an approach using a Self-Organizing Map (SOM) to circumvent the identifiability problem induced by the low discriminatory power of aggregating performance measures. The SOM is used to differentiate the spectrum of model realizations, obtained from Monte-Carlo simulations with a distributed conceptual watershed model, based on the recognition of different patterns in the time series. Further, the SOM is used instead of a classical optimization algorithm to identify the model realizations among the Monte-Carlo simulations that most closely approximate the pattern of the measured discharge time series. The results are analyzed and compared with those of the manually calibrated model as well as with the results of the Shuffled Complex Evolution algorithm (SCE-UA).
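
As a rough illustration of the pattern-matching step (an assumed setup, not the authors' code), each simulated discharge series can be treated as one SOM input vector, and realizations ranked by how close their best-matching unit lies, on the map grid, to that of the measured series:

```python
# Ranking Monte-Carlo realizations by SOM map distance to the observation.
import numpy as np

def bmu(weights, series):
    d = np.linalg.norm(weights - series, axis=-1)
    return np.array(np.unravel_index(np.argmin(d), d.shape))

def rank_realizations(weights, sim_series, obs_series):
    """Indices of runs ordered by map distance to the observed pattern."""
    obs_node = bmu(weights, obs_series)
    map_dist = [np.linalg.norm(bmu(weights, s) - obs_node) for s in sim_series]
    return np.argsort(map_dist)
```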


2008 · Vol 12 (2) · pp. 657–667
Author(s): M. Herbst, M. C. Casper

Abstract. The reduction of information contained in model time series through the use of aggregating statistical performance measures is very high compared to the amount of information that one would like to draw from them for model identification and calibration purposes. It has been shown that this loss imposes important limitations on model identification and diagnostics and thus constitutes an element of the overall model uncertainty. In this contribution we present an approach using a Self-Organizing Map (SOM) to circumvent the identifiability problem induced by the low discriminatory power of aggregating performance measures. The SOM is used to differentiate the spectrum of model realizations, obtained from Monte-Carlo simulations with a distributed conceptual watershed model, based on the recognition of different patterns in the time series. Further, the SOM is used instead of a classical optimization algorithm to identify those model realizations among the Monte-Carlo simulation results that most closely approximate the pattern of the measured discharge time series. The results are analyzed and compared with those of the manually calibrated model as well as with the results of the Shuffled Complex Evolution algorithm (SCE-UA). In our study the latter slightly outperformed the SOM results. The SOM method, however, yields a set of equivalent model parameterizations and therefore also allows the parameter space to be confined to a region that closely represents a measured data set. This particular feature renders the SOM potentially useful for future model identification applications.
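
The confinement step could be sketched with a hypothetical helper (assuming the equivalent parameterizations have already been identified on the map):

```python
# Reduced search box from the set of equivalent model parameterizations.
import numpy as np

def confined_bounds(run_params, equivalent_idx):
    """Per-parameter (min, max) over the equivalent parameterizations."""
    eq = run_params[equivalent_idx]
    return np.column_stack([eq.min(axis=0), eq.max(axis=0)])  # (n_params, 2)
```

The resulting box can then seed a subsequent, narrower identification run, which is the sense in which the SOM confines the parameter space.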


2006 · Vol 19 (4) · pp. 564–578
Author(s): Xin Zhao, Pao-Shin Chu

Abstract. A Bayesian framework is developed to detect multiple abrupt shifts in a time series of annual major hurricane counts. The hurricane counts are modeled by a Poisson process whose intensity (i.e., the hurricane rate) is codified by a gamma distribution. Here, a triple hypothesis space concerning the annual hurricane rate is considered: "no change in the rate," "a single change in the rate," and "a double change in the rate." A hierarchical Bayesian approach involving three layers (data, parameter, and hypothesis) is formulated to derive the posterior probability of each possible hypothesis and its relevant model parameters through a Markov chain Monte Carlo (MCMC) method. Based on sampling from an estimated informative prior for the Poisson rate parameters and the posterior distribution of hypotheses, two simulated examples illustrate the effectiveness of the proposed method. Subsequently, the methodology is applied to the time series of major hurricane counts over the eastern North Pacific (ENP). Results indicate that hurricane activity over the ENP has very likely undergone a decadal variation, with two changepoints occurring around 1982 and 1999 delimiting three epochs: the inactive 1972–81 epoch, the active 1982–98 epoch, and the inactive 1999–2003 epoch. The Bayesian method also provides a means for predicting decadal major hurricane variations: given the recent inactive period of hurricane activity, a lower number of major hurricanes is predicted for the next decade.
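
As a sketch of the MCMC machinery, here is a single-changepoint Poisson model with gamma priors sampled by Gibbs; the paper's hierarchical triple-hypothesis framework is richer, and the priors and data here are illustrative assumptions:

```python
# Gibbs sampler for one changepoint in a series of annual counts.
import numpy as np

rng = np.random.default_rng(3)

def gibbs_changepoint(y, a=1.0, b=1.0, n_iter=5000):
    """y: annual counts. Returns samples of (changepoint, rate1, rate2)."""
    y = np.asarray(y)
    n = len(y)
    tau = n // 2
    samples = []
    for _ in range(n_iter):
        # conjugate gamma updates for the two epoch rates
        lam1 = rng.gamma(a + y[:tau].sum(), 1.0 / (b + tau))
        lam2 = rng.gamma(a + y[tau:].sum(), 1.0 / (b + n - tau))
        # discrete full conditional for the changepoint position
        logp = np.array([y[:t].sum() * np.log(lam1) - t * lam1
                         + y[t:].sum() * np.log(lam2) - (n - t) * lam2
                         for t in range(1, n)])
        w = np.exp(logp - logp.max())
        tau = rng.choice(np.arange(1, n), p=w / w.sum())
        samples.append((tau, lam1, lam2))
    return samples
```

The histogram of sampled tau values then expresses the posterior belief about when the rate shifted, the analogue of the 1982 and 1999 changepoints reported above.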


2020
Author(s): Naoki Koyama, Tadashi Yamada

<p>The aim of this paper is to verify the accuracy of the real-time flood prediction model, using the time-series analysis. Forecast information of water level is important information that encourages residents to evacuate. Generally, flood forecasting is conducted by using runoff analysis. However, in developing countries, there are not enough hydrological data in a basin. Therefore, this study assumes where poor hydrologic data basin and evaluates it through reproducibility and prediction by using time series analysis which statistical model with the water level data and rainfall data. The model is applied to the one catchment of the upper Tone River basin, one of the first grade river in Japan. This method is possible to reproduce hydrograph, if the observation stations exist several points in the basin. And using the estimated parameters from past flood events, we can apply this method to predict the water level until the flood concentration time which the reference point and observation station. And until this time, the peak water level can be predicted with the accuracy of several 10cm. Prediction can be performed using only water level data, but by adding rainfall data, prediction can be performed for a longer time.</p>


1970 · Vol 1 (3) · pp. 181–205
Author(s): Erik Eriksson

The term "stochastic hydrology" implies a statistical approach to hydrologic problems, as opposed to classic hydrology, which can be considered deterministic in its approach. During the International Hydrology Symposium, held 6–8 September 1967 at Fort Collins, a number of hydrology papers were presented consisting to a large extent of studies of long records of hydrological elements such as river run-off, these being treated as time series in the statistical sense. This approach is, no doubt, of importance for future work, especially in relation to prediction problems, and there seems to be no fundamental difficulty in introducing stochastic concepts into various hydrologic models. There is, however, some developmental work required (not to speak of educational work with respect to hydrologists) before the full benefit of the technique is obtained. The present paper is to some extent an exercise in the statistical study of hydrological time series (far from complete) and to some extent an effort to interpret certain features of such time series from a physical point of view. The material used is 30 years of groundwater level observations in an esker south of Uppsala, observations discussed recently by Hallgren & Sandsborg (1968).

