Simulation of rainfall time series from different climatic regions using the direct sampling technique

2014 ◽  
Vol 18 (8) ◽  
pp. 3015-3031 ◽  
Author(s):  
F. Oriani ◽  
J. Straubhaar ◽  
P. Renard ◽  
G. Mariethoz

Abstract. The direct sampling technique, belonging to the family of multiple-point statistics, is proposed as a nonparametric alternative to the classical autoregressive and Markov-chain-based models for daily rainfall time-series simulation. The algorithm uses the patterns contained in the training image (the past rainfall record) to reproduce the complexity of the signal without inferring a prior statistical model: the time series is simulated by sampling the training data set wherever a sufficiently similar neighborhood exists. The advantage of this approach is its capability to simulate complex statistical relations by respecting the similarity of patterns at different scales. The technique is applied to daily rainfall records from different climate settings, using a standard setup and without any optimization of the parameters. The results show that the overall statistics as well as the dry/wet spell patterns are simulated accurately. The extremes at the higher temporal scale are also reproduced adequately, reducing the well-known problem of overdispersion.
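The sampling loop at the heart of the technique can be sketched as follows. This is a minimal illustration of the idea, not the published algorithm: the neighborhood size `n`, the normalized-distance threshold, and the scan limit are illustrative stand-ins for the actual direct sampling parameters.

```python
import random

def direct_sampling(training, length, n=3, threshold=0.1, max_scan=1000, seed=0):
    """Simulate a series by direct sampling: for each new step, scan random
    positions in the training series for one whose preceding n values are
    sufficiently similar to the last n simulated values, then copy the value
    found at that position."""
    rng = random.Random(seed)
    sim = list(training[:n])              # seed the simulation with the first n training values
    span = (max(training) - min(training)) or 1.0
    while len(sim) < length:
        neigh = sim[-n:]
        best_pos, best_d = None, float("inf")
        for _ in range(max_scan):
            pos = rng.randrange(n, len(training))
            cand = training[pos - n:pos]
            d = sum(abs(a - b) for a, b in zip(neigh, cand)) / (n * span)
            if d < best_d:
                best_pos, best_d = pos, d
            if d <= threshold:            # good enough: accept and stop scanning
                break
        sim.append(training[best_pos])    # fall back to the best candidate seen
    return sim
```

Because values are always copied from the training record, the simulated series preserves the marginal distribution and local patterns of the data without any fitted parametric model.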

2014 ◽  
Vol 11 (3) ◽  
pp. 3213-3247 ◽  
Author(s):  
F. Oriani ◽  
J. Straubhaar ◽  
P. Renard ◽  
G. Mariethoz

Abstract. The Direct Sampling technique, belonging to the family of multiple-point statistics, is proposed as a non-parametric alternative to the classical autoregressive and Markov-chain-based models for daily rainfall time-series simulation. The algorithm uses the patterns contained in the training image (the past rainfall record) to reproduce the complexity of the signal without inferring a prior statistical model: the time series is simulated by sampling the training data set wherever a sufficiently similar neighborhood exists. The advantage of this approach is its capability to simulate complex statistical relations by respecting the similarity of patterns at different scales. The technique is applied to daily rainfall records from different climate settings, using a standard setup and without any optimization of the parameters. The results show that the overall statistics as well as the dry/wet spell patterns are simulated accurately. The extremes at the higher temporal scale are also reproduced adequately, reducing the well-known problem of over-dispersion.


Water ◽  
2018 ◽  
Vol 10 (10) ◽  
pp. 1477 ◽  
Author(s):  
Davide De Luca ◽  
Luciano Galasso

This study tests stationary and non-stationary approaches for modelling time series of hydro-meteorological variables. Specifically, the authors considered annual maximum rainfall accumulations observed in the Calabria region (southern Italy), focusing on time series characterized by heavy rainfall events that occurred in the study area from 1 January 2000 onwards. This choice is justified by the need to check whether the rainfall events of the new century differ substantially from those that occurred in the past. In detail, the whole data set of each considered time series (with sample size N > 40) was analyzed in order to compare recent and past rainfall accumulations at a specific site. All the proposed models were based on the Two-Component Extreme Value (TCEV) probability distribution, which is frequently applied to annual maximum time series in Calabria. The authors discussed the possible sources of uncertainty related to each framework and remarked on the crucial role played by ergodicity. In fact, if the process is assumed to be non-stationary, ergodicity cannot hold, and possible trends must therefore be derived from external sources other than the time series of interest: in this work, Regional Climate Model (RCM) outputs were considered in order to assess possible trends of the TCEV parameters. From the obtained results, adopting non-stationary models does not seem essential, as no significant trends appear in the observed data, owing to the considerable number of heavy events that also occurred in the central part of the last century.
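The TCEV distribution used above has the double-exponential form F(x) = exp(−Λ₁e^(−x/θ₁) − Λ₂e^(−x/θ₂)), mixing a component for ordinary events with a rarer, heavier outlier component. A minimal sketch of the CDF and a bisection-based return-level (quantile) computation follows; the parameter values in the usage note are purely illustrative, not fitted to Calabrian data.

```python
import math

def tcev_cdf(x, lam1, theta1, lam2, theta2):
    """CDF of the Two-Component Extreme Value (TCEV) distribution:
    two Gumbel-type components, one for ordinary events (lam1, theta1)
    and one for rare heavy outliers (lam2, theta2)."""
    return math.exp(-lam1 * math.exp(-x / theta1)
                    - lam2 * math.exp(-x / theta2))

def tcev_quantile(p, lam1, theta1, lam2, theta2, lo=0.0, hi=1e4, tol=1e-8):
    """Invert the CDF by bisection to obtain the p-quantile, i.e. the
    return level for return period T when p = 1 - 1/T."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if tcev_cdf(mid, lam1, theta1, lam2, theta2) < p:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)
```

For example, `tcev_quantile(1 - 1/50, 20.0, 15.0, 0.5, 40.0)` would give the 50-year return level under those (hypothetical) parameters; a trend analysis like the one described above amounts to asking whether such parameters drift over time.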


2010 ◽  
Vol 7 (4) ◽  
pp. 4957-4994 ◽  
Author(s):  
R. Deidda

Abstract. Previous studies indicate the generalized Pareto distribution (GPD) as a suitable distribution function to reliably describe the exceedances of daily rainfall records above a proper optimum threshold, which should be selected as small as possible to retain the largest sample while still assuring an acceptable fit. Such an optimum threshold may differ from site to site, consequently affecting not only the GPD scale parameter but also the probability of threshold exceedance. Thus a first objective of this paper is to derive expressions to parameterize a simple threshold-invariant three-parameter distribution function which is able to describe both zero and non-zero values of rainfall time series while assuring a perfect overlap with the GPD fitted on the exceedances of any threshold larger than the optimum one. Since the proposed distribution does not depend on the local thresholds adopted for fitting the GPD, it reflects only the on-site climatic signature and thus appears particularly suitable for hydrological applications and regional analyses. A second objective is to develop and test the Multiple Threshold Method (MTM), which infers the parameters of interest from the exceedances of a wide range of thresholds, again using the concept of parameter threshold-invariance. We show the ability of the MTM to fit historical daily rainfall time series recorded at different resolutions. Finally, we demonstrate the superiority of the MTM fit over the standard single-threshold fit, often adopted for partial duration series, by evaluating and comparing their performance on Monte Carlo samples drawn from GPDs with different shape and scale parameters and different discretizations.
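The multiple-threshold idea can be illustrated with a short sketch. This is not the paper's estimator (the paper develops its own threshold-invariant parameterization); it simply fits a GPD to the exceedances of each of several thresholds by the method of moments, using the threshold-invariance property that the shape estimate should stabilise once the threshold passes the unknown optimum. The minimum-sample cutoff of 30 is an illustrative choice.

```python
import statistics

def gpd_fit_moments(excesses):
    """Method-of-moments estimates of the GPD shape (xi) and scale (sigma)
    from a sample of threshold excesses (valid for xi < 1/2, where the
    variance exists)."""
    m = statistics.fmean(excesses)
    v = statistics.pvariance(excesses)
    xi = 0.5 * (1.0 - m * m / v)
    sigma = 0.5 * m * (m * m / v + 1.0)
    return xi, sigma

def multi_threshold_fit(data, thresholds):
    """Fit the GPD to the exceedances of each threshold in turn; under
    threshold-invariance the shape estimates should agree across all
    thresholds above the optimum one."""
    fits = {}
    for u in thresholds:
        exc = [x - u for x in data if x > u]
        if len(exc) >= 30:        # require enough exceedances for a stable fit
            fits[u] = gpd_fit_moments(exc)
    return fits
```

Plotting the shape estimate against the threshold is the usual diagnostic: a flat portion of the curve marks the threshold-invariant regime the abstract refers to.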


2006 ◽  
Vol 10 (6) ◽  
pp. 807-815 ◽  
Author(s):  
E. Zehe ◽  
A. K. Singh ◽  
A. Bárdossy

Abstract. In this study we present a robust method for generating precipitation time series for the Anas catchment in north-western India. The method employs a multivariate stochastic simulation model driven by a time series of objectively classified circulation patterns (CPs). A companion study (Zehe et al., 2006) already showed that CPs classified from the 500 or 700 hPa levels are suitable for explaining the space-time variability of precipitation in that area. The model is calibrated against observed rainfall time series for the period 1985–1992 for two different CP time series, one from the 500 hPa level and the other from the 700 hPa level, and 200 realizations of daily rainfall are simulated for the period 1985–1994. Simulations using the CPs from the 500 hPa level as input yield a good match with the observed averages and standard deviations of daily rainfall, and furthermore show good performance at the monthly scale. With the 700 hPa level CPs as input, the model clearly underestimates the standard deviation and performs much worse at the monthly scale, especially in the validation period 1993–1994. The presented results give evidence that CPs from the 500 hPa level, in combination with a multivariate stochastic model, constitute a suitable tool for alleviating the sparsity of precipitation data in developing regions with sparse hydro-meteorological data sets.


2008 ◽  
Vol 15 (6) ◽  
pp. 1013-1022 ◽  
Author(s):  
J. Son ◽  
D. Hou ◽  
Z. Toth

Abstract. Various statistical methods are used to process operational Numerical Weather Prediction (NWP) products with the aim of reducing forecast errors, and they often require sufficiently large training data sets. Generating a hindcast data set for this purpose can be costly, and a well-designed algorithm should be able to reduce the required size of these data sets. This issue is investigated for the relatively simple case of bias correction, by comparing a Bayesian algorithm of bias estimation with the conventionally used empirical method. As available forecast data sets are not large enough for a comprehensive test, synthetically generated time series representing the analysis (truth) and the forecast are used to increase the sample size. Since these synthetic time series retain the statistical characteristics of the observations and of the operational NWP model output, the results of this study can be extended to real observations and forecasts; this is confirmed by a preliminary test with real data. By using the climatological mean and standard deviation of the meteorological variable under consideration and the statistical relationship between the forecast and the analysis, the Bayesian bias estimator outperforms the empirical approach in terms of the accuracy of the estimated bias, and it can reduce the required size of the training sample by a factor of 3. This advantage of the Bayesian approach arises because it is less susceptible to sampling error in consecutive sampling. These results suggest that a carefully designed statistical procedure may reduce the need for the costly generation of large hindcast data sets.
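The paper's estimator is built on climatological statistics and the forecast-analysis relationship; its exact formulation is in the paper. As a hedged illustration of why prior information shrinks sampling error, here is a textbook conjugate normal update of a bias estimate, with all variances assumed known and the prior chosen purely for illustration.

```python
def empirical_bias(errors):
    # conventional estimate: plain sample mean of forecast-minus-analysis errors
    return sum(errors) / len(errors)

def bayesian_bias(errors, prior_mean, prior_var, error_var):
    """Posterior-mean bias under a normal prior N(prior_mean, prior_var)
    and i.i.d. normal errors with known variance error_var. The weight w
    on the sample mean grows with the sample size n, so with small
    training samples the prior dominates and damps sampling noise."""
    n = len(errors)
    w = (n / error_var) / (n / error_var + 1.0 / prior_var)
    return w * empirical_bias(errors) + (1.0 - w) * prior_mean
```

With a small sample the posterior mean sits between the noisy sample mean and the (climatologically informed) prior, which is the mechanism behind the reduced training-sample requirement described above.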


2020 ◽  
Author(s):  
Luisa-Bianca Thiele ◽  
Ross Pidoto ◽  
Uwe Haberlandt

For derived flood frequency analyses, stochastic rainfall models can be linked with rainfall-runoff models to improve the accuracy of design flood estimations when the length of observed rainfall and runoff data is insufficient. In the past, when stochastic rainfall time series were used for hydrological modelling, catchment rainfall was calculated from multiple point rainfall time series. As an alternative to this approach, it will be tested whether catchment rainfall can be modelled directly, negating the drawbacks of (and need for) generating spatially consistent multi-site time series. An Alternating Renewal rainfall model (ARM) will be used to generate both multiple point and lumped catchment rainfall time series at hourly resolution. The generated rainfall time series will drive the rainfall-runoff model HBV-IWW at an hourly time step for mesoscale catchments in Germany. Validation will be performed by comparing runoff and flood statistics of the modelled runoff obtained from stochastically generated lumped catchment rainfall versus multiple point rainfall. If the results based on catchment rainfall prove comparable to those using multiple point rainfall, catchment rainfall could be generated directly with the stochastic rainfall models. Extremes at the catchment scale may also be better represented if catchment rainfall is generated directly.


2013 ◽  
Vol 63 (2) ◽  
Author(s):  
Fadhilah Yusof ◽  
Ibrahim Lawal Kane ◽  
Zulkifli Yusop

The dependence structure of rainfall is usually very complex in both time and space. It is shown in this paper that the daily rainfall series of Ipoh and Alorsetar are affected by nonlinear characteristics of the variance, often referred to as variance clustering or volatility, where large changes tend to follow large changes and small changes tend to follow small changes. Most empirical modelling of hydrological time series has focused on modelling and predicting the mean behavior of the series through conventional Autoregressive Moving Average (ARMA) models fitted via the Box-Jenkins methodology. These conventional models assume the series is stationary, with constant mean and either constant or season-dependent variance; they do not account for the second-order moment (the conditional variance), but they form a good starting point for time series analysis. The residuals from preliminary ARIMA models fitted to the daily rainfall time series were therefore tested for ARCH behavior. Inspection of the autocorrelation structure showed that the residuals are uncorrelated while the squared residuals are autocorrelated; the Ljung-Box test confirmed these results. The McLeod-Li test and a test based on the Lagrange multiplier (LM) principle were applied to the squared residuals from the ARIMA models. The results of these auxiliary tests give clear evidence to reject the null hypothesis of no ARCH effect, indicating that GARCH modelling is necessary. The composite ARIMA-GARCH model therefore captures the dynamics of the daily rainfall series in the study areas more precisely. For the monthly average rainfall series of the same locations, a seasonal ARIMA model proved suitable.
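The Ljung-Box statistic used in the tests above, which becomes the McLeod-Li test when applied to squared residuals, can be sketched in a few lines. This is a minimal sketch for illustration; the lag count and the chi-squared comparison against critical values are left to the reader.

```python
def autocorr(x, k):
    """Sample autocorrelation of series x at lag k."""
    n = len(x)
    m = sum(x) / n
    num = sum((x[t] - m) * (x[t - k] - m) for t in range(k, n))
    den = sum((v - m) ** 2 for v in x)
    return num / den

def ljung_box_q(x, max_lag):
    """Ljung-Box portmanteau statistic Q = n(n+2) * sum_k r_k^2 / (n-k).
    Under the null of no autocorrelation, Q ~ chi-squared(max_lag).
    Applied to squared residuals this is the McLeod-Li test for ARCH
    effects: a large Q on the squares signals volatility clustering."""
    n = len(x)
    return n * (n + 2) * sum(autocorr(x, k) ** 2 / (n - k)
                             for k in range(1, max_lag + 1))
```

Running `ljung_box_q` on the ARIMA residuals and then on their squares reproduces the diagnostic pattern the abstract describes: a small Q for the residuals, a large Q for the squared residuals.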


2010 ◽  
Vol 14 (12) ◽  
pp. 2559-2575 ◽  
Author(s):  
R. Deidda

Abstract. Previous studies indicate the generalized Pareto distribution (GPD) as a suitable distribution function to reliably describe the exceedances of daily rainfall records above a proper optimum threshold, which should be selected as small as possible to retain the largest sample while still assuring an acceptable fit. Such an optimum threshold may differ from site to site, consequently affecting not only the GPD scale parameter but also the probability of threshold exceedance. Thus a first objective of this paper is to derive expressions to parameterize a simple threshold-invariant three-parameter distribution function which assures a perfect overlap with the GPD fitted on the exceedances over any threshold larger than the optimum one. Since the proposed distribution does not depend on the local thresholds adopted for fitting the GPD, it is expected to reflect the on-site climatic signature and thus appears particularly suitable for hydrological applications and regional analyses. A second objective is to develop and test the Multiple Threshold Method (MTM), which infers the parameters of interest by using exceedances over a wide range of thresholds, again applying the concept of parameter threshold-invariance. We show the ability of the MTM to fit historical daily rainfall time series recorded at different resolutions and with a significant percentage of heavily quantized data. Finally, we demonstrate the superiority of the MTM fit over the standard single-threshold fit, often adopted for partial duration series, by evaluating and comparing their performance on Monte Carlo samples drawn from GPDs with different shape and scale parameters and different discretizations.


2021 ◽  
Vol 2021 ◽  
pp. 1-15
Author(s):  
Alefu Chinasho ◽  
Bobe Bedadi ◽  
Tesfaye Lemma ◽  
Tamado Tana ◽  
Tilahun Hordofa ◽  
...  

Meteorological stations, particularly in developing countries, often have large numbers of missing values in their climate data sets (rainfall and temperature). Simply excluding the missing values from analyses has been a common way of handling them; however, it leads to partial and biased results. Filling the data gaps using reference data sets is a better and widely used approach. This study was therefore initiated to evaluate seven gap-filling techniques for daily rainfall data sets at five meteorological stations of the Wolaita Zone and its surroundings in South Ethiopia. The gap-filling techniques considered in this study were simple arithmetic means (SAM), the normal ratio method (NRM), correlation coefficient weighing (CCW), inverse distance weighting (IDW), multiple linear regression (MLR), empirical quantile mapping (EQM), and empirical quantile mapping plus (EQM+). These techniques were preferred because of their computational simplicity and appreciable accuracy. Their performance was evaluated using the mean absolute error (MAE), root mean square error (RMSE), skill scores (SS), and Pearson's correlation coefficient (R). The results indicated that MLR outperformed the other techniques at all five meteorological stations, showing the lowest RMSE and the highest SS and R at every station. Four techniques (SAM, NRM, CCW, and IDW) showed similar performance and ranked second at all stations, with few exceptions in particular periods. EQM+ improved, though not substantially, the performance of the gap-filling techniques at some stations. In general, MLR is suggested for filling in missing values of daily rainfall time series, although the second-ranked techniques could also be used depending on the period of interest at each station. The techniques performed better at stations located at higher altitudes.
The authors expect this paper to contribute substantially to the achievement of Sustainable Development Goal 13 (climate action) through the provision of more accurate gap-filling techniques.
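Of the techniques named above, inverse distance weighting has the simplest form: a missing daily value at a target station is estimated as a distance-weighted mean of simultaneous observations at neighbor stations. A minimal sketch follows; the coordinate layout and the distance power of 2 are illustrative choices, not those used in the study.

```python
def idw_fill(target_xy, neighbors, power=2.0):
    """Inverse distance weighting: estimate a missing value at the target
    station from simultaneous observations at neighbor stations.
    neighbors is a list of ((x, y), value) pairs; weights are 1/d^power."""
    num = den = 0.0
    for (x, y), value in neighbors:
        d2 = (x - target_xy[0]) ** 2 + (y - target_xy[1]) ** 2
        if d2 == 0.0:
            return value              # co-located station: use its value directly
        w = 1.0 / d2 ** (power / 2.0)
        num += w * value
        den += w
    return num / den
```

The normal ratio method differs only in its weights (ratios of long-term station means instead of inverse distances), which is why the study finds the two methods performing similarly.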


2021 ◽  
Author(s):  
Alexis Neven ◽  
Valentin Dall'Alba ◽  
Przemysław Juda ◽  
Julien Straubhaar ◽  
Philippe Renard

Abstract. Ground Penetrating Radar (GPR) is nowadays widely used for determining glacier thickness. However, this method provides thickness data only along the acquisition lines, and interpolation therefore has to be performed between them. Depending on the interpolation strategy, the calculated ice volumes can differ and can lack an accurate error estimate. Furthermore, glacial basal topography is often characterized by complex geomorphological features, which can be hard to reproduce using classical interpolation methods, especially when the conditioning data are sparse or the morphological features are too complex. This study investigates the applicability of multiple-point statistics (MPS) simulations to interpolate glacier bedrock topography from GPR measurements. In 2018, a dense GPR data set was acquired on the Tsanfleuron Glacier (Switzerland). The results obtained with the direct sampling MPS method are compared against those obtained with kriging and sequential Gaussian simulation (SGS) on both a synthetic data set – with known reference volume and bedrock topography – and the real data underlying the Tsanfleuron Glacier. Using the MPS-modelled bedrock, the ice volume of the Scex Rouge and Tsanfleuron Glaciers is estimated to be 113.9 ± 1.6 million m3. Unlike SGS and kriging, the direct sampling approach allowed not only an accurate volume estimation but also the generation of a set of realistic bedrock simulations. The complex karstic geomorphological features are reproduced and can be used to significantly improve, for example, the precision of sub-glacial flow estimation.

