Downsizing parameter ensembles for simulations of rare floods

2020 ◽  
Vol 20 (12) ◽  
pp. 3521-3549
Author(s):  
Anna E. Sikorska-Senoner ◽  
Bettina Schaefli ◽  
Jan Seibert

Abstract. For extreme-flood estimation, simulation-based approaches represent an interesting alternative to purely statistical approaches, particularly if hydrograph shapes are required. Such simulation-based methods are applied within continuous simulation frameworks that rely on statistical analyses of continuous streamflow time series derived from a hydrological model fed with long precipitation time series. These frameworks are, however, affected by high computational demands, particularly if floods with return periods > 1000 years are of interest or if modelling uncertainty due to different sources (meteorological input or hydrological model) is to be quantified. Here, we propose three methods for reducing the computational requirements of the hydrological simulations for extreme-flood estimation so that long streamflow time series can be analysed at a reduced computational cost. These methods rely on simulation of annual maxima and on analysing their simulated range to downsize the hydrological parameter ensemble to a small number of sets suitable for continuous simulation frameworks. The methods are tested in a Swiss catchment with 10 000 years of synthetic streamflow data simulated using a weather generator. Our results demonstrate the reliability of the proposed downsizing methods for robust simulations of rare floods with uncertainty. The methods are readily transferable to other situations where ensemble simulations are needed.
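The downsizing idea, simulating only annual maxima for the full ensemble and then retaining the few parameter sets that span the simulated range, can be sketched as follows. This is a minimal illustration with synthetic data; selecting the sets closest to the 5th, 50th and 95th percentiles of the ensemble spread is an assumption for illustration, not the authors' exact criterion.

```python
import random
import statistics

random.seed(42)

# Synthetic stand-in for a calibrated ensemble: 100 parameter sets,
# each yielding a series of simulated annual maxima (here random numbers).
n_sets, n_years = 100, 50
annual_maxima = [
    [random.gauss(200 + i, 30) for _ in range(n_years)]
    for i in range(n_sets)
]

def downsize(ensemble, quantile_targets=(0.05, 0.5, 0.95)):
    """Keep the parameter sets whose mean annual maximum sits closest
    to selected quantiles of the ensemble spread."""
    scores = [statistics.mean(series) for series in ensemble]
    ranked = sorted(range(len(scores)), key=lambda i: scores[i])
    return [ranked[min(int(q * len(ranked)), len(ranked) - 1)]
            for q in quantile_targets]

representatives = downsize(annual_maxima)
print(representatives)  # indices of the three retained parameter sets
```

Continuous long-term simulation would then be run only for these few representative sets rather than for the full ensemble.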



2014 ◽  
Vol 14 (5) ◽  
pp. 1283-1298 ◽  
Author(s):  
D. Lawrence ◽  
E. Paquet ◽  
J. Gailhard ◽  
A. K. Fleig

Abstract. Simulation methods for extreme flood estimation represent an important complement to statistical flood frequency analysis because a spectrum of catchment conditions potentially leading to extreme flows can be assessed. In this paper, stochastic, semi-continuous simulation is used to estimate extreme floods in three catchments located in Norway, all of which are characterised by flood regimes in which snowmelt often has a significant role. The simulations are based on SCHADEX, which couples a precipitation probabilistic model with a hydrological simulation such that an exhaustive set of catchment conditions and responses is simulated. The precipitation probabilistic model is conditioned by regional weather patterns, and a bottom–up classification procedure was used to define a set of weather patterns producing extreme precipitation in Norway. SCHADEX estimates for the 1000-year (Q1000) discharge are compared with those of several standard methods, including event-based and long-term simulations which use a single extreme precipitation sequence as input to a hydrological model, statistical flood frequency analysis based on the annual maximum series, and the GRADEX method. The comparison suggests that the combination of a precipitation probabilistic model with a long-term simulation of catchment conditions, including snowmelt, produces estimates for given return periods which are more in line with those based on statistical flood frequency analysis, as compared with the standard simulation methods, in two of the catchments. In the third case, the SCHADEX method gives higher estimates than statistical flood frequency analysis and further suggests that the seasonality of the most likely Q1000 events differs from that of the annual maximum flows. 
The semi-continuous stochastic simulation method highlights the importance of considering the joint probability of extreme precipitation, snowmelt rates and catchment saturation states when assigning return periods to floods estimated by precipitation-runoff methods. The SCHADEX methodology, as applied here, is dependent on observed discharge data for calibration of a hydrological model, and further study to extend its application to ungauged catchments would significantly enhance its versatility.
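The statistical flood frequency analysis used here as a benchmark typically fits an extreme-value distribution to the annual maximum series. A standard textbook sketch with a method-of-moments Gumbel fit (synthetic data, not the Norwegian records used in the paper):

```python
import math
import random
import statistics

random.seed(1)

# Synthetic annual maximum discharge series (m^3/s), standing in for
# an observed record.
annual_max = [random.gauss(500, 120) for _ in range(60)]

def gumbel_quantile(sample, return_period):
    """Method-of-moments Gumbel fit: scale = s*sqrt(6)/pi,
    location = mean - 0.5772*scale; T-year quantile from the inverse CDF."""
    mean = statistics.mean(sample)
    scale = statistics.stdev(sample) * math.sqrt(6) / math.pi
    loc = mean - 0.5772 * scale
    p = 1.0 - 1.0 / return_period   # annual non-exceedance probability
    return loc - scale * math.log(-math.log(p))

print(f"Q1000 estimate: {gumbel_quantile(annual_max, 1000):.0f} m^3/s")
```

SCHADEX-type simulation replaces the purely statistical extrapolation above with an exhaustive exploration of precipitation and catchment states, but the comparison in the paper is made against quantiles of exactly this kind.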




2020 ◽  
Author(s):  
Shima Azimi ◽  
Silvia Barbetta ◽  
Tommaso Moramarco ◽  
Angelica Tarpanelli ◽  
Stefania Camici ◽  
...  

<p>Floods are among the most frequent disasters, with dangerous impacts on societies and economies worldwide. Floodplain management and hydraulic risk analysis based on design flood estimation are essential tools for reducing damage and saving human lives. Flood Frequency Analysis (FFA) has classically been used to derive design river discharge estimates; however, the scarce availability of discharge observations, especially in small catchments (<150 km2), does not always permit its application. In addition, under the projections of the Intergovernmental Panel on Climate Change (IPCC), FFA might yield incorrect design river discharge estimates, as it rests on the assumption of stationarity. Long rainfall and temperature time series are generally much more available than discharge observations, but their temporal coverage is often insufficient for carrying out FFA via hydrological simulation.</p><p>To handle these drawbacks, the combination of stochastically generated rainfall and temperature time series, Regional Climate Model (RCM) projections and continuous hydrological models provides a reliable tool for obtaining long river discharge time series on which to base FFA. However, design flood estimates can be significantly uncertain due to several factors, such as (1) the specific model structure, parameterisations and process representation, (2) the catchment hydrology and (3) the specific climate change scenario.</p><p>The primary objective of this study is to explore the sensitivity of the design river discharge estimates to hydrological model complexity and parameterisation. For this, three continuous distributed hydrological models, the Modello Idrologico SemiDistribuito in continuo (MISDc), the Soil & Water Assessment Tool (SWAT) and the GEOFrame NewAGE model, are forced with long time series of rainfall and temperature obtained via the Neyman-Scott rectangular pulse model (NSRP) for stochastic rainfall generation and the fractionally differenced ARIMA model (FARIMA) for stochastic temperature generation. A secondary objective is to understand the impact of climate change and catchment hydrology on the design river discharge estimates via the use of different RCM projections.</p><p>The study is carried out in the Upper Nera catchment in Central Italy, which was impacted by the 2016 earthquake and for which hydraulic risk mitigation and adaptation measures must be identified to support forward planning in the floodplain areas where new settlements will be rebuilt.</p><p>Preliminary results suggest a strong dependence of the design river discharge estimates on the chosen hydrological model and a differing response of the sub-catchments to the climate change scenario.</p>
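A toy version of Neyman-Scott rectangular-pulse rainfall generation is sketched below: storm origins follow a Poisson process, each storm spawns several cells, and each cell is a rectangular pulse with exponential delay, duration and intensity. The parameter values are illustrative assumptions, not those calibrated for the Upper Nera catchment.

```python
import random

random.seed(7)

def nsrp_hourly(n_hours, storm_rate=0.01, cells_mean=5,
                delay_mean=2.0, duration_mean=3.0, intensity_mean=2.0):
    """Minimal Neyman-Scott rectangular pulses; returns an hourly
    rainfall series (mm/h) of length n_hours."""
    rain = [0.0] * n_hours
    t = 0.0
    while t < n_hours:
        t += random.expovariate(storm_rate)          # next storm origin
        n_cells = random.randint(1, 2 * cells_mean)  # cells in this storm
        for _ in range(n_cells):
            start = t + random.expovariate(1.0 / delay_mean)
            duration = random.expovariate(1.0 / duration_mean)
            intensity = random.expovariate(1.0 / intensity_mean)
            # Add the rectangular pulse to every hour it covers.
            for h in range(int(start), min(int(start + duration) + 1, n_hours)):
                rain[h] += intensity
    return rain

series = nsrp_hourly(24 * 365)  # one synthetic year at hourly resolution
print(f"wet hours: {sum(r > 0 for r in series)}, total: {sum(series):.0f} mm")
```

Feeding many such synthetic years through a continuous hydrological model is what makes derived flood frequency analysis possible where discharge records are short.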


2021 ◽  
Vol 5 (1) ◽  
pp. 51
Author(s):  
Enriqueta Vercher ◽  
Abel Rubio ◽  
José D. Bermúdez

We present a new forecasting scheme based on the credibility distribution of fuzzy events. This approach allows us to build prediction intervals using the first differences of the time series data. Additionally, the credibility expected value enables us to estimate the k-step-ahead pointwise forecasts. We analyze the coverage of the prediction intervals and the accuracy of the pointwise forecasts using different credibility approaches based on the upper differences. The comparative results were obtained using yearly time series from the M4 Competition. The performance and computational cost of our proposal are compared with those of automatic forecasting procedures.
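The general idea of interval forecasting from first differences can be illustrated with a plain empirical analogue. This sketch uses ordinary sample quantiles of the differences, not the credibility distribution of fuzzy events on which the proposed scheme is actually based.

```python
import statistics

def diff_interval_forecast(series, k=1, alpha=0.1):
    """k-step-ahead forecast interval built from empirical quantiles
    of the first differences, scaled over k steps."""
    diffs = [b - a for a, b in zip(series, series[1:])]
    q = statistics.quantiles(diffs, n=100)       # 99 percentile cut points
    lo = q[round(100 * alpha / 2) - 1]           # lower-tail quantile
    med = q[49]                                  # median difference
    hi = q[round(100 * (1 - alpha / 2)) - 1]     # upper-tail quantile
    last = series[-1]
    return (last + k * lo, last + k * med, last + k * hi)

series = [100, 103, 101, 106, 108, 107, 111, 113, 112, 116, 118, 117]
low, point, high = diff_interval_forecast(series, k=2)
print(round(low, 1), round(point, 1), round(high, 1))
```

The pointwise forecast is the last observation plus k times the median difference; the interval widens with k, mimicking growing multi-step uncertainty.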


2017 ◽  
Author(s):  
Federica Pardini ◽  
Mike Burton ◽  
Fabio Arzilli ◽  
Giuseppe La Spina ◽  
Margherita Polacci

Abstract. Quantifying time-series of sulphur dioxide (SO2) emissions during explosive eruptions provides insight into volcanic processes, assists in volcanic hazard mitigation, and permits quantification of the climatic impact of major eruptions. While volcanic SO2 is routinely detected from space during eruptions, the retrieval of plume injection height and SO2 flux time-series remains challenging. Here we present a new numerical method based on forward- and backward-trajectory analyses which enables such time-series to be robustly determined. The method is applied to satellite images of volcanic eruption clouds through the integration of the HYSPLIT software with custom-designed Python routines in a fully automated manner. Plume injection height and SO2 flux time-series are computed at a ~ 10-minute interval at low computational cost. Using this technique, we investigated the SO2 emissions from two sub-Plinian eruptions produced by Calbuco, Chile, in April 2015. We found a mean injection height above the vent of ~ 15 km for the two eruptions, with overshooting tops reaching ~ 20 km. We calculated a total of 300 ± 46 kt of SO2 released almost equally during both events, with 160 ± 30 kt produced by the first event and 140 ± 35 kt by the second. The retrieved SO2 flux time-series show an intense gas release during the first eruption (average flux of 2560 kt day−1), while a lower SO2 flux profile was seen for the second (average flux 560 kt day−1), suggesting that the first eruption was richer in SO2. This result is exemplified by plotting SO2 flux against retrieved plume height above the vent, revealing distinct trends for the two events. We propose that a pre-eruptive exsolved volatile phase was present prior to the first event, which could have led to the necessary overpressure to trigger the eruption. The second eruption, instead, was mainly driven by syn-eruptive degassing.
This hypothesis is supported by melt inclusion measurements of sulphur concentrations in plagioclase phenocrysts and groundmass glass of tephra samples through electron microprobe analysis. This work demonstrates that detailed interpretations of sub-surface magmatic processes during eruptions are possible using satellite SO2 data. Quantitative comparisons of high temporal resolution plume height and SO2 flux time-series offer a powerful tool to examine processes triggering and controlling eruptions. These novel tools open a new frontier in space-based volcanological research, and will be of great value when applied to remote, poorly monitored volcanoes, and to major eruptions that can have regional and global climate implications through, for example, influencing ozone depletion in the stratosphere and light scattering from stratospheric aerosols.
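Once a time series of total plume SO2 mass has been retrieved from successive satellite scenes, the flux follows from differencing the masses. A simplified finite-difference sketch (illustrative numbers, not the Calbuco retrievals; the paper's trajectory-based method additionally accounts for plume transport and SO2 loss):

```python
# Scene times (hours after onset) and retrieved total SO2 mass (kt);
# illustrative values only.
times_h = [0.0, 0.17, 0.33, 0.50, 0.67]   # ~10-minute retrieval period
mass_kt = [0.0, 18.0, 39.0, 57.0, 71.0]

def so2_flux(times, masses):
    """Forward-difference SO2 flux (kt/day) between successive scenes."""
    return [(m1 - m0) / (t1 - t0) * 24.0   # kt/h converted to kt/day
            for t0, t1, m0, m1 in zip(times, times[1:], masses, masses[1:])]

fluxes = so2_flux(times_h, mass_kt)
print([round(f) for f in fluxes])
```

Plotting such a flux series against the retrieved plume height is what reveals the distinct trends for the two Calbuco events.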


2014 ◽  
Vol 18 (1) ◽  
pp. 353-365 ◽  
Author(s):  
U. Haberlandt ◽  
I. Radtke

Abstract. Derived flood frequency analysis allows the estimation of design floods with hydrological modeling for poorly observed basins considering change and taking into account flood protection measures. There are several possible choices regarding precipitation input, discharge output and consequently the calibration of the model. The objective of this study is to compare different calibration strategies for a hydrological model considering various types of rainfall input and runoff output data sets and to propose the most suitable approach. Event based and continuous, observed hourly rainfall data as well as disaggregated daily rainfall and stochastically generated hourly rainfall data are used as input for the model. As output, short hourly and longer daily continuous flow time series as well as probability distributions of annual maximum peak flow series are employed. The performance of the strategies is evaluated using the obtained different model parameter sets for continuous simulation of discharge in an independent validation period and by comparing the model derived flood frequency distributions with the observed one. The investigations are carried out for three mesoscale catchments in northern Germany with the hydrological model HEC-HMS (Hydrologic Engineering Center's Hydrologic Modeling System). The results show that (I) the same type of precipitation input data should be used for calibration and application of the hydrological model, (II) a model calibrated using a small sample of extreme values works quite well for the simulation of continuous time series with moderate length but not vice versa, and (III) the best performance with small uncertainty is obtained when stochastic precipitation data and the observed probability distribution of peak flows are used for model calibration. 
This outcome suggests calibrating a hydrological model directly on probability distributions of observed peak flows, using stochastic rainfall as input, if it is intended for derived flood frequency analysis.
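Calibrating directly on the probability distribution of peak flows amounts to minimising the distance between simulated and observed annual-maximum quantiles rather than between flow time series. A minimal objective function of this kind (the quantile levels, plotting position and RMSE metric are assumptions for illustration):

```python
def ffa_objective(sim_annual_max, obs_annual_max,
                  probs=(0.5, 0.8, 0.9, 0.95, 0.98)):
    """RMSE between simulated and observed empirical quantiles of the
    annual maximum peak flows (Weibull plotting position)."""
    def quantile(sorted_x, p):
        pos = p * (len(sorted_x) + 1) - 1           # 0-based position
        i = max(0, min(int(pos), len(sorted_x) - 2))
        frac = max(0.0, min(pos - i, 1.0))          # clamp to the data range
        return sorted_x[i] * (1 - frac) + sorted_x[i + 1] * frac
    sim, obs = sorted(sim_annual_max), sorted(obs_annual_max)
    errs = [quantile(sim, p) - quantile(obs, p) for p in probs]
    return (sum(e * e for e in errs) / len(errs)) ** 0.5

obs = [310, 420, 365, 505, 450, 390, 610, 480, 430, 520]
sim_close = [o * 1.02 for o in obs]   # nearly matching distribution
sim_off = [o * 1.5 for o in obs]      # strongly biased distribution
print(ffa_objective(sim_close, obs) < ffa_objective(sim_off, obs))  # → True
```

An optimiser minimising this objective rewards parameter sets that reproduce the flood frequency curve even if individual events are not matched in time.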


2017 ◽  
Author(s):  
Maurizio Mazzoleni ◽  
Vivian Juliette Cortes Arevalo ◽  
Uta Wehn ◽  
Leonardo Alfonso ◽  
Daniele Norbiato ◽  
...  

Abstract. Accurate flood predictions are essential for reducing risk and damage over large urbanized areas. To improve prediction capabilities, hydrological measurements derived from traditional physical sensors are integrated in real time into mathematical models. Recently, traditional sensors have been complemented with low-cost social sensors. However, measurements derived from social sensors (i.e. crowdsourced observations) can be more spatially distributed but less accurate. In this study, we assess the usefulness for model performance of assimilating crowdsourced observations from a heterogeneous network of static physical, static social and dynamic social sensors. We assess the potential effects on model predictions for the extreme flood event that occurred in the Bacchiglione catchment in May 2013. Flood predictions are estimated at the target point of Ponte degli Angeli (Vicenza), the outlet of the Bacchiglione catchment, by means of a semi-distributed hydrological model. The contribution of the upstream sub-catchment is calculated using a conceptual hydrological model, and the flow is propagated along the river reach using a hydraulic model. In both models, a Kalman filter is implemented to assimilate the real-time crowdsourced observations. Because crowdsourced measurements were not available, we synthetically derived observations for the static social and dynamic social sensors. We consider three sets of experiments: (1) only physical sensors are available; (2) crowdsourced observations are received with a given probability; and (3) a realistic scenario of citizen engagement based on population distribution. The results demonstrate the importance of integrating crowdsourced observations. Observations from upstream sub-catchments assimilated into the hydrological model ensure good model performance at long lead times, while observations near the outlet of the catchment provide good results at short lead times. Furthermore, scenarios of citizen engagement motivated by a sense of belonging to a community of friends indicated improved flood predictions when such small communities are located upstream of a given target point. Effective communication and feedback between water authorities and citizens are required to ensure minimum engagement levels and to compensate for the intrinsically low and variable accuracy of crowdsourced observations.
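The weighting of heterogeneous observations in the assimilation reduces, in the scalar case, to the standard Kalman update: a larger error variance for a crowdsourced reading yields a smaller gain. A didactic sketch (the paper's filter operates on the full model state, and the variance values here are assumptions):

```python
def kalman_update(x_prior, p_prior, obs, obs_var):
    """Scalar Kalman update: the gain shrinks as the observation error
    variance grows, so noisier (crowdsourced) readings pull less."""
    gain = p_prior / (p_prior + obs_var)
    x_post = x_prior + gain * (obs - x_prior)
    p_post = (1.0 - gain) * p_prior
    return x_post, p_post

x, p = 120.0, 25.0  # forecast water level (cm) and its error variance
# Same reading from a physical sensor (small variance) and a citizen (large).
x_phys, p_phys = kalman_update(x, p, obs=130.0, obs_var=4.0)
x_crowd, p_crowd = kalman_update(x, p, obs=130.0, obs_var=100.0)
print(round(x_phys, 1), round(x_crowd, 1))  # the physical reading moves the state more
```

Assigning each sensor class its own observation variance is what lets accurate physical gauges and uncertain crowdsourced reports coexist in one filter.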

