Simulation of Extreme Heatwaves with Empirical Importance Sampling

2020 ◽  
Vol 13 (2) ◽  
pp. 763-781
Author(s):  
Pascal Yiou ◽  
Aglaé Jézéquel

Abstract. Simulating ensembles of extreme events is a necessary task to evaluate their probability distribution and analyze their meteorological properties. Algorithms of importance sampling have provided a way to simulate trajectories of dynamical systems (like climate models) that yield extreme behavior, like heat waves. Such algorithms also give access to the return periods of such events. We present an adaptation of importance sampling, based on circulation analogues, that provides a data-based algorithm to simulate extreme events like heat waves in a realistic way. This algorithm is a modification of a stochastic weather generator that gives more weight to trajectories with higher temperatures. This paper outlines the methodology using European heat waves and illustrates the spatial and temporal properties of the simulations.
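The core idea above, a stochastic weather generator whose resampling step up-weights warmer trajectories, can be sketched in a few lines. This is a toy illustration, not the authors' algorithm: the discrete temperature states, the analogue sets, and the exponential weighting exp(alpha * T) are all assumptions made for the example.

```python
import math
import random

def biased_weather_generator(analogues, t0, n_steps, alpha, rng):
    """Toy analogue-based stochastic weather generator with importance
    sampling: at each step the successor is drawn from the current
    state's analogues with weights exp(alpha * T), so alpha > 0 biases
    trajectories toward higher temperatures (alpha = 0 recovers the
    unbiased generator). Structure and weighting are illustrative."""
    traj = [t0]
    for _ in range(n_steps):
        candidates = analogues[traj[-1]]
        weights = [math.exp(alpha * t) for t in candidates]
        traj.append(rng.choices(candidates, weights=weights, k=1)[0])
    return traj

rng = random.Random(42)
states = list(range(20, 31))  # toy daily-mean temperatures in degrees C
# Each state's "circulation analogues": itself and its neighbours (assumed).
analogues = {t: [max(20, t - 1), t, min(30, t + 1)] for t in states}

unbiased = biased_weather_generator(analogues, 25, 90, alpha=0.0, rng=rng)
hot = biased_weather_generator(analogues, 25, 90, alpha=1.0, rng=rng)
```

With alpha = 0 the generator is unbiased; increasing alpha concentrates the simulated trajectories on the warm tail, which is how importance sampling makes rare heat-wave-like trajectories cheap to sample.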


Water ◽  
2019 ◽  
Vol 11 (9) ◽  
pp. 1896 ◽  
Author(s):  
Gabriel-Martin ◽  
Sordo-Ward ◽  
Garrote ◽  
García

This paper proposes the minimum number of storms necessary to derive extreme flood hydrographs accurately through event-based modelling. To do so, we analyzed the results obtained by coupling a continuous stochastic weather generator (the Advanced WEather GENerator) with a continuous distributed physically-based hydrological model (the TIN-based real-time integrated basin simulator) and simulating 5000 years of hourly flow at the basin outlet. We modelled the outflows of Peacheater Creek, a basin located in Oklahoma, USA. Afterwards, we separated the independent rainfall events within the 5000 years of hourly weather forcing and obtained the flood event associated with each storm from the continuous hourly flow. We ranked all the rainfall events within each year according to three criteria: total depth, maximum intensity, and total duration. Finally, we compared the flood events obtained from the continuous simulation to those associated with the N largest storm events per year according to the three criteria, focusing on four aspects: magnitude and recurrence of the maximum annual peak-flow and volume, seasonality of floods, dependence between maximum peak-flows and volumes, and bivariate return periods. The main results are: (a) considering the five largest total-depth storms per year captures the maximum annual peak-flow and volume with probabilities of 94% and 99%, respectively, and for return periods higher than 50 years the probability increases to 99% in both cases; (b) with the five largest total-depth storms per year, the seasonality of floods is reproduced with an error of less than 4%; and (c) the bivariate properties between peak-flow and volume are preserved, with an error in the estimation of the fitted copula of less than 2%.
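The storm-selection step evaluated above, keeping only the N largest independent storms per year by a chosen criterion, can be sketched as follows. The record layout (dicts with a 'year' field and ranking keys) is an illustrative assumption, not the study's data format.

```python
from collections import defaultdict

def top_n_storms_per_year(storms, n=5, key="depth"):
    """Keep the n largest independent storms per year, ranked by one
    criterion (e.g. total depth, maximum intensity, or duration)."""
    by_year = defaultdict(list)
    for storm in storms:
        by_year[storm["year"]].append(storm)
    return {
        year: sorted(events, key=lambda s: s[key], reverse=True)[:n]
        for year, events in by_year.items()
    }

# Toy event set: a few independent storms with their total depths (mm).
storms = [
    {"year": 2000, "depth": 80.0},
    {"year": 2000, "depth": 35.5},
    {"year": 2000, "depth": 12.0},
    {"year": 2001, "depth": 55.0},
]
selected = top_n_storms_per_year(storms, n=2)
```

Running the event-based model only on the selected storms, instead of the full continuous record, is what makes the approach cheap while (per the results above) still capturing the annual maxima with high probability.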


2016 ◽  
Vol 29 (5) ◽  
pp. 1605-1615 ◽  
Author(s):  
Jan Rajczak ◽  
Sven Kotlarski ◽  
Christoph Schär

Abstract Climate impact studies constitute the basis for the formulation of adaptation strategies. Usually such assessments apply statistically postprocessed output of climate model projections to force impact models. Increasingly, time series with daily resolution are used, which require high consistency, for instance with respect to transition probabilities (TPs) between wet and dry days and spell durations. However, both climate models and commonly applied statistical tools have considerable uncertainties and drawbacks. This paper compares the ability of 1) raw regional climate model (RCM) output, 2) bias-corrected RCM output, and 3) a conventional weather generator (WG) that has been calibrated to match observed TPs to simulate the sequence of dry, wet, and very wet days at a set of long-term weather stations across Switzerland. The study finds systematic biases in TPs and spell lengths for raw RCM output, but a substantial improvement after bias correction using the deterministic quantile mapping technique. For the region considered, bias-corrected climate model output agrees well with observations in terms of TPs as well as dry and wet spell durations. For the majority of cases (models and stations), bias-corrected climate model output is similar in skill to a simple Markov chain stochastic weather generator. There is strong evidence that bias-corrected climate model simulations capture the atmospheric event sequence more realistically than a simple WG.
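A minimal version of the transition-probability estimate at the heart of this comparison might look like this. The 1 mm and 10 mm wet/very-wet thresholds are assumptions for the example, not the paper's definitions.

```python
def transition_probabilities(precip, wet=1.0, very_wet=10.0):
    """Estimate dry / wet / very-wet day-to-day transition probabilities
    from a daily precipitation series (mm). Returns a 3x3 row-stochastic
    matrix: row = today's state, column = tomorrow's state."""
    def state(p):  # 0 = dry, 1 = wet, 2 = very wet
        return 0 if p < wet else (1 if p < very_wet else 2)

    counts = [[0, 0, 0] for _ in range(3)]
    for today, tomorrow in zip(precip, precip[1:]):
        counts[state(today)][state(tomorrow)] += 1

    tp = []
    for row in counts:
        total = sum(row)
        tp.append([c / total if total else 0.0 for c in row])
    return tp

precip = [0.0, 0.0, 2.0, 3.0, 12.0, 0.0, 0.0, 5.0, 0.0]  # mm/day, toy data
tp = transition_probabilities(precip)
```

Comparing such matrices (and the spell-length distributions they imply) between observations, raw RCM output, and bias-corrected output is the kind of diagnostic the paper applies at each station.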


2017 ◽  
Vol 10 (12) ◽  
pp. 4563-4575 ◽  
Author(s):  
Jared Lewis ◽  
Greg E. Bodeker ◽  
Stefanie Kremser ◽  
Andrew Tait

Abstract. A method, based on climate pattern scaling, has been developed to expand a small number of projections of fields of a selected climate variable (X) into an ensemble that encapsulates a wide range of indicative model structural uncertainties. The method described in this paper is referred to as the Ensemble Projections Incorporating Climate model uncertainty (EPIC) method. Each ensemble member is constructed by adding contributions from (1) a climatology derived from observations that represents the time-invariant part of the signal; (2) a contribution from forced changes in X, where those changes can be statistically related to changes in global mean surface temperature (Tglobal); and (3) a contribution from unforced variability that is generated by a stochastic weather generator. The patterns of unforced variability are also allowed to respond to changes in Tglobal. The statistical relationships between changes in X (and its patterns of variability) and Tglobal are obtained in a training phase. Then, in an implementation phase, 190 simulations of Tglobal are generated using a simple climate model tuned to emulate 19 different global climate models (GCMs) and 10 different carbon cycle models. Using the generated Tglobal time series and the correlation between the forced changes in X and Tglobal, obtained in the training phase, the forced change in the X field can be generated many times using Monte Carlo analysis. A stochastic weather generator is used to generate realistic representations of weather which include spatial coherence. Because GCMs and regional climate models (RCMs) are less likely to correctly represent unforced variability compared to observations, the stochastic weather generator takes as input measures of variability derived from observations, but also responds to forced changes in climate in a way that is consistent with the RCM projections. 
This approach to generating a large ensemble of projections is many orders of magnitude more computationally efficient than running multiple GCM or RCM simulations. Such a large ensemble of projections permits a description of a probability density function (PDF) of future climate states rather than a small number of individual story lines within that PDF, which may not be representative of the PDF as a whole; the EPIC method largely corrects for such potential sampling biases. The method is useful for providing projections of changes in climate to users wishing to investigate the impacts and implications of climate change in a probabilistic way. A web-based tool, using the EPIC method to provide probabilistic projections of changes in daily maximum and minimum temperatures for New Zealand, has been developed and is described in this paper.
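The three-part construction of an EPIC-style ensemble member (observed climatology + pattern-scaled forced change + stochastic unforced variability) can be sketched as below. Gaussian noise stands in for the paper's full stochastic weather generator, and all numbers are illustrative assumptions.

```python
import random

def epic_member(climatology, pattern, t_global, sigma, rng):
    """One EPIC-style ensemble member for a field X on a small grid:
    time-invariant climatology, plus a forced change obtained by scaling
    a spatial pattern with global-mean warming t_global, plus stochastic
    unforced variability (here simplified to independent Gaussian noise)."""
    return [
        c + p * t_global + rng.gauss(0.0, sigma)
        for c, p in zip(climatology, pattern)
    ]

rng = random.Random(0)
clim = [12.0, 11.5, 10.8]   # degrees C, three grid cells (assumed values)
pattern = [1.2, 1.0, 0.9]   # local change per degree of Tglobal (assumed)
members = [
    epic_member(clim, pattern, t_global=1.5, sigma=0.3, rng=rng)
    for _ in range(500)
]
```

In the full method, each member would draw its own Tglobal trajectory from the simple-climate-model emulations, so the ensemble spread reflects model structural uncertainty as well as unforced variability.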


Author(s):  
S. Ragettli ◽  
X. Tong ◽  
G. Zhang ◽  
H. Wang ◽  
P. Zhang ◽  
...  

Abstract Flood events are difficult to characterize if available observation records are shorter than the recurrence intervals, and the non-stationarity of the climate adds further uncertainty. In this study, we use a hydrological model coupled with a stochastic weather generator to simulate the summer flood regime in two mountainous catchments located in China and Switzerland. The models are set up with hourly data from only 10–20 years of observations but are successfully validated against 30–40-year records of flood frequencies and magnitudes. To assess climate change impacts on flood frequencies, we re-calibrate the weather generator with the climate statistics for 2021–2050 obtained from ensembles of bias-corrected regional climate models. Across all assessed return periods (10–100 years) and two emission scenarios, nearly all model chains indicate an intensification of flood extremes. According to the ensemble averages, the potential flood magnitudes increase by more than 30% in both catchments. The consistency of these results is remarkable and can be explained by three factors rarely combined in previous studies: reduced statistical uncertainty due to a stochastic modelling approach, hourly time steps, and a focus on headwater catchments where local topography and convective storms cause runoff extremes within a confined area.
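Once a long synthetic record of annual maxima exists, return periods can be read off empirically. A textbook Weibull plotting-position estimator (a standard choice, not necessarily the study's method) looks like this:

```python
def empirical_return_periods(annual_maxima):
    """Empirical return periods from a series of annual maxima using the
    Weibull plotting position T = (n + 1) / rank, where rank 1 is the
    largest value. Returns (value, return period) pairs, largest first."""
    n = len(annual_maxima)
    ranked = sorted(annual_maxima, reverse=True)
    return [(x, (n + 1) / rank) for rank, x in enumerate(ranked, start=1)]

maxima = [310.0, 180.0, 95.0, 260.0, 140.0]  # annual peak flows, toy values
pairs = empirical_return_periods(maxima)
```

With a stochastically generated record of thousands of years, even 100-year return levels fall well inside the sample, which is the "reduced statistical uncertainty" the abstract credits to the stochastic modelling approach.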


2011 ◽  
Author(s):  
Enrico Scoccimarro ◽  
Silvio Gualdi ◽  
Antonella Sanna ◽  
Edoardo Bucchignani ◽  
Myriam Montesarchio

2021 ◽  
Vol 5 (1) ◽  
pp. 1-11
Author(s):  
Vitthal Anwat ◽  
Pramodkumar Hire ◽  
Uttam Pawar ◽  
Rajendra Gunjal

The Flood Frequency Analysis (FFA) method was introduced by Fuller in 1914 to understand the magnitude and frequency of floods. The present study is carried out using the two most widely accepted probability distributions for FFA, namely the Gumbel Extreme Value type I (GEVI) and Log Pearson type III (LP-III) distributions. The Kolmogorov-Smirnov (KS) and Anderson-Darling (AD) tests were used to select the most suitable probability distribution at sites in the Damanganga Basin. Moreover, discharges were estimated for various return periods using GEVI and LP-III. The recurrence interval of the largest peak flood on record (Qmax) is 107 years (at Nanipalsan) and 146 years (at Ozarkhed) as per LP-III. The flood frequency curves (FFCs) indicate that LP-III is the best-fitting probability distribution for FFA of the Damanganga Basin. Therefore, the discharges and return periods estimated by the LP-III probability distribution are more reliable and can be used for designing hydraulic structures.
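A Gumbel (GEVI) quantile for a given return period can be estimated from annual peak flows by the method of moments. This is a standard textbook formulation, not necessarily the fitting procedure used in the study, and the peak values below are invented for the example.

```python
import math

def gumbel_quantile(sample, T):
    """Discharge with return period T from a Gumbel (EV type I) fit by
    the method of moments: beta = sqrt(6) * s / pi, u = mean - 0.5772 * beta,
    and the quantile x_T = u - beta * ln(-ln(1 - 1/T))."""
    n = len(sample)
    mean = sum(sample) / n
    var = sum((x - mean) ** 2 for x in sample) / (n - 1)
    beta = math.sqrt(6 * var) / math.pi
    u = mean - 0.5772 * beta
    return u - beta * math.log(-math.log(1 - 1 / T))

peaks = [820, 1150, 640, 990, 1430, 760, 1210, 880]  # annual peaks, toy data
q10 = gumbel_quantile(peaks, 10)
q100 = gumbel_quantile(peaks, 100)
```

An LP-III fit proceeds analogously but on the logarithms of the discharges, with a skew coefficient as a third parameter; goodness-of-fit tests such as KS and AD then decide between the two candidates, as done in the study.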


2016 ◽  
Vol 23 (6) ◽  
pp. 375-390 ◽  
Author(s):  
Katrin Sedlmeier ◽  
Sebastian Mieruch ◽  
Gerd Schädler ◽  
Christoph Kottmeier

Abstract. Studies using climate models and observed trends indicate that extreme weather has changed and may continue to change in the future. The potential impact of extreme events such as heat waves or droughts depends not only on their number of occurrences but also on "how these extremes occur", i.e., the interplay and succession of the events. These properties are largely unexplored, for past as well as future changes, and call for sophisticated methods of analysis. To address this issue, we use Markov chains for the analysis of the dynamics and succession of multivariate or compound extreme events. We apply the method to observational data (1951–2010) and an ensemble of regional climate simulations for central Europe (1971–2000, 2021–2050) for two types of compound extremes: heavy precipitation and cold in winter, and hot and dry days in summer. We identify three regions in Europe that are likely susceptible to a future change in the succession of heavy precipitation and cold in winter: southwestern France, northern Germany, and the region around Moscow in Russia. A change in the succession of hot and dry days in summer can be expected for regions in Spain and Bulgaria. The susceptibility to a dynamic change of hot and dry extremes in the Russian region will probably decrease.
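A two-state Markov-chain view of compound extremes, as used above, reduces to estimating transition probabilities of an extreme-day indicator. The hot/dry thresholds below are assumptions for the example, not the paper's definitions.

```python
def persistence_probability(temps, precip, t_threshold=30.0, p_threshold=1.0):
    """Probability that a compound hot-and-dry day is followed by another,
    i.e. the extreme-to-extreme transition probability of a two-state
    Markov chain built from a daily compound-extreme indicator."""
    compound = [
        t >= t_threshold and p < p_threshold
        for t, p in zip(temps, precip)
    ]
    successors = [b for a, b in zip(compound, compound[1:]) if a]
    return sum(successors) / len(successors) if successors else 0.0

temps = [31.0, 32.0, 29.0, 31.0, 31.0, 28.0]   # daily max temp, toy data
precip = [0.0, 0.0, 5.0, 0.0, 0.0, 2.0]        # daily precipitation (mm)
p_persist = persistence_probability(temps, precip)
```

Comparing such persistence probabilities between the control (1971–2000) and scenario (2021–2050) periods is the kind of succession-based diagnostic the abstract describes, beyond simple event counts.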

