Does the AO index have predictive power regarding extreme cold temperatures in Europe?

2020 ◽  
Author(s):  
Tamás Bódai ◽  
Torben Schmith

Abstract. With a view to seasonal forecasting of extreme value statistics, we apply the method of nonstationary extreme value statistics to determine the predictive power of large-scale quantities. Regarding winter cold extremes over Europe, we find that the monthly mean daily minimum local temperature, which we call a native co-variate in the present context, has much larger predictive power than the nonlocal monthly mean Arctic Oscillation index. Our results also suggest that exploiting both co-variates jointly is not possible with 70-year-long data sets.
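The kind of model comparison described here can be sketched with the extRemes R package: fit a stationary GEV and a GEV whose location parameter depends on a covariate, then compare the two fits with a likelihood-ratio test. The sketch below uses synthetic data and invented parameter values, and is not the authors' code; the covariate "ao" merely stands in for the AO index.

```r
## Minimal sketch (synthetic data): does a covariate improve a GEV fit?
## Cold extremes are treated as block maxima of the negated minima.
library(extRemes)
set.seed(1)
n  <- 70                                   # ~70 winters, as in the abstract
ao <- rnorm(n)                             # stand-in for the monthly mean AO index
neg_tmin <- (10 + 1.5 * ao) + 2 * (-log(-log(runif(n))))  # Gumbel sample, AO-dependent location
d  <- data.frame(neg_tmin = neg_tmin, ao = ao)

fit0 <- fevd(neg_tmin, data = d, type = "GEV")                       # stationary model
fit1 <- fevd(neg_tmin, data = d, type = "GEV", location.fun = ~ ao)  # AO as co-variate
lr.test(fit0, fit1)   # a significant improvement indicates predictive power
```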

Author(s):  
Martin Arntsen ◽  
Juliane Borge ◽  
Ole-Hermann Strømmesen ◽  
Edmond Hansen

The duration of current measurements is often short, ranging from a few weeks up to a year, whereas applying extreme value statistics to derive design levels requires relatively long time series. To mitigate the lack of long-term measurements, the Norwegian standard NS9415 for fish farm design requires the 50-year return period design level to be derived by multiplying the current maximum of a month-long current measurement by a prescribed conversion factor of 1.85. Here we use twelve data sets of yearlong coastal current measurements to explore the validity of this factor. For each yearlong time series, a design level with a 50-year return period is calculated by extreme value statistics and used to estimate the conversion factor. The mean of the resulting conversion factors is close to that of NS9415: 1.85 and 1.80 at 5 and 15 m depth, respectively. However, the spread in values is large, both geographically and between months. A conversion factor ranging from 1 to 4 reflects the different relative dominance of the driving forces in different coastal regions and seasons. The absence of a significant seasonal cycle in the conversion factors calculated here illustrates the difficulty of adjusting for season. The results illustrate and quantify the uncertainty and, often, the lack of conservatism in design levels derived from month-long current observations.
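The conversion-factor estimate can be illustrated with the extRemes R package: fit a GEV to the twelve monthly maxima of a yearlong record, compute the 50-year return level, and divide it by each month's maximum. The synthetic series below stands in for real current measurements; it is a sketch of the procedure, not the study's code.

```r
## Sketch (synthetic data): conversion factor from a yearlong current record.
library(extRemes)
set.seed(2)
speed <- rgamma(8760, shape = 2, scale = 0.1)   # hourly current speed, m/s (invented)
month <- rep(1:12, each = 730)                  # crude equal-length months

mmax <- tapply(speed, month, max)               # monthly maxima (block maxima)
## Only 12 block maxima: the fit, like the factor itself, is very uncertain.
fit  <- fevd(as.vector(mmax), type = "GEV", period.basis = "month")

rl50 <- as.numeric(return.level(fit, return.period = 50 * 12))  # 50 yr = 600 monthly blocks
rl50 / mmax          # conversion factor implied by each month's maximum
mean(rl50 / mmax)    # compare with the NS9415 value of 1.85
```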


2013 ◽  
Vol 1 (1) ◽  
pp. 275-322 ◽  
Author(s):  
N. V. Dung ◽  
B. Merz ◽  
A. Bárdossy ◽  
H. Apel

Abstract. In this paper we present a novel approach for flood hazard analysis of the whole Mekong Delta, with a particular focus on the Vietnamese part. Based on previous studies identifying the flood regime in the Mekong Delta as non-stationary (Delgado et al., 2010), we develop a non-stationary approach for flood hazard analysis. The approach is also bivariate, since flood severity in the Mekong Delta is determined by both the maximum discharge and the flood volume, which governs the flood duration. Probabilities of occurrence of peak discharge and flood volume are estimated by a copula. The flood discharges and volumes are used to derive synthetic hydrographs, which in turn constitute the upper boundary condition for a large-scale hydrodynamic model covering the whole Mekong Delta. The hydrodynamic model transforms the hydrographs into hazard maps. In addition, we extrapolate the observed trends in flood peak and volume and their associated non-stationary extreme value distributions to the year 2030 in order to give a flood hazard estimate for the near future. The uncertainty of extreme flood events in terms of the different possible combinations of peak discharge and flood volume given by the copula is considered. The uncertainty in hydrograph shape is also combined with the parameter uncertainty of the hydrodynamic model in a Monte Carlo framework, yielding uncertainty estimates in the form of quantile flood maps. The proposed methodology sets the frame for the development of probabilistic flood hazard maps for the entire Mekong Delta. The combination of bivariate, non-stationary extreme value statistics with large-scale flood inundation modeling and uncertainty quantification is novel in itself. Moreover, it is particularly novel for the Mekong Delta: a region for which not even a standard hazard analysis based on univariate, stationary extreme value statistics exists.
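The peak-volume dependence at the core of this approach can be sketched with the copula and extRemes R packages: fit a copula to pseudo-observations of annual peak discharge and flood volume, then sample synthetic peak/volume pairs of the kind that would drive the hydrograph construction. The GEV margins, parameter values and Gumbel copula below are invented for illustration; the paper's actual marginal and copula choices may differ.

```r
## Sketch (synthetic data): bivariate flood statistics via a copula.
library(copula)
library(extRemes)
set.seed(3)

## Invented annual flood peaks (m^3/s) and volumes (km^3) with positive dependence.
u    <- rCopula(60, gumbelCopula(2))
peak <- qevd(u[, 1], loc = 25000, scale = 5000, shape = 0.1)
vol  <- qevd(u[, 2], loc = 300,   scale = 60,   shape = 0.1)

## Fit a Gumbel copula to the pseudo-observations of the joint sample.
cop <- fitCopula(gumbelCopula(2), pobs(cbind(peak, vol)))

## Simulate peak/volume pairs; these would parameterize synthetic hydrographs.
sim <- rCopula(1000, cop@copula)
pv  <- cbind(peak = qevd(sim[, 1], 25000, 5000, 0.1),
             vol  = qevd(sim[, 2], 300, 60, 0.1))
head(pv)
```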


2013 ◽  
Vol 10 (1) ◽  
Author(s):  
Helena Penalva ◽  
Manuela Neves

Statistical Extreme Value Theory has grown gradually since the beginning of the 20th century. Its unquestionable importance in applications was definitively recognized after Gumbel's 1958 book, Statistics of Extremes. Nowadays extreme value statistics are widely used across the applied sciences, so accurately modeling extreme events has become increasingly important, and the analysis requires tools that are simple to use but that can also accommodate complex statistical models in order to produce valid inferences. Accurate, user-friendly, free and open-source software is therefore of great value to practitioners and researchers. This paper reviews the main steps for initiating a data analysis of extreme values in the R environment. Some well-documented packages are briefly described, and two data sets are used to illustrate some of their functions.
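As a taste of the workflow such a review covers, the sketch below fits a GEV to synthetic block maxima with evd, one of the well-documented R packages in this area; the data and parameter values are invented.

```r
## First steps of an extreme value analysis in R with the evd package.
library(evd)
set.seed(4)
xmax <- rgev(100, loc = 30, scale = 5, shape = 0.1)  # invented annual maxima

fit <- fgev(xmax)      # maximum-likelihood GEV fit
fit                    # parameter estimates with standard errors
pr <- profile(fit)     # profile log-likelihoods
confint(pr)            # profile-likelihood confidence intervals
plot(fit)              # diagnostics: P-P, Q-Q, density, return level plot
```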


2018 ◽  
Vol 05 (01) ◽  
pp. 1750005
Author(s):  
Swarna Khare ◽  
Zaid Chalabi ◽  
Ben Youngman

Cold temperature extremes can have a detrimental effect on human health, public services and the economy of a country. From a public health services perspective, it is important to quantify the frequency of occurrence of extreme cold events and how this frequency changes over time, in order to develop cost-effective anticipatory plans that reduce the potential impact of cold extremes on the exposed vulnerable population. Using non-stationary extreme-value analysis, the geographical and temporal distribution of cold temperature extremes over the last 160 years was investigated for several locations in England and Scotland. The temperature data were obtained from weather stations. It is shown that the 5-, 10-, 50- and 100-year return levels of minimum winter temperature increased throughout the 20th century. It is also shown that the probability of experiencing extreme cold temperatures has become very low in most locations, particularly in years with a positive phase of the North Atlantic Oscillation (NAO) index. Finally, an estimate of the approximate financial risk to the UK economy of consecutive days of extreme cold temperatures is presented.
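Time-varying return levels of the kind reported here can be sketched with the extRemes R package: fit a GEV with a linear time trend in the location parameter and evaluate return levels at chosen years via make.qcov(). The data and trend below are synthetic; this is an illustration of the technique, not the study's code.

```r
## Sketch (synthetic data): 5-, 10-, 50- and 100-year return levels
## from a GEV with a linear time trend in the location parameter.
library(extRemes)
set.seed(5)
yr <- 1850:2009                                # ~160 years, as in the abstract
neg_tmin <- (8 - 0.01 * (yr - 1850)) + 2 * (-log(-log(runif(length(yr)))))
d  <- data.frame(neg_tmin = neg_tmin, year = yr - 1850)

fit <- fevd(neg_tmin, data = d, type = "GEV", location.fun = ~ year)

## Evaluate return levels at the covariate values for 1850 and 2000.
v <- make.qcov(fit, vals = list(mu1 = c(0, 150)))
return.level(fit, return.period = c(5, 10, 50, 100), qcov = v)
```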


2018 ◽  
Author(s):  
Eva Steirou ◽  
Lars Gerlitz ◽  
Heiko Apel ◽  
Xun Sun ◽  
Bruno Merz

Abstract. The link between streamflow extremes and climatology has been widely studied over recent decades. However, a study investigating the effect of large-scale circulation variations on the distribution of seasonal discharge extremes at the European level is missing. Here we fit a climate-informed generalized extreme value (GEV) distribution to about 600 streamflow records in Europe for each of the standard seasons, i.e. to the winter, spring, summer and autumn maxima, and compare it with the classical GEV with time-invariant parameters. The study adopts a Bayesian framework and covers the period 1950 to 2016. Five indices with proven influence on the European climate are examined independently as covariates, namely the North Atlantic Oscillation (NAO), the East Atlantic pattern (EA), the East Atlantic/West Russia pattern (EA/WR), the Scandinavia pattern (SCA) and the Polar/Eurasia pattern (POL). For a high percentage of stations the climate-informed model is preferred to the classical model, a result that points towards an improved estimation of flood probabilities. For NAO in winter in particular, a strong influence on streamflow extremes is detected for large parts of Europe (the climate-informed model is preferred to the classical GEV for 44 % of the stations). Climate-informed fits are spatially coherent and form patterns that resemble the relations between the climate indices and seasonal precipitation, suggesting a prominent role of the considered circulation modes in flood generation. For certain regions, such as northwestern Scandinavia and the British Isles, variations of the climate indices result in considerably different extreme value distributions and thus in highly different flood estimates for individual years. Plots of extreme streamflow with an exceedance probability of 0.01 indicate that the deviation between the classical and climate-informed analyses concerns single years but can also persist over longer periods.
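The climate-informed model can be sketched with the extRemes R package, which also provides a built-in Bayesian estimator of the kind the study adopts. The synthetic NAO series and seasonal maxima below are invented, and the paper's priors and model-comparison criteria are not reproduced here.

```r
## Sketch (synthetic data): a climate-informed GEV fitted in a Bayesian setting.
library(extRemes)
set.seed(6)
nao  <- rnorm(67)                                  # winters 1950-2016, invented index
qmax <- (800 + 120 * nao) + 150 * (-log(-log(runif(67))))  # winter maxima, NAO-dependent
d    <- data.frame(qmax = qmax, nao = nao)

fit <- fevd(qmax, data = d, type = "GEV", location.fun = ~ nao,
            method = "Bayesian", iter = 5000)      # built-in MCMC sampler

ci(fit, type = "parameter")   # posterior intervals; mu1 is the NAO effect
```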


2018 ◽  
Vol 21 (2) ◽  
pp. 117-124 ◽  
Author(s):  
Bakhtyar Sepehri ◽  
Nematollah Omidikia ◽  
Mohsen Kompany-Zareh ◽  
Raouf Ghavami

Aims & Scope: In this research, eight variable selection approaches were used to investigate the effect of variable selection on the predictive power and stability of CoMFA models. Materials & Methods: Three data sets, comprising 36 EPAC antagonists, 79 CD38 inhibitors and 57 ATAD2 bromodomain inhibitors, were modelled by CoMFA. First, for each of the three data sets, a CoMFA model was built with all CoMFA descriptors; then, for each variable selection method, a new CoMFA model was developed, so that nine CoMFA models were built per data set. The results show that noisy and uninformative variables degrade CoMFA results. Based on the created models, applying five of the variable selection approaches, namely FFD, SRD-FFD, IVE-PLS, SRD-UVE-PLS and SPA-jackknife, increases the predictive power and stability of CoMFA models significantly. Results & Conclusion: Among them, SPA-jackknife removes the most variables, while FFD retains the most. FFD and IVE-PLS are time-consuming, whereas SRD-FFD and SRD-UVE-PLS run in a few seconds. Applying FFD, SRD-FFD, IVE-PLS or SRD-UVE-PLS also preserves the CoMFA contour map information for both fields.
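CoMFA itself is commercial software, so the pipeline compared above cannot be reproduced directly here. As a loose, open-source analogue of jackknife-based variable selection, the sketch below screens PLS coefficients with the pls R package on invented descriptor data; it illustrates the idea of discarding uninformative variables, not any of the specific methods benchmarked in the paper.

```r
## Sketch (synthetic data): jackknife-based variable screening for a PLS model.
library(pls)
set.seed(7)
n <- 60; p <- 200                       # 60 compounds, 200 field descriptors (invented)
X <- matrix(rnorm(n * p), n, p)
y <- as.numeric(X[, 1:5] %*% runif(5, 1, 2)) + rnorm(n)  # 5 informative variables
d <- data.frame(y = y, X = I(X))

## Leave-one-out PLS with jackknife variance estimates of the coefficients.
fit <- plsr(y ~ X, data = d, ncomp = 5, validation = "LOO", jackknife = TRUE)

## Retain variables whose coefficients differ significantly from zero.
jt   <- jack.test(fit, ncomp = 5)
keep <- which(jt$pvalues < 0.05)
length(keep)                            # number of retained descriptors
```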

