Predictions of Extreme Values of Significant Wave Height

Author(s):  
C. Guedes Soares ◽  
R. G. Ferreira ◽  
Manuel G. Scotto

This paper provides an overview of different methods of extrapolating environmental data to low probability levels based on extreme value theory. It discusses the Annual Maxima method and the Peak Over Threshold method using unified terminology and notation. Furthermore, it describes a method based on the r largest order statistics, which has the advantage of providing more accurate parameter and quantile estimates than the Annual Maxima method. Several examples illustrate the methodology and reveal the strengths and weaknesses of the various approaches.
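
As a rough illustration of the two classical approaches discussed above, the sketch below fits a GEV distribution to annual maxima and a Generalized Pareto distribution to threshold exceedances. The simulated series, the 95% threshold choice and scipy's parameterization are assumptions for illustration, not the paper's data or procedure.

```python
# Minimal sketch: Annual Maxima (GEV) vs Peak Over Threshold (GPD) fits on a
# placeholder daily series of significant wave heights ("30 years" of data).
import numpy as np
from scipy.stats import genextreme, genpareto

rng = np.random.default_rng(0)
hs = rng.weibull(1.5, size=365 * 30) * 2.0            # placeholder series

# Annual Maxima: fit a GEV distribution to the yearly block maxima.
annual_max = hs.reshape(30, 365).max(axis=1)
c, loc, scale = genextreme.fit(annual_max)            # scipy's shape c = -xi
hs_100yr = genextreme.isf(1.0 / 100.0, c, loc, scale)  # 100-year return level

# Peak Over Threshold: fit a GPD to the exceedances of an assumed threshold.
u = np.quantile(hs, 0.95)                             # illustrative threshold
excess = hs[hs > u] - u
xi, _, sigma = genpareto.fit(excess, floc=0)          # location fixed at zero
print(hs_100yr, xi, sigma)
```

Because the POT fit uses every exceedance rather than a single value per year, it typically extracts more information from the same record, which is the same motivation given above for the r largest order statistics method.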

2017 ◽  
Vol 47 (3) ◽  
pp. 895-917 ◽  
Author(s):  
Joan del Castillo ◽  
Jalila Daoudi ◽  
Isabel Serra

In this paper, we introduce the simplest exponential dispersion model containing the Pareto and exponential distributions. In this way, we obtain distributions with support (0, ∞) that over a long interval are equivalent to the Pareto distribution but, for very high values, decay like the exponential. This model is useful for solving relevant problems that arise in the practical use of extreme value theory. The results are applied to two real examples, the first on the analysis of aggregate loss distributions associated with the quantitative modelling of operational risk. The second example shows that the new model improves the fit to the destructive power of hurricanes, which are among the major causes of insurance losses worldwide.
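
For intuition, one generic family exhibiting the tail behaviour described above is sketched below; it is an illustrative density only, not necessarily the authors' exact parameterization.

```latex
% Illustrative density: Pareto-like over a long interval, exponential decay
% in the far tail (not claimed to be the authors' exact model).
\[
  f(x) \;\propto\; \Bigl(1+\tfrac{x}{\beta}\Bigr)^{-\alpha-1} e^{-x/\theta},
  \qquad x>0,\ \ \alpha,\beta,\theta>0 .
\]
% For beta << x << theta the density is approximately proportional to
% x^{-alpha-1} (Pareto regime); for x >> theta the exponential factor dominates.
```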


2019 ◽  
Vol 42 (2) ◽  
pp. 143-166 ◽  
Author(s):  
Renato Santos Silva ◽  
Fernando Ferraz Nascimento

Extreme Value Theory (EVT) is an important tool for predicting extreme gains and losses, with economics and the environment as its main areas of application. Initially, such events were modelled with standard parametric distributions such as the Normal and Gamma. However, economic and environmental data are in most cases heavy-tailed, in contrast to those distributions, which makes it difficult to model extreme events and nearly impossible to use conventional models to predict unobserved events that exceed the maximum of the observations. In some situations EVT is used to analyse only the maximum of a dataset, which provides few observations; in those cases it is more effective to use the r largest order statistics. This paper proposes Bayesian estimators for the parameters of the r largest order statistics model. Monte Carlo simulation was used to analyse the data and to study properties of the estimators such as mean, variance, bias and Root Mean Square Error (RMSE). Estimation of the parameters provided inference for the parameters and the return levels. The paper also presents a procedure for choosing the optimal r for the r largest order statistics, based on a Bayesian approach using Markov chain Monte Carlo (MCMC). Simulation results reveal that the Bayesian approach performs similarly to Maximum Likelihood Estimation, and the applications, developed with the Bayesian approach, showed a gain in accuracy compared with other estimators.
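
For reference, both the likelihood and the Bayesian analyses of the r largest order statistics rest on the standard joint density of the r largest values in a block (see, e.g., Coles, 2001), written here in the usual GEV parameterization (μ, σ, ξ).

```latex
% Joint density of the r largest order statistics z^{(1)} >= ... >= z^{(r)}
% in one block; valid where 1 + xi (z^{(k)} - mu)/sigma > 0 for all k.
\[
  f\bigl(z^{(1)},\dots,z^{(r)}\bigr)
  = \exp\!\left\{-\Bigl[1+\xi\frac{z^{(r)}-\mu}{\sigma}\Bigr]^{-1/\xi}\right\}
    \prod_{k=1}^{r}\frac{1}{\sigma}
    \Bigl[1+\xi\frac{z^{(k)}-\mu}{\sigma}\Bigr]^{-1/\xi-1}.
\]
```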


2021 ◽  
Vol 2021 ◽  
pp. 1-18
Author(s):  
Ghulam Raza Khan ◽  
Alanazi Talal Abdulrahman ◽  
Osama Alamri ◽  
Zahid Iqbal ◽  
Maqsood Ahmad

Extreme value theory (EVT) is useful for modeling the impact of crashes or situations of extreme stress on investor portfolios. EVT is mostly utilized in financial modeling, risk management, insurance, and hydrology. The price of gold fluctuates considerably over time, and this introduces a risk of its own. The goal of this study is to analyze the risk of gold investment by applying EVT to historical daily data on extreme daily losses and gains in the price of gold. We used daily gold prices in the Pakistan Bullion Market from August 1, 2011 to July 30, 2021. The paper covers two methods, Block Maxima (BM) and Peak Over Threshold (POT) modeling. The risk measures adopted are Value at Risk (VaR) and Expected Shortfall (ES). Point and interval estimates of VaR and ES are obtained by fitting the Generalized Pareto distribution (GPD). Moreover, return-level forecasts for the next 5 and 10 years are obtained by fitting the Generalized Extreme Value (GEV) distribution.
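
A minimal sketch of how POT-based VaR and ES can be computed from a fitted GPD is given below. The loss series and the 95% threshold are placeholders, not the paper's gold-price data, and the exceedance-based quantile formulas are the usual textbook ones, not necessarily the authors' exact procedure.

```python
# POT-based VaR and ES at level q from a GPD fitted to threshold exceedances.
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(1)
losses = rng.standard_t(df=4, size=2500)       # placeholder heavy-tailed daily losses

u = np.quantile(losses, 0.95)                  # illustrative threshold
excess = losses[losses > u] - u
xi, _, beta = genpareto.fit(excess, floc=0)    # GPD shape and scale

n, n_u, q = losses.size, excess.size, 0.99
var_q = u + (beta / xi) * (((1 - q) * n / n_u) ** (-xi) - 1)   # VaR_q
es_q = var_q / (1 - xi) + (beta - xi * u) / (1 - xi)           # ES_q (needs xi < 1)
print(var_q, es_q)
```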


Author(s):  
Jiqing Li ◽  
Jing Huang ◽  
Jianchang Li

Abstract. A time-varying design flood can make full use of the measured data and provide the reservoir with a basis for both flood control and operation scheduling. This paper adopts the peak over threshold method for flood sampling in unit periods and a Poisson process with time-dependent parameters to simulate the reservoir's time-varying design flood. Taking into account the relationship between the model parameters and the underlying assumptions, the paper uses the over-threshold intensity, the goodness of fit of the Poisson distribution, and the design flood parameters as criteria for selecting the unit period and the threshold of the time-varying design flood, and derives the time-varying design flood process of the Longyangxia reservoir at nine design frequencies. The time-varying design flood of inflow is closer to the reservoir's actual inflow conditions and can be used to adjust the operating water level in the flood season and to plan the utilization of flood resources in the basin.
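
A minimal sketch of the per-period exceedance counting behind such a POT/Poisson formulation is shown below, assuming simulated daily inflows, twelve hypothetical unit periods per year and an illustrative threshold; none of these come from the Longyangxia record.

```python
# Per-period exceedance counting for a POT / Poisson model with a
# time-dependent rate, on placeholder daily inflow data.
import numpy as np

rng = np.random.default_rng(2)
years, n_periods = 20, 12
flow = rng.gamma(shape=2.0, scale=500.0, size=365 * years)          # placeholder inflow
period = np.tile(np.repeat(np.arange(n_periods), 31)[:365], years)  # day -> unit period

u = np.quantile(flow, 0.98)          # illustrative threshold
exceed = flow > u

# Maximum-likelihood Poisson intensity for each unit period: the average
# number of exceedances observed in that period per year of record.
lam = np.array([exceed[period == p].sum() / years for p in range(n_periods)])
print(lam)                           # time-dependent over-threshold intensities
```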


2021 ◽  
Vol 7 (3) ◽  
pp. 4472-4484
Author(s):  
Wen Chao

<abstract><p>Catastrophe reinsurance is an important way to prevent and resolve catastrophe risks. As a consequence, the pricing of catastrophe reinsurance becomes a core problem in catastrophic risk management field. Due to the severity of catastrophe loss, the Peak Over Threshold (POT) model in extreme value theory (EVT) is extensively applied to capture the tail characteristics of catastrophic loss distribution. However, there is little research available on the pricing formula of catastrophe excess of loss (Cat XL) reinsurance when the catastrophe loss is modeled by POT. In the context of POT model, we distinguish three different relations between retention and threshold, and then prove the explicit pricing formula respectively under the standard deviation premium principle. Furthermore, we fit POT model to the earthquake loss data in China during 1990–2016. Finally, we give the prices of earthquake reinsurance for different retention cases. The computational results illustrate that the pricing formulas obtained in this paper are valid and can provide basis for the pricing of Cat XL reinsurance contracts.</p></abstract>


2015 ◽  
Vol 60 (206) ◽  
pp. 87-116 ◽  
Author(s):  
Julija Cerovic ◽  
Vesna Karadzic

The concept of Value at Risk (VaR) estimates the maximum loss of a financial position at a given time for a given probability. This paper considers the adequacy of the methods that form the basis of extreme value theory in the Montenegrin emerging market before and during the global financial crisis. In particular, the purpose of the paper is to investigate whether the peaks-over-threshold method outperforms the block maxima method in the evaluation of Value at Risk in emerging stock markets such as the Montenegrin market. The daily returns of the Montenegrin stock market index MONEX20 are analyzed for the period January 2004 - February 2014. Results of the Kupiec test show that the peaks-over-threshold method is significantly better than the block maxima method, but both methods fail the Christoffersen independence test and the joint test because of clustering of exceptions when measuring Value at Risk. Although better, the peaks-over-threshold method still cannot be treated as an accurate VaR model for the Montenegrin frontier stock market.
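
A minimal sketch of the Kupiec unconditional-coverage test referred to above follows; the exception count and sample size are placeholders, not MONEX20 backtest results.

```python
# Kupiec likelihood-ratio test of unconditional coverage for a VaR model:
# compares the observed exception rate x/n with the nominal rate p.
import numpy as np
from scipy.stats import chi2

def kupiec_lr(x: int, n: int, p: float) -> tuple[float, float]:
    """LR statistic and p-value for x exceptions in n days (assumes 0 < x < n)."""
    pi_hat = x / n
    log_l0 = (n - x) * np.log(1 - p) + x * np.log(p)            # H0: exception rate = p
    log_l1 = (n - x) * np.log(1 - pi_hat) + x * np.log(pi_hat)  # observed exception rate
    lr = -2 * (log_l0 - log_l1)
    return lr, chi2.sf(lr, df=1)                                # chi-square, 1 d.o.f.

print(kupiec_lr(x=18, n=1000, p=0.01))   # too many exceptions -> small p-value
```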


2007 ◽  
Vol 10 (06) ◽  
pp. 1043-1075 ◽  
Author(s):  
CARLO MARINELLI ◽  
STEFANO D'ADDONA ◽  
SVETLOZAR T. RACHEV

We compare in a backtesting study the performance of univariate models for Value-at-Risk (VaR) and expected shortfall based on stable laws and on extreme value theory (EVT). In analyzing these different approaches, we test whether the sum-stability assumption or the max-stability assumption, which respectively imply α-stable laws and Generalized Extreme Value (GEV) distributions, is more suitable for risk management based on VaR and expected shortfall. Our numerical results indicate that α-stable models tend to outperform pure EVT-based methods (especially those obtained by the so-called block maxima method) in the estimation of Value-at-Risk, while a peaks-over-threshold method turns out to be preferable for the estimation of expected shortfall. We also find empirical evidence that some simple semiparametric EVT-based methods perform well in the estimation of VaR.
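
For reference, the two stability notions being compared can be stated as follows, for i.i.d. copies X_1, ..., X_n of a non-degenerate X and suitable constants c_n > 0 and d_n.

```latex
% Sum-stability (alpha-stable laws) versus max-stability (GEV distributions).
\[
  X_1+\cdots+X_n \;\overset{d}{=}\; c_n X + d_n \quad\text{(sum-stable)},
  \qquad
  \max(X_1,\dots,X_n) \;\overset{d}{=}\; c_n X + d_n \quad\text{(max-stable)}.
\]
% The max-stable distributions are exactly the Generalized Extreme Value
% family, which is why block maxima are fitted with the GEV.
```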


2020 ◽  
Author(s):  
Nikos Koutsias ◽  
Frank A. Coutelieris

&lt;p&gt;A statistical analysis on the wildfire events, that have taken place in Greece during the period 1985-2007, for the assessment of the extremes has been performed. The total burned area of each fire was considered here as a key variable to express the significance of a given event. The data have been analyzed through the extreme value theory, which has been in general proved a powerful tool for the accurate assessment of the return period of extreme events. Both frequentist and Bayesian approaches have been used for comparison and evaluation purposes. Precisely, the Generalized Extreme Value (GEV) distribution along with Peaks over Threshold (POT) have been compared with the Bayesian Extreme Value modelling. Furthermore, the correlation of the burned area with the potential extreme values for other key parameters (e.g. wind, temperature, humidity, etc.) has been also investigated.&lt;/p&gt;

