Reinsurance Pricing of Large Motor Insurance Claims in Nigeria: An Extreme Value Analysis

2019
Vol 8 (4)
pp. 1
Author(s):  
Queensley C. Chukwudum

Reinsurance is of utmost importance to insurers because it enables insurance companies to cover risks that, under normal circumstances, they would not be able to cover on their own. An insurer needs to be able to evaluate its solvency probability and, consequently, adjust its retention levels appropriately, because the retention level plays a vital role in determining the premiums paid to the reinsurer. To illustrate how Extreme Value Theory can be applied, this study models the probabilistic behaviour of the frequency and severity of large motor claims from the Nigerian insurance sector (2013-2016) using the Negative Binomial-Generalized Pareto distribution (NB-GPD). The annual loss distribution is simulated using the Monte Carlo method and is used to predict the expected annual total claims and estimate the capital requirement for a year. Pricing of excess-of-loss (XL) reinsurance is also examined to aid insurers in optimizing their risk management decisions with regard to the choice of their risk transfer position.
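A minimal sketch of the collective-risk simulation described above, assuming a Negative Binomial claim count and a GPD for claim excesses over a large-claim threshold. Every parameter value (NB size and probability, GPD shape and scale, threshold, layer, loading) is an illustrative placeholder rather than an estimate from the study, and the function names are hypothetical.

```python
# Sketch: NB-GPD collective risk model with Monte Carlo simulation and a
# simple excess-of-loss (XL) layer price. All numbers are illustrative.
import numpy as np
from scipy.stats import nbinom, genpareto

rng = np.random.default_rng(42)

# Frequency: annual number of large claims ~ Negative Binomial (hypothetical)
nb_n, nb_p = 5.0, 0.4
# Severity: excess over the large-claim threshold u ~ GPD (hypothetical)
u, xi, beta = 10e6, 0.3, 4e6

def simulate_annual_totals(n_years=50_000):
    """Simulate total annual large-claim amounts via Monte Carlo."""
    counts = nbinom.rvs(nb_n, nb_p, size=n_years, random_state=rng)
    totals = np.empty(n_years)
    for i, k in enumerate(counts):
        severities = u + genpareto.rvs(xi, scale=beta, size=k, random_state=rng)
        totals[i] = severities.sum()
    return totals

def xl_layer_price(retention, limit, n_years=50_000, loading=0.2):
    """Expected annual ceded loss for an XL layer 'limit xs retention',
    plus a simple proportional loading as a premium indication."""
    counts = nbinom.rvs(nb_n, nb_p, size=n_years, random_state=rng)
    ceded = np.empty(n_years)
    for i, k in enumerate(counts):
        x = u + genpareto.rvs(xi, scale=beta, size=k, random_state=rng)
        ceded[i] = np.clip(x - retention, 0.0, limit).sum()
    return ceded.mean() * (1.0 + loading)

totals = simulate_annual_totals()
print("Expected annual total claims:", totals.mean())
print("Illustrative capital requirement (99.5% quantile):",
      np.quantile(totals, 0.995))
print("Indicative XL premium (40m xs 20m):", xl_layer_price(20e6, 40e6))
```

Separating the annual-total simulation from the layer-pricing function mirrors the two uses the abstract makes of the simulated loss distribution: capital estimation and XL premium indication.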

2021
Author(s):
Anne Dutfoy
Gloria Senfaute

Abstract Probabilistic Seismic Hazard Analysis (PSHA) procedures require that at least the mean activity rate be known, as well as the distribution of magnitudes. Under the Gutenberg-Richter assumption, that distribution is an exponential distribution truncated above at a maximum possible magnitude denoted $m_{max}$. This parameter is often fixed by expert judgement on tectonic considerations, due to the lack of a universal method. In this paper, we propose two innovative alternatives to the Gutenberg-Richter model, based on Extreme Value Theory, that do not require fixing the value of $m_{max}$ a priori: the first models the tail of the magnitude distribution with a Generalized Pareto Distribution; the second is a variation on the usual Gutenberg-Richter model in which $m_{max}$ is a random variable that follows a distribution derived from an extreme value analysis. We use maximum likelihood estimators that take into account the unequal observation spans depending on magnitude, the incompleteness threshold of the catalog and the uncertainty in the magnitude value itself. We apply these new recurrence models to the data observed in the Alps region in the south of France and integrate them into a probabilistic seismic hazard calculation to evaluate their impact on the seismic hazard levels. The proposed new recurrence models introduce a reduction of the seismic hazard level compared to the Gutenberg-Richter model conventionally used for PSHA calculations. This decrease is significant for all frequencies below 10 Hz, mainly at the lowest frequencies and for very long return periods. To our knowledge, neither model has previously been used in a probabilistic seismic hazard calculation, and they constitute a promising new generation of recurrence models.
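A minimal sketch of the first alternative, assuming a GPD is fitted by maximum likelihood to magnitude exceedances of a completeness threshold. The catalog below is synthetic, the threshold and b-value are illustrative, and the paper's full likelihood (unequal observation spans, completeness classes, magnitude uncertainty) is not reproduced.

```python
# Sketch: GPD tail fitted to magnitudes above a completeness threshold.
import numpy as np
from scipy.stats import genpareto, expon

rng = np.random.default_rng(0)

m_c = 3.0                                  # hypothetical completeness threshold
b = 1.0                                    # b-value used only to fake a catalog
mags = m_c + expon.rvs(scale=1 / (b * np.log(10)), size=500, random_state=rng)

# Fit the GPD to exceedances over m_c (location fixed at 0 for the excesses)
excess = mags - m_c
xi_hat, _, beta_hat = genpareto.fit(excess, floc=0.0)
print(f"shape xi = {xi_hat:.3f}, scale beta = {beta_hat:.3f}")

# A negative fitted shape implies a finite upper endpoint of the magnitude
# distribution, m_max = m_c - beta/xi, without fixing m_max a priori.
if xi_hat < 0:
    print("implied m_max =", m_c - beta_hat / xi_hat)
```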


2007
Vol 46 (10)
pp. 1501-1522
Author(s):
Allan W. MacAfee
Samuel W. K. Wong

Abstract Many of the extreme ocean wave events generated by tropical cyclones (TCs) can be explained by examining one component of the spectral wave field, the trapped-fetch wave (TFW). Using a Lagrangian TFW model, a parametric model representation of the local TC wind fields, and the National Hurricane Center’s hurricane database archive, a dataset of TFWs was created from all TCs in the Atlantic Ocean, Gulf of Mexico, and Caribbean Sea from 1851 to 2005. The wave height at each hourly position along a TFW trajectory was sorted into 2° × 2° latitude–longitude grid squares. Five grid squares (north of Hispaniola, Gulf of Mexico, Carolina coast, south of Nova Scotia, and south of Newfoundland) were used to determine whether extreme value theory could be applied to the extremes in the TFW dataset. The statistical results justify accepting that a generalized Pareto distribution (GPD) model with a threshold of 6 m could be fitted to the data: the datasets were mostly modeled adequately, and much of the output information was useful. Additional tests were performed by sorting the TFW data into the marine areas in Atlantic Canada, which are of particular interest to the Meteorological Service of Canada because of the high ocean traffic, offshore drilling activities, and commercial fishery. GPD models were fitted, and return periods and the 95% confidence intervals (CIs) for 10-, 15-, and 20-m return levels were computed. The results further justified the use of the GPD model; hence, extension to the remaining grid squares was warranted. Of the 607 grid squares successfully modeled, the percentages of grid squares with finite lower (upper) values for the 10-, 15-, and 20-m return level CIs were 100 (80), 94 (53), and 90 (16), respectively. The lower success rate for 20-m TFW CIs was expected, given the rarity of 20-m TFWs: of the 5 713 625 hourly TFW points, only 13 958, or 0.24%, were 20 m or higher. Overall, the distribution of the successfully modeled grid squares in the data domain agreed with TFW theory and TC climatology. As a direct result of this study, the summary datasets and return level plots were integrated into application software for use by risk managers. A description of the applications illustrates their use in addressing various questions on extreme TFWs.
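A minimal sketch of the peaks-over-threshold step, assuming a GPD fitted to wave heights above the 6 m threshold and a naive bootstrap for the confidence interval; the wave sample below is a synthetic stand-in for the gridded TFW dataset, and only the threshold and record span come from the abstract.

```python
# Sketch: return period of 10-, 15- and 20-m waves from a GPD fit above 6 m,
# with a simple bootstrap 95% confidence interval. Data are synthetic.
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(1)
u = 6.0                                     # threshold from the study (metres)

waves = genpareto.rvs(0.05, loc=u, scale=2.0, size=2000, random_state=rng)
years_of_record = 155                       # 1851-2005

def return_period(level, exceedances, n_years):
    """Return period (years) of 'level' from a GPD fit to threshold excesses."""
    xi, _, beta = genpareto.fit(exceedances - u, floc=0.0)
    rate = len(exceedances) / n_years       # exceedances of u per year
    p_exceed = genpareto.sf(level - u, xi, scale=beta)
    return 1.0 / (rate * p_exceed)

for level in (10.0, 15.0, 20.0):
    boot = [return_period(level,
                          rng.choice(waves, size=len(waves), replace=True),
                          years_of_record)
            for _ in range(500)]
    lo, hi = np.percentile(boot, [2.5, 97.5])
    print(f"{level:>4.0f} m: {return_period(level, waves, years_of_record):8.1f} yr "
          f"(95% CI {lo:.1f}-{hi:.1f})")
```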


2015
Vol 76 (1)
Author(s):
Nor Azrita Mohd Amin
Mohd Bakri Adam
Ahmad Zaharin Aris

Extreme value theory is a well-known statistical approach for modelling extreme data in environmental management. The main focus is to compare the generalized extreme value distribution (GEV) and the generalized Pareto distribution (GPD) for modelling extreme data in terms of estimated parameters and return levels. The maximum daily PM10 data for the Johor Bahru monitoring station, based on a 14-year database (1997-2010), were analysed. It is found that the estimated parameters are more comparable when the extracted numbers of extreme series for both models are similar. The 10-year return value for the GEV is  while for the GPD it is . Based on the threshold choice plot, threshold  is chosen and the corresponding 10-year return level is . According to the air pollution index in Malaysia, this value is categorized as hazardous.
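A minimal sketch of the GEV-versus-GPD comparison, assuming a synthetic daily PM10 series, 365-day blocks and an illustrative 95th-percentile threshold in place of the study's threshold-choice plot.

```python
# Sketch: 10-year return level from a GEV fit to annual maxima versus a GPD
# fit to threshold exceedances, on a synthetic daily PM10 series.
import numpy as np
from scipy.stats import genextreme, genpareto, gamma

rng = np.random.default_rng(7)
pm10 = gamma.rvs(a=4.0, scale=15.0, size=14 * 365, random_state=rng)  # 14 "years"

# GEV fitted to annual maxima; 10-year return level = quantile exceeded with
# annual probability 1/10
annual_max = pm10.reshape(14, 365).max(axis=1)
c, loc, scale = genextreme.fit(annual_max)
rl_gev = genextreme.isf(1.0 / 10, c, loc, scale)

# GPD fitted to exceedances of a high threshold (here the 95th percentile)
u = np.quantile(pm10, 0.95)
excess = pm10[pm10 > u] - u
xi, _, beta = genpareto.fit(excess, floc=0.0)
rate = len(excess) / 14                     # exceedances per year
rl_gpd = u + genpareto.isf(1.0 / (10 * rate), xi, scale=beta)

print(f"10-year return level: GEV {rl_gev:.1f}, GPD {rl_gpd:.1f} (ug/m3)")
```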


2021
Author(s):
Katharina Klehmet
Peter Berg
Denica Bozhinova
Louise Crochemore
Ilias Pechlivanidis
...  

Robust information on hydrometeorological extremes is important for effective risk management, mitigation and adaptation measures by public authorities and civil engineers dealing, for example, with water management. Typically, return values of certain variables, such as extreme precipitation and river discharge, are of particular interest and are modelled statistically using Extreme Value Theory (EVT). However, the estimation of these rare events based on extreme value analysis is affected by short observational records, leading to large uncertainties.

In order to overcome this limitation, we propose to use the latest seasonal meteorological prediction system of the European Centre for Medium-Range Weather Forecasts (ECMWF SEAS5) and seasonal hydrological forecasts generated with the pan-European E-HYPE model for the original period 1993-2015, and to extend the dataset to longer synthetic time series by pooling single forecast months into surrogate years. To ensure an independent dataset, the seasonal forecast skill is assessed in advance and months (and lead months) with positive skill are excluded. In this study, we simplify the method and work with samples of 6- and 4-month forecasts (instead of the full 7-month forecasts), depending on the statistical independence of the variables. This enables the record to be extended from the original 23 years to 3450 and 2300 surrogate years for the 6- and 4-month forecasts, respectively.

Furthermore, we investigate the robustness of estimated 50- and 100-year return values for extreme precipitation and river discharge using 1-year block maxima fitted to the Generalized Extreme Value distribution. Surrogate sets of pooled years are randomly constructed using a Monte Carlo approach, and different sample sizes are chosen. This analysis reveals a considerable reduction in the uncertainty of all return period estimations for both variables at selected locations across Europe when using a sample size of 500 years. This highlights the potential of using ensembles of meteorological and hydrological seasonal forecasts to obtain time series of sufficient length and minimize the uncertainty in the extreme value analysis.
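A minimal sketch of the sample-size experiment, assuming a synthetic pool of surrogate-year block maxima in place of the pooled SEAS5/E-HYPE forecasts; the GEV parameters, sample sizes and repeat count are illustrative.

```python
# Sketch: how the spread of the fitted 100-year return level shrinks as the
# number of surrogate years used in the GEV fit grows. The "pool" is synthetic.
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(3)

pool = genextreme.rvs(-0.1, loc=50.0, scale=12.0, size=3450, random_state=rng)

def rl100_spread(sample_size, n_repeats=200):
    """Monte Carlo spread of the 100-year return level for a given sample size."""
    rls = []
    for _ in range(n_repeats):
        sample = rng.choice(pool, size=sample_size, replace=False)
        c, loc, scale = genextreme.fit(sample)
        rls.append(genextreme.isf(1.0 / 100, c, loc, scale))
    return np.percentile(rls, [2.5, 97.5])

for n in (23, 100, 500):
    lo, hi = rl100_spread(n)
    print(f"n = {n:4d} surrogate years: 95% range of RL100 = [{lo:.1f}, {hi:.1f}]")
```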


2020
Author(s):
Nikos Koutsias
Frank A. Coutelieris

A statistical analysis of the wildfire events that took place in Greece during the period 1985-2007 has been performed to assess the extremes. The total burned area of each fire was considered here as the key variable expressing the significance of a given event. The data have been analysed through extreme value theory, which has in general proved to be a powerful tool for the accurate assessment of the return period of extreme events. Both frequentist and Bayesian approaches have been used for comparison and evaluation purposes. Specifically, the Generalized Extreme Value (GEV) distribution along with the Peaks over Threshold (POT) approach has been compared with Bayesian extreme value modelling. Furthermore, the correlation of the burned area with the potential extreme values of other key parameters (e.g. wind, temperature, humidity, etc.) has also been investigated.
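A minimal sketch of one possible frequentist-versus-Bayesian comparison, assuming a GEV for annual maximum burned area fitted by MLE and, on the Bayesian side, a plain random-walk Metropolis sampler with flat priors; the data are synthetic stand-ins and this is only an illustration of the idea, not the authors' model.

```python
# Sketch: MLE versus a simple Bayesian posterior for a GEV fitted to annual
# maximum burned area (synthetic data, flat priors, random-walk Metropolis).
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(11)
annual_max_burned = genextreme.rvs(-0.2, loc=5000.0, scale=2000.0,
                                   size=23, random_state=rng)  # ha, 1985-2007

# Frequentist fit
c_mle, loc_mle, scale_mle = genextreme.fit(annual_max_burned)

# Bayesian fit: random-walk Metropolis on (shape, location, log-scale)
def log_post(theta):
    c, loc, log_scale = theta
    ll = genextreme.logpdf(annual_max_burned, c, loc, np.exp(log_scale)).sum()
    return ll if np.isfinite(ll) else -np.inf

theta = np.array([c_mle, loc_mle, np.log(scale_mle)])
step = np.array([0.05, 100.0, 0.05])
samples, lp = [], log_post(theta)
for i in range(20_000):
    prop = theta + step * rng.standard_normal(3)
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        theta, lp = prop, lp_prop
    if i >= 5_000:                      # discard burn-in
        samples.append(theta.copy())
samples = np.array(samples)

print("MLE 50-year return level:",
      genextreme.isf(1 / 50, c_mle, loc_mle, scale_mle))
rl_post = genextreme.isf(1 / 50, samples[:, 0], samples[:, 1],
                         np.exp(samples[:, 2]))
print("Posterior 95% interval for the 50-year return level:",
      np.percentile(rl_post, [2.5, 97.5]))
```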


2018
Vol 150
pp. 05025
Author(s):
Nor Azrita Mohd Amin
Siti Aisyah Zakaria

In environmental issues, the main concern is extreme (catastrophic) phenomena rather than common events. However, most statistical approaches are concerned primarily with the centre of a distribution, or the average value, rather than the tail of the distribution, which contains the extreme observations. Extreme value theory directs attention to the tails of the distribution, where standard models prove unreliable for analysing extreme series. A high level of particulate matter (PM10) is a common environmental problem which causes various impacts on human health and material damage. If the main concern is extreme events, then extreme value analysis provides the best results with significant evidence. The monthly average and monthly maximum PM10 data for Perlis from 2003 to 2014 were analysed. Forecasting for the average data is made with the Holt-Winters method, while the return level determines the predicted value of extreme events that occur on average once in a certain period. The forecasts from January 2015 to December 2016 for the average data show that the highest forecasted value is 58.18 (standard deviation 18.45) in February 2016, while the return level reaches 253.76 units for a 24-month (2015-2016) return period.
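A minimal sketch of the Holt-Winters forecasting step, assuming a synthetic monthly-average PM10 series and an additive trend with a 12-month additive season; these settings are guesses, not the study's exact configuration.

```python
# Sketch: Holt-Winters (exponential smoothing) forecast of monthly-average
# PM10 for 2015-2016, using a synthetic 2003-2014 series.
import numpy as np
import pandas as pd
from statsmodels.tsa.holtwinters import ExponentialSmoothing

rng = np.random.default_rng(5)
months = pd.date_range("2003-01", periods=144, freq="MS")      # 2003-2014
season = 10 * np.sin(2 * np.pi * np.arange(144) / 12)
pm10_avg = pd.Series(45 + season + rng.normal(0, 5, 144), index=months)

model = ExponentialSmoothing(pm10_avg, trend="add",
                             seasonal="add", seasonal_periods=12).fit()
forecast = model.forecast(24)                                   # 2015-2016
print("Highest forecasted monthly average:",
      round(forecast.max(), 2), "in", forecast.idxmax().strftime("%B %Y"))
```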


2018
Vol 11 (2)
Author(s):
Piotr Kokoszka
Hong Miao
Stilian Stoev
Ben Zheng

Abstract Motivated by the risk inherent in intraday investing, we propose several ways of quantifying the extremal behavior of a time series of curves. A curve can be extreme if it has a shape and/or magnitude very different from the bulk of observed curves. Our approach is at the nexus of functional data analysis and extreme value theory. The risk measures we propose allow us to assess probabilities of observing extreme curves not seen in a historical record. These measures complement risk measures based on point-to-point returns, but have a different interpretation and information content. Using our approach, we study how the financial crisis of 2008 impacted the extreme behavior of intraday cumulative return curves. We discover different impacts on shares in important sectors of the US economy. The information our analysis provides is in some cases different from the conclusions based on the extreme value analysis of daily closing price returns.


2021
Author(s):  
Frank Kwasniok

Traditional extreme value analysis based on the generalised extreme value (GEV) or the generalised Pareto distribution (GPD) suffers from two drawbacks: (i) both methods are wasteful of data, as only block maxima or exceedances over a high threshold are used and the bulk of the data is disregarded, resulting in a large uncertainty in the tail inference; (ii) in the peaks-over-threshold approach the choice of the threshold is often difficult in practice, as there are no truly objective underlying criteria.

Here, two approaches based on maximum likelihood estimation are introduced which simultaneously model the whole distribution range and thus constrain the tail inference by information from the bulk data. Firstly, the bulk-matching method models the bulk of the distribution with a flexible exponential-family model and the tail with a GPD. The two distributions are linked together at the threshold with appropriate matching conditions. The threshold can be estimated in an outer loop, also based on the likelihood function. Secondly, in the extended generalised Pareto distribution (EGPD) model for non-negative variables, the whole distribution is modelled with a GPD overlaid with a transition probability density, which is again represented by an exponential family. Appropriate conditions ensure that the model is in accordance with extreme value theory for both the lower and upper tail of the distribution. The methods are successfully exemplified on simulated data as well as wind speed and precipitation data.
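A minimal sketch of the bulk-matching idea, assuming a plain exponential bulk (instead of the flexible exponential-family model), a threshold held fixed (instead of estimated in an outer loop) and a GPD tail linked to the bulk by continuity of the density at the threshold; data, threshold and starting values are synthetic and illustrative.

```python
# Sketch: spliced exponential-bulk / GPD-tail model fitted jointly by maximum
# likelihood, with the GPD scale tied to the bulk via continuity at u.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import expon, genpareto

rng = np.random.default_rng(8)
data = np.concatenate([expon.rvs(scale=2.0, size=1800, random_state=rng),
                       6.0 + genpareto.rvs(0.2, scale=2.0, size=200,
                                           random_state=rng)])
u = 6.0                                   # threshold held fixed in this sketch

def neg_log_lik(params):
    lam, xi = params
    if lam <= 0:
        return np.inf
    beta = 1.0 / lam                      # continuity of the density at u
    below, above = data[data <= u], data[data > u]
    ll_below = expon.logpdf(below, scale=1 / lam).sum()
    # Tail: P(X > u) under the bulk times the GPD density of the excess
    ll_above = (len(above) * expon.logsf(u, scale=1 / lam)
                + genpareto.logpdf(above - u, xi, scale=beta).sum())
    return -(ll_below + ll_above)

res = minimize(neg_log_lik, x0=[1.0, 0.1], method="Nelder-Mead")
lam_hat, xi_hat = res.x
print(f"bulk rate = {lam_hat:.3f}, tail shape = {xi_hat:.3f}, "
      f"tail scale (matched) = {1 / lam_hat:.3f}")
```

For an exponential bulk, the continuity condition f_bulk(u) = (1 - F_bulk(u)) / beta forces beta = 1/lambda, so the tail scale is fully determined by the bulk fit; a richer exponential-family bulk, as in the abstract, would instead constrain rather than fix the tail parameters.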

