TEMPERED PARETO-TYPE MODELLING USING WEIBULL DISTRIBUTIONS

2021 ◽  
pp. 1-30
Author(s):  
Hansjörg Albrecher ◽  
José Carlos Araujo-Acuna ◽  
Jan Beirlant

Abstract In various applications of heavy-tail modelling, the assumed Pareto behaviour is ultimately tempered in the range of the largest data. In insurance applications, claim payments are influenced by claim management, and claims may, for instance, be subject to a higher level of inspection at the highest damage levels, leading to weaker tails than apparent from modal claims. Generalizing earlier results of Meerschaert et al. (2012) and Raschke (2020), in this paper we consider tempering of a Pareto-type distribution with a general Weibull distribution in a peaks-over-threshold approach. This requires modulating the tempering parameters as a function of the chosen threshold. Modelling such a tempering effect is important in order to avoid overestimation of risk measures such as the value-at-risk at high quantiles. We use a pseudo maximum likelihood approach to estimate the model parameters and consider the estimation of extreme quantiles. We derive basic asymptotic results for the estimators, give illustrations with simulation experiments, and apply the developed techniques to fire and liability insurance data, providing insight into the relevance of the tempering component in heavy-tail modelling.
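
A minimal sketch of the Weibull-tempered Pareto idea: the survival function below multiplies a Pareto tail by a Weibull tempering factor and inverts it numerically to obtain extreme quantiles. The parameterisation (`alpha`, `lam`, `beta`, threshold `x0`) and the fixed tempering form are illustrative assumptions; the paper modulates the tempering parameters with the chosen threshold and estimates them by pseudo maximum likelihood.

```python
import numpy as np
from scipy.optimize import brentq

def tempered_pareto_sf(x, alpha, lam, beta, x0=1.0):
    """Survival function of a Pareto(alpha) tail tempered by a Weibull factor:
    S(x) = (x/x0)^(-alpha) * exp(-lam * (x^beta - x0^beta)), x >= x0.
    This is one common tempering form; the paper's threshold-dependent
    parameterisation may differ."""
    return (x / x0) ** (-alpha) * np.exp(-lam * (x ** beta - x0 ** beta))

def extreme_quantile(p, alpha, lam, beta, x0=1.0, upper=1e9):
    """Numerically invert the survival function: find x with S(x) = 1 - p."""
    return brentq(lambda x: tempered_pareto_sf(x, alpha, lam, beta, x0) - (1 - p),
                  x0, upper)

# Tempering pulls the 99.9% quantile well below the untempered Pareto value.
q_tempered = extreme_quantile(0.999, alpha=1.5, lam=0.01, beta=1.0)
q_pareto   = extreme_quantile(0.999, alpha=1.5, lam=0.0,  beta=1.0)
print(q_tempered, q_pareto)
```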

2018 ◽  
Vol 23 (2) ◽  
Author(s):  
Jennifer So Kuen Chan ◽  
Kok-Haur Ng ◽  
Thanakorn Nitithumbundit ◽  
Shelton Peiris

Abstract Risk measures such as value-at-risk (VaR) and expected shortfall (ES) may require the calculation of quantile functions from quantile regression models. In a parametric set-up, we propose to regress directly on the quantiles of a distribution and demonstrate the method through the conditional autoregressive range (CARR) model, which has gained popularity in recent years. Two flexible distribution families are adopted for the range data: the generalised beta type two on positive support and the generalised-t on real support (which requires a log transformation). The models are then extended to allow for volatility dynamics and compared in terms of goodness-of-fit. The models are implemented using the module fmincon in Matlab under the classical likelihood approach and applied to the intra-day high-low price ranges of the All Ordinaries index for the Australian stock market. Quantiles and upper-tail conditional expectations, evaluated via VaR and ES respectively, are forecast using the proposed models.
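
As a rough illustration of how VaR and ES follow from a fitted parametric range model, the sketch below fits a stand-in log-normal distribution to simulated range data, reads VaR off the quantile function and computes ES as the tail conditional mean. The paper instead fits generalised beta type two and generalised-t distributions within a CARR specification in Matlab; the distribution choice here is purely illustrative.

```python
import numpy as np
from scipy import stats
from scipy.integrate import quad

# Stand-in data and distribution: simulated positive "ranges" fitted by a log-normal.
ranges = np.random.default_rng(0).lognormal(mean=0.0, sigma=0.5, size=1000)
shape, loc, scale = stats.lognorm.fit(ranges, floc=0)
dist = stats.lognorm(shape, loc=loc, scale=scale)

def var_es(dist, level=0.95):
    """VaR is the quantile; ES is the conditional mean beyond the quantile."""
    var = dist.ppf(level)
    tail_mean, _ = quad(lambda x: x * dist.pdf(x), var, np.inf)
    es = tail_mean / (1.0 - level)
    return var, es

print(var_es(dist, 0.95))
```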


2020 ◽  
Vol 2020 ◽  
pp. 1-20
Author(s):  
Zubair Ahmad ◽  
Eisa Mahmoudi ◽  
Omid Kharazmi

Heavy-tailed distributions play an important role in modeling data in actuarial and financial sciences. In this article, a new method is suggested for defining distributions suitable for modeling data with a heavy right tail; the proposed method may be named the Z-family of distributions. For illustrative purposes, a special submodel of the proposed family, called the Z-Weibull distribution, is considered in detail to model data with a heavy right tail. The method of maximum likelihood is adopted to estimate the model parameters, and a brief Monte Carlo simulation study evaluating the maximum likelihood estimators is carried out. Furthermore, some actuarial measures such as value at risk and tail value at risk are calculated, and a simulation study based on these actuarial measures is also conducted. An application of the Z-Weibull model to earthquake insurance data is presented. Based on the analyses, we observe that the proposed distribution can be used quite effectively in modeling heavy-tailed data in insurance sciences and other related fields. Finally, a Bayesian analysis and an assessment of the performance of Gibbs sampling for the earthquake data are also carried out.
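
The Z-Weibull density itself is defined in the paper; the following sketch only illustrates the simulation-based computation of the two actuarial measures mentioned above, using empirical VaR and TVaR on draws from a stand-in heavy-tailed (Lomax) loss distribution.

```python
import numpy as np

def var_tvar(sample, level=0.99):
    """Empirical value at risk (quantile) and tail value at risk
    (mean of losses at or beyond the VaR level)."""
    var = np.quantile(sample, level)
    tvar = sample[sample >= var].mean()
    return var, tvar

# Stand-in losses: a heavy-tailed Lomax (Pareto II) sample, scaled for illustration.
rng = np.random.default_rng(1)
losses = rng.pareto(a=2.5, size=100_000) * 10.0
print(var_tvar(losses, 0.99))
```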


2021 ◽  
Vol 9 (4) ◽  
pp. 910-941
Author(s):  
Abd-Elmonem A. M. Teamah ◽  
Ahmed A. Elbanna ◽  
Ahmed M. Gemeay

Heavy-tailed distributions play an important role in studying risk data sets, and statisticians frequently search for new, or relatively new, statistical models to fit data sets in different fields. This article introduces a relatively new heavy-tailed statistical model obtained by applying the alpha power transformation to the exponentiated log-logistic distribution, called the alpha power exponentiated log-logistic distribution. Its statistical properties are derived mathematically, including moments, the moment generating function, the quantile function, entropy, inequality curves and order statistics. Five estimation methods are introduced mathematically, and the behaviour of the proposed model's parameter estimates is checked on randomly generated data sets under these estimation methods. Some actuarial measures are also deduced mathematically, such as value at risk, tail value at risk, tail variance and the tail variance premium. Numerical values for these measures are computed and show that the proposed distribution has a heavier tail than the other compared models. Finally, three real data sets from different fields are used to show how well the proposed model fits these data sets compared with many other well-known and related models.
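
A minimal sketch of the construction named above: the alpha power transformation applied to an exponentiated log-logistic baseline CDF. Parameter names, defaults and the exact parameterisation of the baseline are illustrative assumptions and may differ from the paper's.

```python
import numpy as np

def loglogistic_cdf(x, scale=1.0, shape=2.0):
    """Log-logistic baseline CDF."""
    return 1.0 / (1.0 + (x / scale) ** (-shape))

def exp_loglogistic_cdf(x, scale=1.0, shape=2.0, theta=1.5):
    """Exponentiated log-logistic: baseline CDF raised to the power theta."""
    return loglogistic_cdf(x, scale, shape) ** theta

def alpha_power_cdf(base_cdf, alpha):
    """Alpha power transformation of a baseline CDF F:
    G(x) = (alpha**F(x) - 1) / (alpha - 1), for alpha > 0, alpha != 1.
    The resulting G is the illustrative version of the proposed family."""
    def cdf(x):
        F = base_cdf(x)
        return (alpha ** F - 1.0) / (alpha - 1.0)
    return cdf

apell_cdf = alpha_power_cdf(exp_loglogistic_cdf, alpha=3.0)
x = np.linspace(0.1, 50, 5)
print(apell_cdf(x))
```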


2014 ◽  
Vol 31 (8) ◽  
pp. 1720-1731 ◽  
Author(s):  
Chien-Feng Huang ◽  
Tsung-Nan Hsieh ◽  
Bao Rong Chang ◽  
Chih-Hsiang Chang

Purpose – Stock selection has long been identified as a challenging task. This line of research is highly contingent upon reliable stock ranking for successful portfolio construction. The purpose of this paper is to employ methods from computational intelligence (CI) to solve this problem more effectively. Design/methodology/approach – The authors develop a risk-adjusted strategy to improve upon previous stock selection models via two main risk measures – downside risk and variation in returns. Moreover, the authors employ the genetic algorithm for optimization of model parameters and selection of input variables simultaneously. Findings – It is found that the proposed risk-adjusted methodology via maximum drawdown significantly outperforms the benchmark and improves upon the previous model in the performance of stock selection. Research limitations/implications – Future work considers an extensive study of the risk-adjusted model using other risk measures such as Value at Risk, Block Maxima, etc. The authors also intend to use financial data from other countries, if available, in order to assess whether the method is generally applicable and robust across different environments. Practical implications – The authors expect this risk-adjusted model to advance CI research in financial engineering and provide a promising solution to stock selection in practice. Originality/value – The originality of this work is that maximum drawdown is successfully incorporated into the CI-based stock selection model, and the model's effectiveness is validated with strong statistical evidence.
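
The maximum drawdown used as the risk-adjustment term can be computed from a price (or cumulative return) series as the largest relative decline from a running peak. A short sketch with a made-up price series; how the paper combines this term with its ranking fitness function is not reproduced here.

```python
import numpy as np

def max_drawdown(prices):
    """Maximum drawdown: largest peak-to-trough decline as a fraction
    of the running peak of the series."""
    prices = np.asarray(prices, dtype=float)
    running_peak = np.maximum.accumulate(prices)
    drawdowns = (running_peak - prices) / running_peak
    return drawdowns.max()

# Example: a series that rises to 120 and falls back to 90 has MDD = 0.25.
print(max_drawdown([100, 110, 120, 105, 90, 95]))
```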


2019 ◽  
Author(s):  
Joseph John Pyne Simons ◽  
Ilya Farber

Not all transit users have the same preferences when making route decisions. Understanding the factors driving this heterogeneity enables better tailoring of policies, interventions, and messaging. However, existing methods for assessing these factors require extensive data collection. Here we present an alternative approach - an easily administered, single-item measure of overall preference for speed versus comfort. Scores on the self-report item predict decisions in a choice task and account for a proportion of the differences in model parameters between people (n = 298). This single item can easily be included in existing travel surveys, and provides an efficient method both to anticipate the choices of users and to gain more general insight into their preferences.


Author(s):  
Sheri Markose ◽  
Simone Giansante ◽  
Nicolas A. Eterovic ◽  
Mateusz Gatkowski

Abstract We analyse systemic risk in the core global banking system using a new network-based spectral eigen-pair method, which treats network failure as a dynamical-system stability problem. This is compared with market-price-based systemic risk indexes (SRIs), viz. Marginal Expected Shortfall, Delta Conditional Value-at-Risk, and the Conditional Capital Shortfall Measure of Systemic Risk, in a cross-border setting. Unlike market-price-based risk measures, which paradoxically underestimate risk during periods of asset price booms, the eigen-pair method based on bilateral balance sheet data gives early warning of instability in terms of a tipping point analogous to the R number in epidemic models; for this, regulatory capital thresholds are used. Furthermore, network centrality measures identify systemically important and vulnerable banking systems. Market-price-based SRIs are contemporaneous with the crisis and are found to covary with risk measures such as VaR and betas.
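
A minimal sketch of the spectral mechanics, not the paper's exact construction: form a matrix of bilateral exposures scaled by the capital of the exposed bank and read the dominant eigenvalue as the tipping-point indicator, with the associated eigenvectors providing the centrality rankings referred to above. All numbers, the scaling choice and the interpretation of the threshold are illustrative assumptions.

```python
import numpy as np

# Illustrative gross bilateral exposures: entry [i, j] is the amount bank j
# stands to lose if bank i defaults (made-up numbers).
exposures = np.array([[ 0.0, 30.0, 10.0],
                      [20.0,  0.0, 25.0],
                      [15.0,  5.0,  0.0]])
capital = np.array([50.0, 40.0, 30.0])  # equity capital buffers of banks 0..2

# theta[i, j] = exposures[i, j] / capital[j]: column j is scaled by bank j's capital.
theta = exposures / capital[np.newaxis, :]

eigenvalues, eigenvectors = np.linalg.eig(theta)
dominant = np.max(np.abs(eigenvalues))
# A dominant eigenvalue above a regulatory capital-based threshold signals
# instability, playing the role of the R number in epidemic models.
print("dominant eigenvalue (tipping-point indicator):", dominant)
```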


2021 ◽  
Vol 14 (5) ◽  
pp. 201
Author(s):  
Yuan Hu ◽  
W. Brent Lindquist ◽  
Svetlozar T. Rachev

This paper investigates performance attribution measures as a basis for constraining portfolio optimization. We employ optimizations that minimize conditional value-at-risk and investigate two performance attributes, asset allocation (AA) and the selection effect (SE), as constraints on asset weights. The test portfolio consists of stocks from the Dow Jones Industrial Average index. Values for the performance attributes are established relative to two benchmarks, equi-weighted and price-weighted portfolios of the same stocks. Performance of the optimized portfolios is judged using comparisons of cumulative price and the risk measures: maximum drawdown, Sharpe ratio, Sortino–Satchell ratio and Rachev ratio. The results suggest that achieving SE performance thresholds requires larger turnover values than those required for achieving comparable AA thresholds. The results also suggest a positive role in price and risk-measure performance for the imposition of constraints on AA and SE.
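
The CVaR-minimising step can be written as the standard Rockafellar–Uryasev linear programme. The sketch below solves a long-only version on simulated scenario returns and omits the AA and SE constraints that the paper adds relative to its benchmarks; the scenario data and portfolio size are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
n_assets, n_scen, beta = 5, 500, 0.95
returns = rng.normal(0.0005, 0.01, size=(n_scen, n_assets))  # simulated scenario returns

# Decision vector: [w (n_assets), alpha (1), u (n_scen)]
c = np.concatenate([np.zeros(n_assets), [1.0],
                    np.ones(n_scen) / ((1 - beta) * n_scen)])

# u_s >= -r_s . w - alpha   <=>   -r_s . w - alpha - u_s <= 0
A_ub = np.hstack([-returns, -np.ones((n_scen, 1)), -np.eye(n_scen)])
b_ub = np.zeros(n_scen)

# Fully invested: sum of weights equals one.
A_eq = np.concatenate([np.ones(n_assets), [0.0], np.zeros(n_scen)]).reshape(1, -1)
b_eq = np.array([1.0])

bounds = [(0, 1)] * n_assets + [(None, None)] + [(0, None)] * n_scen
res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
weights, cvar = res.x[:n_assets], res.fun
print("weights:", np.round(weights, 3), " minimised CVaR:", cvar)
```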


2009 ◽  
Vol 39 (2) ◽  
pp. 591-613 ◽  
Author(s):  
Andreas Kull

Abstract We revisit the relative retention problem originally introduced by de Finetti, using concepts recently developed in risk theory and quantitative risk management. Instead of using the variance as a risk measure, we consider the expected shortfall (tail value-at-risk), include capital costs, and take constraints on risk capital into account. Starting from a risk-based capital allocation, the paper presents an optimization scheme for sharing risk in a multi-risk-class environment. Risk sharing takes place between two portfolios, and the pricing of risk transfer reflects both portfolio structures. This allows us to shed more light on the question of how optimal risk sharing is characterized in a situation where risk transfer takes place between parties employing similar risk and performance measures. Recent developments in the regulatory domain ('risk-based supervision') pushing for common, insurance-industry-wide risk measures underline the importance of this question. The paper includes a simple non-life insurance example illustrating optimal risk transfer in terms of retentions of common reinsurance structures.
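
A toy illustration of the retention trade-off being optimised: the expected shortfall (tail value-at-risk) of the cedent's retained losses under an excess-of-loss cover with retention M, computed on simulated gross losses. The paper's actual scheme jointly optimises retentions for two portfolios with capital costs and risk-capital constraints, which this sketch ignores; the loss model and retention levels are made up.

```python
import numpy as np

def expected_shortfall(losses, level=0.99):
    """Expected shortfall (tail value-at-risk): mean loss beyond the VaR quantile."""
    var = np.quantile(losses, level)
    return losses[losses >= var].mean()

# Retained loss under an excess-of-loss cover with retention M is min(X, M).
rng = np.random.default_rng(2)
gross = rng.lognormal(mean=1.0, sigma=1.2, size=200_000)
for retention in (np.inf, 50.0, 20.0, 10.0):
    retained = np.minimum(gross, retention)
    es = expected_shortfall(retained, 0.99)
    print(f"retention {retention:>6}: ES_99% of retained losses = {es:.2f}")
```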


2021 ◽  
Vol 14 (11) ◽  
pp. 540
Author(s):  
Eyden Samunderu ◽  
Yvonne T. Murahwa

Developments in the world of finance have led the authors to assess the adequacy of relying on normal distribution assumptions alone in measuring risk. Cushioning against risk has always created a plethora of complexities and challenges; hence, this paper analyses the statistical properties of various risk measures under non-normal distributions and provides a financial blueprint for managing risk. Relying on old assumptions of normality alone is not accurate and has led to the use of models that do not give accurate risk measures. Our empirical design first examined the use of returns in measuring risk and assessed the current financial environment. As an alternative to conventional measures, the paper employs a mosaic of risk techniques, reflecting the fact that there is no single universal risk measure. The next step examined current risk proxy measures, such as the Gaussian-based value at risk (VaR). Furthermore, the authors analysed multiple alternative approaches that do not rely on the normality assumption, such as other variations of VaR, as well as econometric models that can be used in risk measurement and forecasting. VaR is a widely used measure of financial risk, which provides a way of quantifying and managing the risk of a portfolio; arguably, it represents the most important tool for evaluating market risk as one of several threats to the global financial system. Following an extensive literature review, a data set composed of three main asset classes was applied: bonds, equities and hedge funds. The first step was to determine to what extent returns are not normally distributed. After testing the hypothesis, it was found that the majority of returns are not normally distributed and instead exhibit skewness and kurtosis greater or less than three. The study then applied various VaR methods to measure risk in order to determine the most efficient ones. Different timelines were used to compute stressed VaR, and it was observed that the volatility of asset returns was higher during periods of crisis. The subsequent steps examined the relationships between the variables, carried out correlation tests and time series analysis, and led to forecasts of the returns. It was noted that these methods cannot be used in isolation, so we adopted a mosaic of all the VaR measures, studying the behaviour of the assets and their relation to each other, examining the environment as a whole, and then applying forecasting models to value returns accurately; this gave a much more accurate and relevant risk measure than one based on the initial assumption of normality.
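
A sketch of the first two empirical steps described above, on simulated fat-tailed returns rather than the paper's bond, equity and hedge-fund series: test normality via skewness, kurtosis and the Jarque–Bera statistic, then compare a Gaussian (parametric) VaR with a historical-simulation VaR.

```python
import numpy as np
from scipy import stats

# Stand-in data: Student-t returns, a simple way to mimic fat tails.
rng = np.random.default_rng(3)
returns = rng.standard_t(df=4, size=2500) * 0.01

# Step 1: normality diagnostics (skewness, kurtosis, Jarque-Bera test).
jb_stat, jb_pvalue = stats.jarque_bera(returns)
print(f"skewness={stats.skew(returns):.2f}, "
      f"kurtosis={stats.kurtosis(returns, fisher=False):.2f}, "
      f"Jarque-Bera p={jb_pvalue:.4f}")

# Step 2: 99% VaR under a Gaussian assumption vs. historical simulation.
level = 0.99
var_gaussian = -(returns.mean() + returns.std() * stats.norm.ppf(1 - level))
var_historical = -np.quantile(returns, 1 - level)
print(f"99% Gaussian VaR: {var_gaussian:.4f}   99% historical VaR: {var_historical:.4f}")
```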

