THE PROPER USE OF RISK MEASURES IN PORTFOLIO THEORY

2005 ◽  
Vol 08 (08) ◽  
pp. 1107-1133 ◽  
Author(s):  
SERGIO ORTOBELLI ◽  
SVETLOZAR T. RACHEV ◽  
STOYAN STOYANOV ◽  
FRANK J. FABOZZI ◽  
ALMIRA BIGLOVA

This paper discusses and analyzes the properties of risk measures in order to understand how a risk measure should be used to optimize an investor's portfolio choices. In particular, we distinguish between two admissible classes of risk measures proposed in the portfolio literature: safety-risk measures and dispersion measures. We study and describe how risk can depend on other distributional parameters. We then examine and discuss the differences between statistical parametric models and linear fund separation models. Finally, we propose an empirical comparison among three portfolio choice models that depend on the mean, a risk measure, and a skewness parameter. We thus assess the impact of three different risk measures on the investor's preferences, also considering some derivative assets among the possible choices.
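The two classes can be contrasted numerically: a dispersion measure (e.g., the standard deviation) penalizes deviations on both sides of the mean, while a safety-risk measure such as CVaR looks only at the left tail. A minimal sketch, where the heavy-tailed sample and the 95% level are illustrative assumptions, not the paper's data:

```python
import numpy as np

rng = np.random.default_rng(0)
returns = rng.standard_t(df=4, size=10_000) * 0.01  # heavy-tailed daily returns

# Dispersion measure: symmetric, penalizes gains and losses alike.
std_dev = returns.std()

# Safety-risk measure: CVaR / expected shortfall at the 95% level
# averages only the losses beyond the 95% loss quantile.
losses = -returns
var_95 = np.quantile(losses, 0.95)
cvar_95 = losses[losses >= var_95].mean()

print(f"std dev:  {std_dev:.4f}")
print(f"CVaR 95%: {cvar_95:.4f}")
```

On a heavy-tailed sample like this one, the two measures rank portfolios differently precisely because CVaR ignores the right tail.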

2016 ◽  
Vol 19 (05) ◽  
pp. 1650035 ◽  
Author(s):  
FABIO CACCIOLI ◽  
IMRE KONDOR ◽  
MATTEO MARSILI ◽  
SUSANNE STILL

We show that including a term which accounts for finite liquidity in portfolio optimization naturally mitigates the instabilities that arise in the estimation of coherent risk measures on finite samples. This is because taking into account the impact of trading in the market is mathematically equivalent to introducing a regularization on the risk measure. We show here that the impact function determines which regularizer is to be used. We also show that any regularizer based on the norm ℓp with p > 1 makes the sensitivity of coherent risk measures to estimation error disappear, while regularizers with p < 1 do not. The ℓ1 norm represents a border case: its "soft" implementation does not remove the instability, but rather shifts its locus, whereas its "hard" implementation (including hard limits or a ban on short selling) eliminates it. We demonstrate these effects on the important special case of expected shortfall (ES), which has recently become the global regulatory market risk measure.
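The regularization mechanism can be illustrated on a toy in-sample ES problem: an ℓp penalty with p = 2 (one of the stabilizing cases the abstract describes) is added to the sample ES of a two-asset portfolio. The scenario count, penalty weight, and weight grid are illustrative assumptions, not the paper's setup:

```python
import numpy as np

def expected_shortfall(pnl, alpha=0.975):
    """Sample ES: mean loss beyond the alpha-quantile of the losses."""
    losses = -np.asarray(pnl)
    var = np.quantile(losses, alpha)
    return losses[losses >= var].mean()

def regularized_es(weights, scenarios, alpha=0.975, lam=0.1, p=2):
    """In-sample ES plus an l_p penalty on the weights (the 'market impact' term)."""
    pnl = scenarios @ np.asarray(weights)
    return expected_shortfall(pnl, alpha) + lam * np.sum(np.abs(weights) ** p)

rng = np.random.default_rng(1)
scenarios = rng.normal(0.0, 0.01, size=(250, 2))  # 250 return scenarios, 2 assets

# Sweep the weight split along the budget line w1 + w2 = 1.
w1 = np.linspace(-2, 3, 501)
unreg = [regularized_es((a, 1 - a), scenarios, lam=0.0) for a in w1]
reg = [regularized_es((a, 1 - a), scenarios, lam=0.1) for a in w1]
print("unregularized argmin w1:", w1[np.argmin(unreg)])
print("regularized   argmin w1:", w1[np.argmin(reg)])
```

The penalty raises the objective everywhere but pins the minimizer near diversified weights, which is the stabilizing effect the paper attributes to market impact.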


2008 ◽  
Vol 11 (01) ◽  
pp. 19-54 ◽  
Author(s):  
SVETLOZAR RACHEV ◽  
SERGIO ORTOBELLI ◽  
STOYAN STOYANOV ◽  
FRANK J. FABOZZI ◽  
ALMIRA BIGLOVA

This paper examines the properties that a risk measure should satisfy in order to characterize an investor's preferences. In particular, we propose some intuitive and realistic examples that describe several desirable features of an ideal risk measure. This analysis is the first step in understanding how to classify an investor's risk. Risk is an asymmetric, relative, heteroskedastic, multidimensional concept that has to take into account the asymptotic behavior of returns, inter-temporal dependence, risk-time aggregation, and the impact of several economic phenomena that could influence an investor's preferences. In order to consider the financial impact of these several aspects of risk, we propose and analyze the relationship between distributional modeling and risk measures. Similar to the notion of an ideal probability metric for a given approximation problem, we search for an ideal risk measure or ideal performance ratio for a portfolio selection problem. We then emphasize the parallels between risk measures and probability metrics, underlining the computational advantages and disadvantages of the different approaches.


2003 ◽  
Vol 33 (2) ◽  
pp. 173-191 ◽  
Author(s):  
Marc J. Goovaerts ◽  
Rob Kaas ◽  
Jan Dhaene ◽  
Qihe Tang

The paper derives many existing risk measures and premium principles by minimizing a Markov bound for the tail probability. Our approach involves two exogenous functions v(S) and φ(S, π) and another exogenous parameter α ≤ 1. Minimizing a general Markov bound leads to the following unifying equation: E[φ(S, π)] = αE[v(S)]. For any random variable, the risk measure π is the solution to the unifying equation. By varying the functions φ and v, the paper derives the mean value principle, the zero-utility premium principle, the Swiss premium principle, Tail VaR, Yaari's dual theory of risk, the mixture of Esscher principles, and more. The paper also discusses combining two risks with super-additive properties and sub-additive properties. In addition, we recall some of the important characterization theorems of these risk measures.
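The Markov bound behind the unifying equation can be checked numerically: for a nonnegative, nondecreasing function v, P(S ≥ π) ≤ E[v(S)]/v(π). A quick simulation, in which the exponential claim distribution and the particular choice of v are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(2)
S = rng.exponential(scale=1.0, size=200_000)  # aggregate claims, S >= 0

pi = 3.0                        # threshold (a candidate premium)
v = lambda s: np.exp(0.5 * s)   # any nonnegative, nondecreasing v works

tail_prob = (S >= pi).mean()            # empirical P(S >= pi)
markov_bound = v(S).mean() / v(pi)      # empirical E[v(S)] / v(pi)
print(f"P(S >= pi) = {tail_prob:.4f} <= bound = {markov_bound:.4f}")
```

Minimizing such bounds over families of functions φ and v is what yields the different premium principles listed in the abstract as solutions of the single equation.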


2018 ◽  
Vol 2018 ◽  
pp. 1-7
Author(s):  
Xia Zhao ◽  
Hongyan Ji ◽  
Yu Shi

This paper introduces the spectral risk measure (SRM) into the optimization problem of insurance investment. A spectral risk measure can describe the degree of risk aversion, so the resulting strategy takes the investor's risk attitude into account. We establish an optimization model that maximizes the risk-adjusted return on capital (RAROC) under a spectral risk measure. The theoretical result is derived, and an empirical study is presented comparatively under different risk measures and different confidence levels. The results show that risk attitude has a significant impact on investment strategy. As the risk aversion factor increases, the investment ratio in the risky asset correspondingly decreases. When the aversion level increases beyond a certain point, the impact on investment strategy disappears because of the marginal effect of risk aversion. In the cases of VaR and CVaR, which do not account for risk aversion, the investment ratio in the risky asset increases significantly.
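A spectral risk measure is a weighted average of loss quantiles, with the weighting function (the "spectrum") encoding risk aversion. The exponential spectrum below is a common choice in the SRM literature, used here purely as an assumption; the paper's specific spectrum and data are not reproduced:

```python
import numpy as np

def spectral_risk(losses, k=10.0):
    """Spectral risk measure of a loss sample under an exponential
    risk-aversion spectrum; larger k puts more weight on the worst losses."""
    L = np.sort(np.asarray(losses))       # ascending: worst losses last
    n = L.size
    p = (np.arange(n) + 0.5) / n          # midpoint quantile levels
    phi = k * np.exp(-k * (1.0 - p)) / (1.0 - np.exp(-k))
    w = phi / phi.sum()                   # normalized discrete weights
    return float(w @ L)

rng = np.random.default_rng(3)
losses = rng.normal(0.0, 1.0, size=50_000)

for k in (1.0, 5.0, 25.0):
    print(f"k = {k:5.1f}  SRM = {spectral_risk(losses, k):.3f}")
```

Because the weights shift toward the worst outcomes as k grows, the measured risk is increasing in the aversion parameter, which mirrors the abstract's finding that higher aversion reduces the risky-asset ratio.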


Author(s):  
Daryl Bandstra ◽  
Corey Gorrill

The risk of pipeline failure is a measure of the state of knowledge of the pipeline; improved knowledge of the pipeline reduces the uncertainty and therefore can reduce the associated risk. Specifically for corrosion defects, the knowledge of the number and size of defects is often obtained using in-line inspection tools which have uncertainty associated with their measurement capabilities. Quantitative Risk Assessment (QRA) is a methodology that objectively assesses a range of pipeline integrity threats including the threat of corrosion failure. QRA can incorporate the impact of significant sources of analysis uncertainty, such as feature sizing in risk estimates. This paper discusses an application of QRA used to evaluate the operating risk of high pressure transmission pipeline segments in the TransGas system. Specific examples are described in which the inspection tool sizing uncertainty was shown to exert a significant influence on the calculated risk levels. In carrying out the analysis, the failure probability models selected were dependent on the nature of the integrity threat and the type of information available for each pipeline. For the assessment of corrosion integrity, the results of in-line inspections were used directly in determining failure likelihood. For the other threats including equipment impact, geotechnical hazards, manufacturing cracks and stress corrosion cracking, the probability of failure was estimated from historical failure rates with adjustments to reflect line-specific conditions. Failure consequences were estimated using models that quantify the safety implications of loss of containment events. Using these models, safety risk measures were calculated along the length of each pipeline. The results of the analysis show the benefit of the use of inspection technologies with improved sizing accuracy, in terms of reduction in expected operating risk.
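The per-segment roll-up that such a QRA performs can be sketched as probability of failure times consequence of failure, accumulated along the line. All figures below are illustrative assumptions, not TransGas data:

```python
# Toy sketch of a per-segment safety-risk roll-up in a pipeline QRA.
segments = [
    # (length_km, failures_per_km_yr, expected_fatalities_per_failure)
    (12.0, 2.0e-4, 0.05),
    ( 8.0, 5.0e-4, 0.20),   # e.g. a corrosion-prone segment near a town
    (25.0, 1.0e-4, 0.01),
]

total_risk = 0.0
for length, pof_per_km, consequence in segments:
    segment_risk = length * pof_per_km * consequence   # fatalities / yr
    total_risk += segment_risk
    print(f"{length:5.1f} km  risk = {segment_risk:.2e} fatalities/yr")

print(f"total societal risk: {total_risk:.2e} fatalities/yr")
```

In this framing, better sizing accuracy from an inspection tool tightens the failure-probability inputs, which is how it shows up as reduced expected operating risk.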


2008 ◽  
Vol 14 (4) ◽  
pp. 514-521 ◽  
Author(s):  
F Martinelli Boneschi ◽  
B Colombo ◽  
P Annovazzi ◽  
V Martinelli ◽  
L Bernasconi ◽  
...  

The aim of the present study is to assess the actual and lifetime frequency of neuropathic pain (trigeminal neuralgia, Lhermitte's sign, dysesthesic pain), somatic pain (painful muscle spasms and low back pain), and headache (tension headache and migraine) in a cross-sectional sample of 428 consecutive multiple sclerosis (MS) outpatients followed up at an Italian University MS center over a 3-month period. The impact of demographic and disease-related variables on pain and headache risk is also studied. A semi-structured questionnaire was administered during a face-to-face interview with MS patients, and a multivariate logistic regression model was applied to obtain crude and adjusted risk measures. The mean age of the sample was 38.4 years, and the female/male ratio was 1.65. The mean disease duration was 9.6 years and the median Expanded Disability Status Scale was 2.0, with most of the patients (74.8%) being affected by the relapsing–remitting form. Lifetime prevalence at the date of examination of at least one type of neuropathic or somatic pain was 39.8% in MS patients, rising to 58.5% when headache was included, while the actual prevalence was 23.8% and 39.9%, respectively. After multivariate analysis, a progressive course of disease was shown to increase the risk of dysesthesic pain and painful muscle spasms, while greater disability was responsible for a higher risk of back pain. Lhermitte's sign was more frequent in younger patients, while females had a higher risk of headache. Pain and headache in MS are not negligible symptoms, and a neurological examination should not miss the assessment of risk factors for specific types of pain, allowing a more specific and individualized treatment.


2007 ◽  
Vol 10 (07) ◽  
pp. 1137-1157 ◽  
Author(s):  
NICOLE BRANGER ◽  
CHRISTIAN SCHLAG

This paper deals with the problem of determining the correct risk measure for options in a Black–Scholes (BS) framework when time is discrete. For the purposes of hedging or testing simple asset pricing relationships, previous papers used the "local", i.e., the continuous-time, BS beta as the measure of option risk even over discrete time intervals. We derive a closed-form solution for option betas over discrete return periods, where we distinguish between "covariance betas" and "asset pricing betas". Both types of betas involve only simple Black–Scholes option prices and are thus easy to compute. However, the theoretical properties of these discrete betas are fundamentally different from those of local betas. We also analyze the impact of the return interval on two performance measures, the Sharpe ratio and the Treynor measure. The dependence of both measures on the return interval is economically significant, especially for OTM options.
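For reference, the "local" (continuous-time) BS beta that the paper contrasts with its discrete-period betas is the option's elasticity times the beta of the underlying. A minimal sketch for an at-the-money call, where the parameter values and the unit stock beta are assumptions; the paper's discrete-period betas are not reproduced here:

```python
import math

def bs_call(S, K, T, r, sigma):
    """Black-Scholes price and delta of a European call."""
    d1 = (math.log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    N = lambda x: 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))  # standard normal CDF
    price = S * N(d1) - K * math.exp(-r * T) * N(d2)
    return price, N(d1)

S, K, T, r, sigma = 100.0, 100.0, 0.25, 0.02, 0.2
beta_stock = 1.0                      # assumed beta of the underlying

price, delta = bs_call(S, K, T, r, sigma)
elasticity = delta * S / price        # the option's "omega"
local_beta = elasticity * beta_stock  # continuous-time ("local") option beta
print(f"call price {price:.2f}, delta {delta:.3f}, local beta {local_beta:.2f}")
```

The large elasticity of even an ATM call shows why using the local beta over discrete intervals, as the paper argues, can materially misstate option risk.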


1997 ◽  
Vol 161 ◽  
pp. 197-201 ◽  
Author(s):  
Duncan Steel

Whilst lithopanspermia depends upon massive impacts occurring at a speed above some limit, the intact delivery of organic chemicals or other volatiles to a planet requires the impact speed to be below some other limit such that a significant fraction of that material escapes destruction. Thus the two opposite ends of the impact speed distributions are the regions of interest in the bioastronomical context, whereas much modelling work on impacts delivers, or makes use of, only the mean speed. Here the probability distributions of impact speeds upon Mars are calculated for (i) the orbital distribution of known asteroids; and (ii) the expected distribution of near-parabolic cometary orbits. It is found that cometary impacts are far more likely to eject rocks from Mars (over 99 percent of the cometary impacts are at speeds above 20 km/sec, but at most 5 percent of the asteroidal impacts); paradoxically, the objects impacting at speeds low enough to make organic/volatile survival possible (the asteroids) are those which are depleted in such species.


Author(s):  
Julie L. Wambaugh ◽  
Lydia Kallhoff ◽  
Christina Nessler

Purpose: This study was designed to examine the association of dosage and effects of Sound Production Treatment (SPT) for acquired apraxia of speech. Method: Treatment logs and probe data from 20 speakers with apraxia of speech and aphasia were submitted to a retrospective analysis. The number of treatment sessions and teaching episodes was examined relative to (a) change in articulation accuracy above baseline performance, (b) mastery of production, and (c) maintenance. The impact of practice schedule (SPT-Blocked vs. SPT-Random) was also examined. Results: The average number of treatment sessions conducted prior to change was 5.4 for SPT-Blocked and 3.9 for SPT-Random. The mean number of teaching episodes preceding change was 334 for SPT-Blocked and 179 for SPT-Random. Mastery occurred within an average of 13.7 sessions (1,252 teaching episodes) and 12.4 sessions (1,082 teaching episodes) for SPT-Blocked and SPT-Random, respectively. Comparisons of dosage metric values across practice schedules did not reveal substantial differences. Significant negative correlations were found between follow-up probe performance and the dosage metrics. Conclusions: Only a few treatment sessions were needed to achieve initial positive changes in articulation, with mastery occurring within 12–14 sessions for the majority of participants. Earlier occurrence of change or mastery was associated with better follow-up performance. Supplemental Material: https://doi.org/10.23641/asha.12592190

