Weighted Scoring Rules and Convex Risk Measures

2022 ◽  
Author(s):  
Zachary J. Smith ◽  
J. Eric Bickel

In Weighted Scoring Rules and Convex Risk Measures, Dr. Zachary J. Smith and Prof. J. Eric Bickel (both at the University of Texas at Austin) present a general connection between weighted proper scoring rules and investment decisions involving the minimization of a convex risk measure. Weighted scoring rules are quantitative tools for evaluating the accuracy of probabilistic forecasts relative to a baseline distribution. The authors demonstrate that this connection between convex risk measures and weighted scoring rules is closely related to earlier economic characterizations of weighted scores based on expected-utility maximization. As illustrative examples, they study two families of weighted scoring rules based on phi-divergences (generalizations of the weighted power and weighted pseudospherical scoring rules) along with their corresponding risk measures. The paper will be of particular interest to the decision analysis and mathematical finance communities, as well as to those interested in the elicitation and evaluation of subjective probabilistic forecasts.
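
For orientation (standard background rather than the authors' exact parameterization), a proper scoring rule S admits the Savage representation in terms of a convex expected-score function G, and a simple baseline-relative member of the weighted family is the relative logarithmic score, whose expected value under the forecast p recovers a phi-divergence, namely the Kullback-Leibler divergence from the baseline q:

\[
S(p, i) \;=\; G(p) + \nabla G(p) \cdot (e_i - p), \qquad G \text{ convex},
\]
\[
S_{\log}(p, q, i) \;=\; \log\frac{p_i}{q_i}, \qquad
\mathbb{E}_{i \sim p}\big[S_{\log}(p, q, i)\big] \;=\; \sum_i p_i \log\frac{p_i}{q_i} \;=\; D_{\mathrm{KL}}(p \,\|\, q).
\]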

2015 ◽  
Author(s):  
Γεώργιος Παπαγιάννης

The main aim of the present thesis is to investigate the effect of diverging priors concerning model uncertainty on decision making. One of the main issues in the thesis is to assess the effect of different notions of distance in the space of probability measures and their use as loss functionals when identifying the best-suited model among a set of plausible priors. Another issue is the problem of "inhomogeneous" sets of priors, i.e. sets of priors in which highly divergent opinions may occur, and the need to treat that case robustly. As a high degree of inhomogeneity may lead the decision maker to distrust the priors, it may be desirable to adopt a particular prior corresponding to the set which in some sense minimizes the "variability" among the models in the set. This leads to the notion of the Frechet risk measure. Finally, an important problem is the actual calculation of robust risk measures. On account of their variational definition, this calculation leads to the numerical treatment of problems in the calculus of variations, for which reliable and effective algorithms are proposed. The contributions of the thesis are presented in the following three chapters.

In Chapter 2, a statistical learning scheme is introduced for constructing the best model compatible with a set of priors provided by different information sources of varying reliability. As the various priors may model different aspects of the phenomenon well, the proposed scheme is a variational scheme based on the minimization of a weighted loss function in the space of probability measures, which in certain cases is shown to be equivalent to weighted quantile-averaging schemes. Therefore, in contrast to approaches such as minimax decision theory, in which a particular element of the prior set is chosen, we construct for each prior set a probability measure which is not necessarily an element of it, a fact that, as shown, may lead to a better description of the phenomenon in question. While treating this problem we also address the effect of the choice of distance functional in the space of measures on the problem of model selection. One of the key findings in this respect is that the class of Wasserstein distances appears to perform best compared with other distances such as the KL-divergence.

In Chapter 3, motivated by the results of Chapter 2, we treat the problem of specifying the risk measure for a particular loss when a set of highly divergent priors concerning the distribution of the loss is available. Starting from the principle that "variability" of opinions is not welcome, for which a strong axiomatic framework exists (see e.g. Klibanoff (2005) and references therein), we introduce the concept of Frechet risk measures, which correspond to minimal-variance risk measures. Here we view a set of priors as a discrete measure on the space of probability measures, and by variance we mean the variance of this discrete probability measure; this requires the concept of the Frechet mean. By different metrizations of the space of probability measures we define a variety of Frechet risk measures (the Wasserstein, the Hellinger and the weighted entropic risk measure) and illustrate their use and performance via an example related to the static hedging of derivatives under model uncertainty.

In Chapter 4, we consider the problem of numerical calculation of convex risk measures by applying techniques from the calculus of variations. Regularization schemes are proposed and the theoretical convergence of the algorithms is considered.
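
To make the weighted quantile-averaging idea concrete, the following is a minimal numerical sketch, not the thesis's algorithm: on the real line, the 2-Wasserstein barycenter of several priors has as its quantile function the weighted average of their quantile functions. The model choices, weights, and quantile grid below are illustrative assumptions.

```python
import numpy as np

# Hypothetical priors from three information sources, given as samples,
# with reliability weights summing to one.
rng = np.random.default_rng(0)
priors = [rng.normal(0.0, 1.0, 5000),
          rng.normal(0.5, 2.0, 5000),
          rng.standard_t(df=4, size=5000)]
weights = np.array([0.5, 0.3, 0.2])

# Weighted quantile averaging: the 1-D 2-Wasserstein barycenter's quantile
# function is the weighted average of the priors' empirical quantile functions.
levels = np.linspace(0.001, 0.999, 999)
quantiles = np.array([np.quantile(s, levels) for s in priors])   # shape (3, 999)
barycenter_quantiles = weights @ quantiles

# Example summary from the pooled model: its 95% quantile.
q95 = barycenter_quantiles[np.searchsorted(levels, 0.95)]
print(f"pooled 95% quantile: {q95:.3f}")
```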


2019 ◽  
Vol 22 (03) ◽  
pp. 1950004 ◽  
Author(s):  
YANHONG CHEN ◽  
YIJUN HU

In this paper, we investigate representation results for set-valued law-invariant coherent and convex risk measures, which can be considered a set-valued extension of the multivariate scalar law-invariant coherent and convex risk measures studied in the literature. We further introduce a new class of set-valued risk measures, named set-valued distortion risk measures, which can be considered a set-valued version of the multivariate scalar distortion risk measures introduced in the literature. The relationship between set-valued distortion risk measures and set-valued weighted value at risk is also given.
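
For reference (standard scalar background rather than the set-valued construction itself), the scalar distortion risk measure that such set-valued versions generalize is built from a distortion function g, i.e. a non-decreasing function with g(0) = 0 and g(1) = 1, applied to the tail function of a loss X:

\[
\rho_g(X) \;=\; \int_0^{\infty} g\big(\mathbb{P}(X > x)\big)\,dx \;+\; \int_{-\infty}^{0} \Big[g\big(\mathbb{P}(X > x)\big) - 1\Big]\,dx .
\]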


2020 ◽  
Vol 9 (2) ◽  
pp. 135 ◽  
Author(s):  
Junfeng Jiao ◽  
Shunhua Bai

This paper investigated the travel patterns of 1.7 million shared E-scooter trips from April 2018 to February 2019 in Austin, TX. There were more than 6000 active E-scooters in operation each month, generating over 150,000 trips and covering approximately 117,000 miles. During this period, the average travel distance and operation time of E-scooter trips were 0.77 miles and 7.55 minutes, respectively. We further identified two E-scooter usage hotspots in the city (Downtown Austin and the University of Texas campus). The spatial analysis showed that more trips originated in Downtown Austin than ended there, while the opposite was true for the UT campus. We also investigated the relationship between the number of E-scooter trips and the surrounding environments. The results show that areas with higher population density and more residents with higher education were correlated with more E-scooter trips. A shorter distance to the city center, the presence of transit stations, better street connectivity, and more compact land use were also associated with increased E-scooter usage in Austin, TX. Surprisingly, the proportion of young residents within a neighborhood was negatively correlated with E-scooter usage.


1947 ◽  
Vol 13 (2) ◽  
pp. 97-109 ◽  
Author(s):  
J. Charles Kelley

The importance of the Clear Fork Focus as a pre-pottery archaeological complex of north-central Texas has become generally known to archaeologists through the industry of its discoverer and principal proponent, Dr. Cyrus N. Ray, of Abilene, Texas. Unfortunately, the relationship of this complex to other and comparable archaeological cultures of Texas has been largely neglected, and some regrettable misinformation regarding its chronological position has been widely disseminated. In this paper the cultural affiliations and age of the Clear Fork Focus will be discussed in terms of the evidence presented by its discoverers and from the standpoint of new data derived from large-scale excavations completed by the University of Texas in the terraces of the Colorado River near Austin, Texas. Additional information obtained by the writer through study of some twelve thousand projectile points from central, south, and western Texas, and of their geographic and temporal distribution, is also used.


2016 ◽  
Vol 4 (1) ◽  
Author(s):  
Silvana M. Pesenti ◽  
Pietro Millossovich ◽  
Andreas Tsanakas

One of a risk measure's key purposes is to consistently rank and distinguish between different risk profiles. From a practical perspective, a risk measure should also be robust, that is, insensitive to small perturbations in input assumptions. It is known in the literature [14, 39] that strong assumptions on the risk measure's ability to distinguish between risks may lead to a lack of robustness. We address the trade-off between robustness and consistent risk ranking by specifying the regions in the space of distribution functions where law-invariant convex risk measures are indeed robust. Examples include the set of random variables with bounded second moment and those that are less volatile (in convex order) than random variables in a given uniformly integrable set. Typically, a risk measure is evaluated on the output of an aggregation function defined on a set of random input vectors. Extending the definition of robustness to this setting, we find that law-invariant convex risk measures are robust for any aggregation function that satisfies a linear growth condition in the tail, provided that the set of possible marginals is uniformly integrable. Thus, we obtain that all law-invariant convex risk measures possess the aggregation-robustness property introduced by [26] and further studied by [40]. This is in contrast to the widely used, non-convex risk measure Value-at-Risk, whose robustness in a risk aggregation context requires restricting the possible dependence structures of the input vectors.
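
To see the trade-off numerically, the following is a minimal sketch that is not taken from the paper; the quantile level, sample size, and contamination parameters are arbitrary assumptions. It contrasts Value-at-Risk with Expected Shortfall, a law-invariant convex (indeed coherent) risk measure: a small tail contamination barely moves the 95% VaR but shifts Expected Shortfall noticeably, which is the kind of sensitivity the paper's results delimit by restricting attention to uniformly integrable sets.

```python
import numpy as np

def value_at_risk(losses, alpha=0.95):
    # VaR: the alpha-quantile of the empirical loss distribution.
    return np.quantile(losses, alpha)

def expected_shortfall(losses, alpha=0.95):
    # ES: average loss at or beyond the alpha-quantile
    # (a law-invariant convex risk measure).
    q = np.quantile(losses, alpha)
    return losses[losses >= q].mean()

rng = np.random.default_rng(1)
base = rng.normal(0.0, 1.0, 100_000)              # baseline loss model
contaminated = base.copy()                        # perturb 1% of scenarios
mask = rng.random(base.size) < 0.01
contaminated[mask] = rng.normal(8.0, 0.5, mask.sum())

for name, sample in [("baseline", base), ("perturbed", contaminated)]:
    print(name,
          round(value_at_risk(sample), 3),
          round(expected_shortfall(sample), 3))
```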


2011 ◽  
Vol 14 (01) ◽  
pp. 163-185 ◽  
Author(s):  
MARCO FRITTELLI ◽  
EMANUELA ROSAZZA GIANIN

We discuss two issues concerning risk measures: we first point out an alternative interpretation of the penalty function in the dual representation of a risk measure; we then analyze the continuity properties of comonotone convex risk measures. In particular, due to the loss of convexity, local and global continuity are no longer equivalent, and many implications that hold for convex risk measures no longer apply.
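
For context, the dual representation referred to here is the standard robust representation of a convex risk measure with its minimal penalty function, defined through the acceptance set (notation as in Föllmer and Schied):

\[
\rho(X) \;=\; \sup_{Q \in \mathcal{M}_1} \Big( \mathbb{E}_Q[-X] - \alpha(Q) \Big),
\qquad
\alpha(Q) \;=\; \sup_{X \in \mathcal{A}_\rho} \mathbb{E}_Q[-X],
\]
where \(\mathcal{M}_1\) is a suitable set of probability measures and \(\mathcal{A}_\rho = \{X : \rho(X) \le 0\}\) is the acceptance set.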


2008 ◽  
Vol 11 (01) ◽  
pp. 19-54 ◽  
Author(s):  
SVETLOZAR RACHEV ◽  
SERGIO ORTOBELLI ◽  
STOYAN STOYANOV ◽  
FRANK J. FABOZZI ◽  
ALMIRA BIGLOVA

This paper examines the properties that a risk measure should satisfy in order to characterize an investor's preferences. In particular, we propose some intuitive and realistic examples that describe several desirable features of an ideal risk measure. This analysis is the first step in understanding how to classify an investor's risk. Risk is an asymmetric, relative, heteroskedastic, multidimensional concept that has to take into account the asymptotic behavior of returns, inter-temporal dependence, risk-time aggregation, and the impact of several economic phenomena that could influence an investor's preferences. In order to consider the financial impact of these several aspects of risk, we propose and analyze the relationship between distributional modeling and risk measures. By analogy with the notion of an ideal probability metric for a given approximation problem, we search for an ideal risk measure or ideal performance ratio for a portfolio selection problem. We then emphasize the parallels between risk measures and probability metrics, underlining the computational advantages and disadvantages of the different approaches.


Mathematics ◽  
2018 ◽  
Vol 6 (10) ◽  
pp. 186 ◽  
Author(s):  
Chang Cong ◽  
Peibiao Zhao

Monetary risk measures are interpreted as the smallest amount of external cash that must be added to a financial position to make the position acceptable. In this paper, a new concept, the non-cash risk measure, is proposed; this measure provides an approach to transform unacceptable positions into acceptable positions in a nonconvex set. A non-cash risk measure uses not only cash but also other kinds of assets to adjust the position. This risk measure is nonconvex due to the use of an optimization problem in the L1 norm. A convex extension of the nonconvex risk measure is derived, and the relationship between the convex extension and the non-cash risk measure is detailed.
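
For orientation (standard background rather than the paper's construction), a monetary risk measure is defined from an acceptance set \(\mathcal{A}\) as the smallest amount of cash whose addition makes the position acceptable; the non-cash measure described above instead adjusts the position with other eligible assets, which is what leads to the L1-norm optimization mentioned in the abstract:

\[
\rho(X) \;=\; \inf\{\, m \in \mathbb{R} \;:\; X + m \in \mathcal{A} \,\}.
\]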


Author(s):  
Mark Walters

In this article, I discuss an unusual Caddo bottle in the Walters Collection. This vessel came from either Smith or Wood counties, Texas. The design on the bottle appears to depict a deer body with a human head. My purpose is to look at the vessel in more depth, explore the relationship between Caddo people and deer, and make information about the vessel available to the public. Plans are in place to curate this vessel at the Texas Archeological Research Laboratory at The University of Texas at Austin.

