A refined measure of conditional maximum drawdown

2021
Author(s):
Damiano Rossello
Silvestro Lo Cascio

Risks associated with maximum drawdown have recently been formalized as the tail mean of the maximum drawdown distribution, called Conditional Expected Drawdown (CED). Indeed, the special case of average maximum drawdown is widely used in the fund management industry, also in connection with performance management, but it lacks relevant information on worst-case scenarios over a fixed horizon. By formulating a refined version of CED, we add this piece of information to the risk measurement of drawdown, obtaining a risk measure for processes that preserves all the good properties of CED while supporting more prudential regulatory and management assessments, also in terms of the marginal risk contribution attributed to factors. As a special application, we consider the conditioning information given by the all-time minimum of cumulative returns.
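To make the baseline construction concrete: CED at level α is the expected shortfall applied to the maximum-drawdown distribution. Below is a minimal Monte Carlo sketch of that baseline quantity; the function names, confidence level, and simulated-path parameters are illustrative assumptions, not the authors' refined measure.

```python
import numpy as np

def max_drawdown(cum_returns):
    """Maximum drawdown of one cumulative-return path."""
    running_max = np.maximum.accumulate(cum_returns)
    return np.max(running_max - cum_returns)

def conditional_expected_drawdown(paths, alpha=0.95):
    """Baseline CED: tail mean of the maximum-drawdown distribution.

    `paths` is an (n_paths, horizon) array of cumulative returns.
    """
    mdd = np.array([max_drawdown(p) for p in paths])
    threshold = np.quantile(mdd, alpha)   # drawdown-at-risk at level alpha
    return mdd[mdd >= threshold].mean()   # mean of the worst (1 - alpha) tail

# Illustrative use on simulated Gaussian return paths (hypothetical parameters).
rng = np.random.default_rng(0)
paths = np.cumsum(rng.normal(0.0005, 0.01, size=(10_000, 252)), axis=1)
print(conditional_expected_drawdown(paths, alpha=0.95))
```

The paper's refinement then adds conditioning information, such as the all-time minimum of cumulative returns, on top of this baseline.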

2015
Vol 18 (01)
pp. 1550003
Author(s):  
Mohammad Reza Tavakoli Baghdadabad

Owing to the numerous studies of asymmetric portfolio returns, asymmetric risk measures have been widely used in risk management, with extensive use of the methodology of the n-degree lower partial moment (LPM). Unlike the earlier studies, we use the risk measure of n-degree maximum drawdown, a special case of the n-degree LPM, to investigate how n-degree maximum drawdown risk reduces the risk tolerances generated by management styles of US equity-based mutual funds. We find that skewness does not impose any significant problems on the n-degree maximum drawdown model, and that the tolerance effect of maximum drawdown risk in the n-degree M-DRM models is a decrease in fund returns. The n-degree CM-DRM optimization model decreases investors' risk more than the two conventional models, so the M-DRM is well suited to risk-averse investors. Regarding the efficient set of mean-variance choices from the investment opportunity set, as described by Markowitz, the n-degree CM-DRM algorithm generates this set with lower risk than the other algorithms: the mean-variance opportunity set it produces carries lower risk for a given return than the covariance and CLPM approaches.
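For reference, the n-degree LPM and a drawdown analogue can be estimated from a sample along the following lines. This is a hedged sketch of the generic definitions; the paper's exact M-DRM construction may differ in its target and normalization.

```python
import numpy as np

def lower_partial_moment(returns, target=0.0, degree=2):
    """n-degree LPM: E[max(target - r, 0)^degree], penalizing only downside."""
    shortfall = np.maximum(target - np.asarray(returns), 0.0)
    return np.mean(shortfall ** degree)

def n_degree_max_drawdown(cum_returns, degree=2):
    """Degree-n moment of the drawdown series, a drawdown analogue of LPM."""
    running_max = np.maximum.accumulate(np.asarray(cum_returns))
    drawdowns = running_max - cum_returns
    return np.mean(drawdowns ** degree)
```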


2020
Vol 34 (09)
pp. 13700-13703
Author(s):  
Nikhil Vyas
Ryan Williams

All known SAT-solving paradigms (backtracking, local search, and the polynomial method) only yield a 2^{n(1−1/O(k))}-time algorithm for solving k-SAT in the worst case, where the big-O constant is independent of k. For this reason, it has been hypothesized that k-SAT cannot be solved in worst-case 2^{n(1−f(k)/k)} time, for any unbounded f : ℕ → ℕ. This hypothesis has been called the "Super-Strong Exponential Time Hypothesis" (Super Strong ETH), modeled after the ETH and the Strong ETH. We prove two results concerning the Super Strong ETH:

1. It has also been hypothesized that k-SAT is hard to solve for randomly chosen instances near the "critical threshold", where the clause-to-variable ratio is 2^k ln 2 − Θ(1). We give a randomized algorithm which refutes the Super Strong ETH for the case of random k-SAT and planted k-SAT for any clause-to-variable ratio. In particular, given any random k-SAT instance F with n variables and m clauses, our algorithm decides satisfiability for F in 2^{n(1−Ω(log k)/k)} time, with high probability (over the choice of the formula and the randomness of the algorithm). It turns out that a well-known algorithm from the literature on SAT algorithms does the job: the PPZ algorithm of Paturi, Pudlák, and Zane (1998).

2. The Unique k-SAT problem is the special case where there is at most one satisfying assignment. It is natural to hypothesize that the worst-case (exponential-time) complexity of Unique k-SAT is substantially less than that of k-SAT. Improving prior reductions, we show that the time complexities of Unique k-SAT and k-SAT are very tightly related: if Unique k-SAT is solvable in 2^{n(1−f(k)/k)} time for an unbounded f, then k-SAT is solvable in 2^{n(1−f(k)(1−ε)/k)} time for every ε > 0. Thus, refuting the Super Strong ETH in the unique-solution case would refute the Super Strong ETH in general.
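The PPZ algorithm invoked in the first result is remarkably simple: process the variables in a uniformly random order, satisfy any clause that has become unit on the current variable, and otherwise guess. A sketch follows; the round count is an illustrative choice, and the 2^{n(1−Ω(log k)/k)} bound comes from the analysis on random instances, not from the code itself.

```python
import random

def ppz_round(clauses, n):
    """One PPZ round. `clauses` is a list of lists of nonzero ints:
    literal v means variable |v| with polarity sign(v).
    Returns a satisfying assignment (dict) or None."""
    assign = {}
    for var in random.sample(range(1, n + 1), n):  # random variable order
        value = random.random() < 0.5              # default: random guess
        for clause in clauses:
            # `var` is forced if some clause contains it and every
            # other literal in that clause is already falsified.
            others_false = all(
                abs(lit) in assign and assign[abs(lit)] != (lit > 0)
                for lit in clause if abs(lit) != var
            )
            lits_on_var = [lit for lit in clause if abs(lit) == var]
            if lits_on_var and others_false:
                value = lits_on_var[0] > 0         # satisfy the unit clause
                break
        assign[var] = value
    if all(any(assign[abs(l)] == (l > 0) for l in c) for c in clauses):
        return assign
    return None

def ppz_solve(clauses, n, rounds=100_000):
    """Repeat independent rounds; classically each round succeeds with
    probability roughly at least 2^{-n(1-1/k)} on satisfiable k-SAT."""
    for _ in range(rounds):
        result = ppz_round(clauses, n)
        if result is not None:
            return result
    return None
```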


2014
Vol 2014
pp. 1-13
Author(s):  
Aifan Ling
Le Tang

Recently, active portfolio management problems have received close attention from many researchers due to the explosion of the fund industry. In this paper we present a numerical study of a robust active portfolio selection model with downside risk and multiple weights constraints. We compare the numerical performance of its solutions with the classical mean-variance tracking error model and the naive 1/N portfolio strategy, using real market data from the Chinese market and other markets. The numerical results show that the tested active models are more attractive and robust than the compared models.
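For orientation, the two benchmarks are easy to state: the classical tracking-error model minimizes active variance against a benchmark subject to a target active return, and the naive 1/N strategy invests equally. A hedged numerical sketch, where the parameter names, data values, and target level are assumptions for illustration:

```python
import numpy as np
from scipy.optimize import minimize

def tracking_error_portfolio(mu, sigma, benchmark, target_active=0.005):
    """Classical mean-variance tracking-error model: minimize active
    variance subject to a target active return and long-only weights."""
    n = len(mu)
    def active_var(w):
        a = w - benchmark
        return a @ sigma @ a
    cons = [
        {"type": "eq", "fun": lambda w: w.sum() - 1.0},
        {"type": "ineq", "fun": lambda w: (w - benchmark) @ mu - target_active},
    ]
    res = minimize(active_var, benchmark, constraints=cons,
                   bounds=[(0.0, 1.0)] * n)
    return res.x

def one_over_n(n):
    """The naive 1/N strategy: equal weight in every asset."""
    return np.full(n, 1.0 / n)

# Example (hypothetical data): 4 assets, benchmark = 1/N.
mu = np.array([0.04, 0.05, 0.06, 0.07])
sigma = np.diag([0.02, 0.03, 0.04, 0.05])
w = tracking_error_portfolio(mu, sigma, one_over_n(4))
print(w)
```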


2011
Vol 21 (01)
pp. 87-100
Author(s):  
Greg Aloupis
Prosenjit Bose
Erik D. Demaine
Stefan Langerman
Henk Meijer
...  

Given a planar polygon (or chain) with a list of edges {e_1, e_2, e_3, …, e_{n−1}, e_n}, we examine the effect of several operations that permute this edge list, resulting in the formation of a new polygon. The main operations that we consider are: reversals, which invert the order of a sublist; transpositions, which interchange subchains (sublists); and edge-swaps, a special case which interchanges two consecutive edges. When each edge of the given polygon has also been assigned a direction, we say that the polygon is signed; in this case any edge involved in a reversal changes direction. We show that a star-shaped polygon can be convexified using O(n^2) edge-swaps while maintaining simplicity, and that this is tight in the worst case. We show that determining whether a signed polygon P can be transformed to one that has rotational or mirror symmetry with P, using transpositions, takes Θ(n log n) time. We prove that the problem of deciding whether transpositions can modify a polygon to fit inside a rectangle is weakly NP-complete. Finally, we give an O(n log n) time algorithm to compute the maximum endpoint distance for an oriented chain.
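A useful way to picture these operations is to represent the polygon by its list of edge vectors: the vectors sum to zero, so any permutation of the list closes up into a polygon again. A small sketch of the three operations in that representation (an illustrative encoding, not the authors' implementation):

```python
import numpy as np

def edge_swap(edges, i):
    """Interchange the consecutive edges e_i and e_{i+1}."""
    out = list(edges)
    out[i], out[i + 1] = out[i + 1], out[i]
    return out

def transposition(edges, i, j, k):
    """Interchange the adjacent subchains edges[i:j] and edges[j:k]."""
    return edges[:i] + edges[j:k] + edges[i:j] + edges[k:]

def reversal(edges, i, j, signed=False):
    """Invert the order of the sublist edges[i:j]; in a signed polygon,
    every edge involved in the reversal also flips direction."""
    middle = edges[i:j][::-1]
    if signed:
        middle = [(-dx, -dy) for (dx, dy) in middle]
    return edges[:i] + middle + edges[j:]

def vertices(edges, start=(0.0, 0.0)):
    """Rebuild the vertex chain: since the edge vectors sum to zero for
    a polygon, any permutation of them still closes."""
    pts = [np.array(start)]
    for e in edges:
        pts.append(pts[-1] + np.array(e))
    return pts
```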


2017
Vol 59
pp. 59-101
Author(s):  
Tim Roughgarden
Vasilis Syrgkanis
Eva Tardos

This survey outlines a general and modular theory for proving approximation guarantees for equilibria of auctions in complex settings. This theory complements traditional economic techniques, which generally focus on exact and optimal solutions and are accordingly limited to relatively stylized settings. We highlight three user-friendly analytical tools: smoothness-type inequalities, which immediately yield approximation guarantees for many auction formats of interest in the special case of complete information and deterministic strategies; extension theorems, which extend such guarantees to randomized strategies, no-regret learning outcomes, and incomplete-information settings; and composition theorems, which extend such guarantees from simpler to more complex auctions. Combining these tools yields tight worst-case approximation guarantees for the equilibria of many widely used auction formats.
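For orientation, the core smoothness argument can be stated in a few lines. This is a sketch following the survey's framework; the exact quantification over valuations and strategies varies by setting.

```latex
% A mechanism is $(\lambda,\mu)$-smooth if for all valuations $v$ and all
% strategy profiles $s$ there exist deviations $s_i^*(v)$ such that
\sum_i u_i\bigl(s_i^*(v),\, s_{-i}\bigr) \;\ge\; \lambda\,\mathrm{OPT}(v) \;-\; \mu \sum_i p_i(s).
% Summing each player's equilibrium condition $u_i(s) \ge u_i(s_i^*(v), s_{-i})$
% over players and rearranging yields the approximation guarantee
\mathrm{Welfare}(s) \;\ge\; \frac{\lambda}{\max\{1,\mu\}}\,\mathrm{OPT}(v).
```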


2021
Vol 2021
pp. 1-13
Author(s):  
Ziting Pei
Xuhui Wang
Xingye Yue

G-expected shortfall (G-ES), a new type of worst-case expected shortfall (ES), is defined to measure risk under the infinitely many distributions induced by volatility uncertainty. Compared with extant notions of worst-case ES, the G-ES can be computed using an explicit formula with low computational cost. We also conduct backtests for the G-ES. The empirical analysis demonstrates that the G-ES is a reliable risk measure.
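The G-ES itself is defined under G-expectation, but the worst-case idea is easy to illustrate: for a Gaussian family whose volatility is only known to lie in an interval, ES is increasing in the volatility, so the supremum admits a closed form. A toy sketch of that principle only, not the authors' G-ES formula:

```python
from scipy.stats import norm

def worst_case_es_normal(sigma_low, sigma_high, alpha=0.975, mu=0.0):
    """Worst-case ES over normal losses N(mu, sigma^2) with sigma known
    only to lie in [sigma_low, sigma_high].

    For a normal loss, ES_alpha = mu + sigma * phi(z_alpha) / (1 - alpha),
    which increases in sigma, so the supremum sits at sigma_high.
    """
    z = norm.ppf(alpha)
    return mu + sigma_high * norm.pdf(z) / (1.0 - alpha)

print(worst_case_es_normal(0.1, 0.3, alpha=0.975))
```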


2012
Vol 21 (3)
pp. 217-224
Author(s):  
Zoltán Bokor

Transport companies face the management problem of enhancing operational efficiency with limited resources. The decision-making procedures applicable to such problems can be made more reliable if relevant information on the basic components of business or technology processes is available. This information base can be produced using cost and performance management methods that combine financial and technological system parameters. The paper summarises research results on developing cost and performance controlling tools with this approach for different transport companies. After explaining the main modelling principles, the experiences of empirical pilot projects are discussed. The preliminary results of these projects have proved the significance of the elaborated methodology. At the same time, the modelling tool must be adapted to the specific circumstances of the examined transport companies before practical implementation.

KEY WORDS: cost calculation, performance management, controlling


2014
Vol 2014
pp. 1-9
Author(s):  
Le Tang
Aifan Ling

Under an uncertain probability distribution, we establish the worst-case CVaR (WCCVaR) risk measure and discuss a robust portfolio selection problem with a WCCVaR constraint. An explicit solution, rather than a numerical one, is found, and two-fund separation is proved. We compare the efficient frontier with that of the mean-variance model, and finally give a numerical comparison with the VaR model and the equally weighted strategy. The numerical findings indicate that the proposed WCCVaR model has relatively smaller risk, greater return, and higher accumulated wealth than the VaR model and the equally weighted strategy.
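The paper derives an explicit solution; purely for intuition, WCCVaR over a finite ambiguity set can be evaluated numerically as the largest CVaR of the portfolio loss across candidate scenario distributions. A hedged sketch with an assumed data layout:

```python
import numpy as np

def cvar(losses, alpha=0.95):
    """Sample CVaR (expected shortfall) at level alpha."""
    losses = np.sort(np.asarray(losses))
    k = int(np.ceil(alpha * len(losses)))
    var = losses[k - 1]                 # sample VaR at level alpha
    return losses[losses >= var].mean() # mean of the tail beyond VaR

def worst_case_cvar(weights, scenario_sets, alpha=0.95):
    """WCCVaR over a finite ambiguity set: each element of
    `scenario_sets` is a (T, n) matrix of asset returns drawn from one
    candidate distribution."""
    return max(
        cvar(-(scenarios @ weights), alpha)   # losses = negative returns
        for scenarios in scenario_sets
    )
```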


1988
Vol 45 (1)
pp. 47-59
Author(s):  
Donald Ramos

The study of slave mortality and morbidity in Brazil has been very difficult because of the extreme paucity of sources. Techniques which have been useful in studying the lives of free men and women are seldom useful for analyzing their slaves. The use of parish records such as baptism and death registers is not possible because of the custom of listing only the slave's first name and the unimaginative choice of names, which resulted in large numbers of Joãos, Josés, Manuels, Antônios, Antonias, Joanas, and, of course, Marias. Equally important, the types of plantation records available to students of U.S. slavery have seldom been found for Brazil.

This essay is an examination of an isolated slave register which, for a series of idiosyncratic reasons, provides information permitting a glimpse at mortality and morbidity in a distinct and carefully controlled slave population. Because the slaves involved were used in diamond mining under horrendous conditions, it is probable that the conclusions reached in this essay represent a worst-case scenario. Rather than being typical, this is a special case in which work and living conditions were probably worse than in plantation zones and certainly worse than in urban areas. It is this situation which makes the conclusions of this essay quite startling.

