Modeling of Risk Measure Bonds Using the Beta Model

Author(s):  
Fatma Hachicha ◽  
Ahmed Hachicha ◽  
Afif Masmoudi

Duration and convexity are important measures in fixed-income portfolio management. In this paper, we analyze these measures of bonds by applying the beta model. The general usefulness of the beta probability distribution enhances its applicability in a wide range of reliability analyses, especially in the theory and practice of reliability management. We estimate the beta density function of duration and convexity. This estimate is based on two important and simple short-rate models, namely the Vasicek and the Cox–Ingersoll–Ross (CIR) models. The models are described, and their sensitivity to changes in the parameters is studied. We generate stochastic interest rates and study their effect on the duration and convexity model. The main results show that the beta probability distribution can be applied to model each phase of the risk function, and that the distribution proves effective, simple and flexible. In this paper, we are interested in providing a decision-making tool that helps the manager minimize portfolio risk. It is helpful to have a model that is reasonably simple and suitable for bonds of different maturities, and such a model is widely used by investors when choosing a bond portfolio immunization strategy. The findings also show that the probability of risk, measured by the reliability function, highlights the relationship between duration/convexity and different risk levels. With these new results, the paper offers several implications for investors and for risk management purposes.
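As a rough illustration of the kind of computation the abstract describes, the sketch below simulates Vasicek short rates, computes the Macaulay duration and convexity of a coupon bond at each simulated rate, and fits a beta density to the rescaled durations by the method of moments. The model parameters, the bond's cash flows, and the rescaling are illustrative assumptions, not the authors' calibration; the CIR case would only change the simulation step.

```python
# Sketch (not the authors' calibration): simulate Vasicek short rates, compute
# bond duration/convexity at each simulated rate, and fit a beta distribution
# to the rescaled durations by the method of moments.
import numpy as np

rng = np.random.default_rng(0)

# Vasicek: dr = a*(b - r) dt + sigma dW, simulated with an Euler scheme
a, b, sigma, r0 = 0.5, 0.04, 0.01, 0.03
T_sim, n_steps, n_paths = 1.0, 250, 10_000
dt = T_sim / n_steps
r = np.full(n_paths, r0)
for _ in range(n_steps):
    r += a * (b - r) * dt + sigma * np.sqrt(dt) * rng.standard_normal(n_paths)

# Macaulay duration and convexity of a 10-year annual 5% coupon bond priced at
# flat continuously compounded rate r (per simulated path)
times = np.arange(1, 11)                      # cash-flow dates (years)
cfs = np.full(10, 5.0); cfs[-1] += 100.0      # coupons plus redemption
disc = np.exp(-np.outer(r, times))            # discount factors per path
price = disc @ cfs
duration = (disc * cfs) @ times / price
convexity = (disc * cfs) @ (times ** 2) / price

# Rescale durations to (0, 1) and fit a beta density by matching moments
x = (duration - duration.min()) / (duration.max() - duration.min())
x = np.clip(x, 1e-6, 1 - 1e-6)
m, v = x.mean(), x.var()
common = m * (1 - m) / v - 1
alpha_hat, beta_hat = m * common, (1 - m) * common
print(f"beta parameters: alpha={alpha_hat:.2f}, beta={beta_hat:.2f}")
```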

2014 ◽  
Vol 22 (4) ◽  
pp. 339-348 ◽  
Author(s):  
Lukasz Prorokowski ◽  
Hubert Prorokowski

Purpose – The purpose of this paper is to outline how banks are coping with the new regulatory challenges posed by stressed value at risk (SVaR). The Basel Committee has introduced three measures of capital charges for market risk: the incremental risk charge (IRC), SVaR and the comprehensive risk measure (CRM). This paper is designed to analyse the methodologies for SVaR deployed at different banks in order to highlight the SVaR-related challenges stemming from complying with Basel 2.5. This revised market risk framework came into force in Europe in 2012. Among the wide range of changes is the requirement for banks to calculate SVaR at a 99 per cent confidence interval over a period of significant stress. Design/methodology/approach – The current research project is based on in-depth, semi-structured interviews with nine universal banks and one financial services company to explore the strides major banks are taking to implement SVaR methodologies while complying with Basel 2.5. Findings – This paper focuses on the strengths and weaknesses of the SVaR approach while reviewing peer practices of implementing SVaR modelling. Interestingly, the surveyed banks have not indicated significant challenges associated with the implementation of SVaR, and the reported problems boil down to dealing with the poor quality of market data and, as in the cases of IRC and CRM, the lack of regulatory guidance. As far as peer practices of implementing SVaR modelling are concerned, the majority of the surveyed banks utilise historical simulations and apply both absolute and relative measures of volatility for different risk factors. Originality/value – The academic studies that explicitly analyse the challenges associated with implementing the stressed version of VaR are scarce. Filling this gap in the existing academic literature, the paper aims to shed some explanatory light on the issues major banks are facing when calculating SVaR. In doing so, the study adequately bridges theory and practice by contributing to the fierce debate on compliance with Basel 2.5.
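For readers unfamiliar with the mechanics, the sketch below shows the historical-simulation approach to SVaR that most of the surveyed banks report: revalue the current positions under one-day returns from a fixed stressed window and take the 99 per cent quantile of the loss distribution. The stressed returns, the positions, and the square-root-of-time scaling to a ten-day horizon are illustrative assumptions, not any bank's actual implementation.

```python
# Sketch of stressed VaR by historical simulation: revalue today's positions
# under one-day returns from a fixed stressed window and take the 99% quantile
# of the loss distribution. The data below are synthetic placeholders.
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-in for ~250 daily returns of 3 risk factors from a stressed period
stressed_returns = rng.normal(0.0, 0.03, size=(250, 3))
positions = np.array([1_000_000.0, -500_000.0, 250_000.0])   # current exposures

pnl = stressed_returns @ positions           # hypothetical one-day P&L scenarios
svar_1d = -np.quantile(pnl, 0.01)            # 99% one-day stressed VaR (a loss)
svar_10d = svar_1d * np.sqrt(10)             # common sqrt-of-time approximation
print(f"1-day 99% SVaR: {svar_1d:,.0f}, scaled 10-day: {svar_10d:,.0f}")
```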


Author(s):  
F. Febrian

Oil and gas companies are facing an enormous challenge to create value from mature fields. Moreover, price volatility has a massive impact on project uncertainties. Therefore, robust portfolio management is essential for oil and gas companies to manage critical challenges and uncertainties. The objective of this study is to develop a robust portfolio model to assist top management in oil and gas companies in driving investment strategy. PRIME (Pertamina Investment Management Engine) has been built to visualize advanced oil and gas project portfolio management. The engine treats the relationship between risk and return as the main framework driver. The profitability index is endorsed as the parameter for gauging the investment effectiveness of individual projects. Correspondingly, the risk index is the outcome of a multi-variable analysis involving subsurface uncertainty and price. A nine-cluster "tactical board" matrix is provided as the outcome of PRIME to define generic strategies and action plans. The PRIME analysis leads to a dual perspective, at both the macro and the micro scale. The macro scale explores diversification of strategy and scenario development to achieve long-term objectives, whereas the micro-scale perspective generates a detailed action plan for a particular cluster as a representation of the short- and mid-term corporate strategy. Several strategies and action plans have been recommended, including advanced technology implementation, new gas commercialization, additional incentives in the Production Sharing Contract, tax management renegotiation, and project portfolio rebalancing.
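As a purely illustrative sketch of how such a nine-cluster matrix can be operated, the snippet below buckets projects into a 3x3 grid by profitability index and risk index. The cut-off values and the example projects are hypothetical and are not taken from PRIME.

```python
# Hypothetical nine-cluster "tactical board": bucket each project by its
# profitability index (return axis) and risk index (risk axis) into a 3x3 grid.
# Cut-offs and example projects are illustrative, not PRIME's calibration.
def cluster(profitability_index: float, risk_index: float) -> tuple[str, str]:
    """Return (return_bucket, risk_bucket) for one project."""
    ret = "high" if profitability_index >= 1.5 else "mid" if profitability_index >= 1.0 else "low"
    rsk = "high" if risk_index >= 0.66 else "mid" if risk_index >= 0.33 else "low"
    return ret, rsk

projects = {
    "Mature field EOR":    (1.8, 0.40),
    "New gas development": (1.2, 0.70),
    "Exploration block":   (0.9, 0.85),
}

for name, (pi, ri) in projects.items():
    ret, rsk = cluster(pi, ri)
    print(f"{name:22s} -> {ret}-return / {rsk}-risk cluster")
```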


2018 ◽  
Vol 8 (2) ◽  
pp. 49-73
Author(s):  
Petr Adamec

The core issue of this paper is quality in lifelong learning. The aim of the contribution is to describe the area, level and dimensions of quality in a wide range of lifelong learning programs, or more precisely of further education, which are realized under § 60 and 60a of the Higher Education Act. The paper also addresses the theoretical and practical starting points of the quality phenomenon, both from a historical point of view and especially from the perspective of the current focus and concept of university policy in the European and Czech contexts. Finally, the paper presents the results of a survey of approaches to the quality assurance systems for these components at a selected public university.


2019 ◽  
Vol 43 (3) ◽  
pp. 96-140 ◽  
Author(s):  
Dominic D.P. Johnson ◽  
Dominic Tierney

A major puzzle in international relations is why states privilege negative over positive information. States tend to inflate threats, exhibit loss aversion, and learn more from failures than from successes. Rationalist accounts fail to explain this phenomenon, because systematically overweighting bad over good may in fact undermine state interests. New research in psychology, however, offers an explanation. The “negativity bias” has emerged as a fundamental principle of the human mind, in which people's response to positive and negative information is asymmetric. Negative factors have greater effects than positive factors across a wide range of psychological phenomena, including cognition, motivation, emotion, information processing, decision-making, learning, and memory. Put simply, bad is stronger than good. Scholars have long pointed to the role of positive biases, such as overconfidence, in causing war, but negative biases are actually more pervasive and may represent a core explanation for patterns of conflict. Positive and negative dispositions apply in different contexts. People privilege negative information about the external environment and other actors, but positive information about themselves. The coexistence of biases can increase the potential for conflict. Decisionmakers simultaneously exaggerate the severity of threats and exhibit overconfidence about their capacity to deal with them. Overall, the negativity bias is a potent force in human judgment and decisionmaking, with important implications for international relations theory and practice.


Mathematics ◽  
2021 ◽  
Vol 9 (2) ◽  
pp. 111
Author(s):  
Hyungbin Park

This paper proposes modified mean-variance risk measures for long-term investment portfolios. Two types of portfolios are considered: constant proportion portfolios and increasing amount portfolios. Both are widely used in finance for investing in assets and developing derivative securities. We compare the long-term behavior of a conventional mean-variance risk measure and of a modified one for the two types of portfolios, and we discuss the benefits of the modified measure. Subsequently, an optimal long-term investment strategy is derived. We show that the modified risk measure reflects the investor’s risk aversion in the optimal long-term investment strategy, whereas the conventional one does not. Several factor models are discussed as concrete examples: the Black–Scholes model, Kim–Omberg model, Heston model, and 3/2 stochastic volatility model.
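The paper's modified measure is not reproduced here, but as a point of reference the sketch below Monte Carlo-estimates a conventional mean-variance criterion, rho(X_T) = -E[X_T] + (lambda/2) Var(X_T), for a constant proportion portfolio under the Black–Scholes model, where the wealth process is itself geometric Brownian motion. The market parameters, stock proportion, horizon, and risk-aversion weight are illustrative assumptions.

```python
# Sketch (not the paper's construction): Monte Carlo estimate of a conventional
# mean-variance criterion rho(X_T) = -E[X_T] + 0.5 * lam * Var(X_T) for a
# constant proportion portfolio under the Black-Scholes model.
import numpy as np

rng = np.random.default_rng(2)

mu, sigma, r = 0.08, 0.20, 0.02          # Black-Scholes market parameters
pi, x0, T = 0.6, 1.0, 20.0               # stock proportion, initial wealth, horizon
lam = 2.0                                # risk-aversion weight on the variance
n_paths = 200_000

# Wealth of a constant proportion portfolio is itself geometric Brownian motion
drift = r + pi * (mu - r) - 0.5 * pi**2 * sigma**2
w_T = np.sqrt(T) * rng.standard_normal(n_paths)
x_T = x0 * np.exp(drift * T + pi * sigma * w_T)

rho = -x_T.mean() + 0.5 * lam * x_T.var()
print(f"mean-variance risk measure estimate: {rho:.4f}")
```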


2019 ◽  
Vol 34 (2) ◽  
pp. 297-315
Author(s):  
Linxiao Wei ◽  
Yijun Hu

Capital allocation is of central importance in portfolio management and risk-based performance measurement. Capital allocations for univariate risk measures have been extensively studied in the finance literature. In contrast, few papers have dealt with capital allocations for multivariate risk measures. In this paper, we propose an axiom system for capital allocation with multivariate risk measures. We first recall the class of positively homogeneous and subadditive multivariate risk measures and provide the corresponding representation results. We then show that for a given positively homogeneous and subadditive multivariate risk measure, there exists a capital allocation principle, and we characterize the uniqueness of this principle. Finally, examples are given to derive explicit capital allocation principles for multivariate risk measures based on mean and standard deviation, including the multivariate mean-standard-deviation risk measures.
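The paper's axiomatic multivariate allocation is not reproduced here; as a simpler, well-known point of comparison, the sketch below computes the classical gradient (Euler) allocation for a univariate mean-standard-deviation risk measure of the aggregate loss, under which each line of business receives its expected loss plus a covariance-proportional share of the standard-deviation loading, and the allocations sum to the total risk. The loss distribution and the loading c are synthetic assumptions.

```python
# Gradient (Euler) allocation for rho(S) = E[S] + c * sd(S), S = sum_i X_i:
# line i is allocated E[X_i] + c * Cov(X_i, S) / sd(S), and the allocations
# add up to rho(S). Loss data and the loading c are synthetic.
import numpy as np

rng = np.random.default_rng(3)

c = 1.5
losses = rng.multivariate_normal(
    mean=[10.0, 20.0, 5.0],
    cov=[[4.0, 1.0, 0.5],
         [1.0, 9.0, 0.2],
         [0.5, 0.2, 1.0]],
    size=100_000,
)                                          # columns = business lines

S = losses.sum(axis=1)
sd_S = S.std(ddof=1)
total_risk = S.mean() + c * sd_S

# Euler allocation: marginal contribution of each line to rho(S)
cov_with_S = np.array([np.cov(losses[:, i], S)[0, 1] for i in range(losses.shape[1])])
alloc = losses.mean(axis=0) + c * cov_with_S / sd_S

print("allocations:", np.round(alloc, 3))
print("sum of allocations:", round(alloc.sum(), 3), " total risk:", round(total_risk, 3))
```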


M. Fabius Quintilianus was a prominent orator, declaimer, and teacher of eloquence in the first century ce. After his retirement he wrote the Institutio oratoria, a unique treatise in Antiquity because it is a handbook of rhetoric and an educational treatise in one. Quintilian’s fame and influence are not only based on the Institutio, but also on the two collections of Declamations which were attributed to him in late Antiquity. The Oxford Handbook of Quintilian aims to present Quintilian’s Institutio as a key treatise in the history of Graeco-Roman rhetoric and its influence on the theory and practice of rhetoric and education, from late Antiquity until the present day. It contains chapters on Quintilian’s educational programme, his concepts and classifications of rhetoric, his discussion of the five canons of rhetoric, his style, his views on literary criticism, declamation, and the relationship between rhetoric and law, and the importance of the visual and performing arts in his work. His huge legacy is presented in successive chapters devoted to Quintilian in late Antiquity, the Middle Ages, the Italian Renaissance, Northern Europe during the Renaissance, Europe from the Eighteenth to the Twentieth Century, and the United States of America. There are also chapters devoted to the biographical tradition, the history of printed editions, and modern assessments of Quintilian. The twenty-one authors of the chapters represent a wide range of expertise and scholarly traditions and thus offer a unique mixture of current approaches to Quintilian from a multidisciplinary perspective.


2018 ◽  
Vol 30 (12) ◽  
pp. 3227-3258 ◽  
Author(s):  
Ian H. Stevenson

Generalized linear models (GLMs) have a wide range of applications in systems neuroscience describing the encoding of stimulus and behavioral variables, as well as the dynamics of single neurons. However, in any given experiment, many variables that have an impact on neural activity are not observed or not modeled. Here we demonstrate, in both theory and practice, how these omitted variables can result in biased parameter estimates for the effects that are included. In three case studies, we estimate tuning functions for common experiments in motor cortex, hippocampus, and visual cortex. We find that including traditionally omitted variables changes estimates of the original parameters and that modulation originally attributed to one variable is reduced after new variables are included. In GLMs describing single-neuron dynamics, we then demonstrate how postspike history effects can also be biased by omitted variables. Here we find that omitted variable bias can lead to mistaken conclusions about the stability of single-neuron firing. Omitted variable bias can appear in any model with confounders—where omitted variables modulate neural activity and the effects of the omitted variables covary with the included effects. Understanding how and to what extent omitted variable bias affects parameter estimates is likely to be important for interpreting the parameters and predictions of many neural encoding models.
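A minimal synthetic example, not one of the paper's case studies, makes the mechanism concrete: when spiking depends on two correlated covariates and the Poisson GLM omits one of them, the coefficient of the included covariate absorbs part of the omitted effect.

```python
# Synthetic illustration of omitted-variable bias in a Poisson GLM: spiking
# depends on two correlated covariates; omitting the second biases the
# estimated coefficient of the first. Requires numpy and statsmodels.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(4)
n = 20_000

# Two correlated "stimulus" covariates, e.g. position and speed
x1 = rng.standard_normal(n)
x2 = 0.7 * x1 + 0.5 * rng.standard_normal(n)   # correlated with x1

# Ground truth: both covariates modulate the firing rate
log_rate = -1.0 + 0.5 * x1 + 0.8 * x2
spikes = rng.poisson(np.exp(log_rate))

full = sm.GLM(spikes, sm.add_constant(np.column_stack([x1, x2])),
              family=sm.families.Poisson()).fit()
reduced = sm.GLM(spikes, sm.add_constant(x1),
                 family=sm.families.Poisson()).fit()

print("true x1 effect: 0.5")
print("estimate with x2 included:", round(full.params[1], 3))
print("estimate with x2 omitted: ", round(reduced.params[1], 3))   # absorbs x2's effect
```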


2018 ◽  
Vol 39 (8) ◽  
pp. 995-1009
Author(s):  
Todd C. Harris

Purpose – The purpose of this paper is twofold: first, to examine George Washington’s approach to leadership through the lens of contemporary leadership theory and practice; and second, to help modern managers further reflect upon and develop their own leadership capabilities through a historiographic examination of Washington’s leadership traits and skills. Design/methodology/approach – Combining three different academic disciplines, management, psychology and history, the author utilized a historiographic and interdisciplinary research methodology, conducting a detailed exploration of the life of George Washington through an examination of a wide range of original archival materials, books, journal articles and other sources. Findings – The present analysis reveals that Washington demonstrated a variety of well-validated leadership competencies (e.g. emotional intelligence, resilience, integrity, etc.) that are largely consistent with leader-centered theoretical conceptions of leadership. Originality/value – This is the first historiographic study of George Washington’s approach to leadership within the management literature. Additionally, through the development of a competency model, the study demonstrates how Washington employed tools and techniques from a host of modern leadership theories to achieve critically important results.


2005 ◽  
Vol 4 (1-2) ◽  
pp. 27-32 ◽  
Author(s):  
Colin A Sharp

The use of Capability Maturity Models in financial management, project management, people management and information systems management in a wide variety of organisations indicates the potential for an Organisational Evaluation Capability Hierarchy to guide the self-diagnosis of organisations in building their evaluation maturity. This paper is about the theory behind this growing trend in organisational governance and organisational diagnosis, and explores its relevance to evaluation theory and practice. This theoretical analysis may have long-term practical benefits for evaluation practitioners, as is being developed in the fields of project management, financial management, and people management in a wide range of organisations.

