Alternative Risk Premia Timing: A Point-in-Time Macro, Sentiment, Valuation Analysis

2021 ◽  
Vol 1 (1) ◽  
pp. 52-72
Author(s):  
Olivier Blin ◽  
Florian Ielpo ◽  
Joan Lee ◽  
Jérôme Teiletche

We investigate the question of dynamic allocation across a diversified range of alternative risk premia. Using a set of point-in-time indicators across macro, sentiment and valuation dimensions, we find that a majority of indicators deliver a positive information ratio for a majority of alternative risk premia over the period 2005–2020. In our empirical simulations, the macro dimension seems to have worked well, notably during recession periods. Sentiment (based on market stress and momentum) struggled during recovery periods but added value elsewhere. Valuation worked well from 2005 to 2013 but has lost part of its appeal since then. Combining the indicators delivers a higher information ratio thanks to the low correlation among them. Our research also finds that point-in-time macroeconomic variables (“nowcasters”) can add value over traditional indicators, while this improvement is not significant in the case of the market stress indicator.
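To make the combination effect concrete, here is a minimal sketch (not the authors' methodology) of how averaging several signal return streams with low pairwise correlation lifts the information ratio. The three-indicator setup, the correlation of 0.10, and the standalone annual IR near 0.5 are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(42)

def information_ratio(r):
    """Annualized information ratio from monthly active returns."""
    return np.sqrt(12) * r.mean() / r.std(ddof=1)

n = 192                 # monthly observations, 2005-2020
rho = 0.10              # assumed low pairwise correlation of indicator P&Ls
mu = 0.5 / np.sqrt(12)  # monthly mean giving a standalone annual IR near 0.5

cov = np.full((3, 3), rho)  # unit monthly volatility on the diagonal
np.fill_diagonal(cov, 1.0)
returns = rng.multivariate_normal([mu] * 3, cov, size=n)

for name, r in zip(["macro", "sentiment", "valuation"], returns.T):
    print(f"{name:10s} IR = {information_ratio(r):.2f}")

# Equal-weight combination: same mean, lower volatility, higher IR
combo = returns.mean(axis=1)
print(f"{'combined':10s} IR = {information_ratio(combo):.2f}")
```

With these assumptions the combined volatility shrinks to roughly sqrt((1 + 2*0.10)/3) of a single stream, so the combined IR rises from about 0.5 to about 0.8.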

2017 ◽  
Vol 37 (03) ◽  
pp. 275-286 ◽  
Author(s):  
José Ríos ◽  
Joaquín Saez-Peñataro ◽  
Caridad Pontes ◽  
Ferran Torres

Randomized clinical trials are the gold standard when experimental designs are feasible. Randomization allows the handling of allocation bias for known and unknown confounders. Specific tools such as blocking, stratification, and dynamic allocation provide additional guarantees beyond simple randomization. When an experimental design is not feasible, the propensity score (PS) has been shown to produce greater benefit than traditional methods (i.e., restriction, stratification, matching and adjusting). There appears to be a hierarchy in the effectiveness of balancing for PS techniques: matching or weighting above stratification, which in turn is above covariate adjustment (the latter discouraged due to its drawbacks). Instrumental variable analysis and its variants might provide added value because they aim to balance for unknown confounders as well, thus mimicking randomization, but at present they are considered more useful for sensitivity rather than primary analyses.
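As an illustration of the PS techniques near the top of that hierarchy, the sketch below estimates a propensity score by logistic regression and uses inverse-probability weighting to recover a treatment effect from simulated confounded data. The data-generating process, the true effect of 1, and the use of scikit-learn are assumptions made for the example, not part of the article.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 5000

# Simulated observational data: x confounds both treatment and outcome
x = rng.normal(size=(n, 2))
p_treat = 1.0 / (1.0 + np.exp(-(0.8 * x[:, 0] - 0.5 * x[:, 1])))
t = rng.binomial(1, p_treat)
y = 1.0 * t + x[:, 0] + 0.5 * x[:, 1] + rng.normal(size=n)  # true effect = 1

# Naive difference in means is biased by confounding
print("naive difference:", y[t == 1].mean() - y[t == 0].mean())

# Step 1: estimate the propensity score e(x) = P(T=1 | x)
ps = LogisticRegression().fit(x, t).predict_proba(x)[:, 1]

# Step 2: inverse-probability weighting rebalances the two groups
ate = (np.average(y[t == 1], weights=1.0 / ps[t == 1])
       - np.average(y[t == 0], weights=1.0 / (1.0 - ps[t == 0])))
print("IPW estimate:", ate)  # close to the true effect of 1
```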


Author(s):  
Łukasz Markowski ◽  
Jakub Keller

This article deals with the volatility of financial markets in relation to the US stock market and its volatility index, the VIX. The authors reviewed previous studies on the VIX index and, based on them, identified a research gap concerning the market's response to emerging macroeconomic information about the US economy. The vast majority of research on the VIX index relates to forecasting it with mathematical models that do not take current market data into account. The authors attempted to assess the impact of emerging macro data on the variability of the VIX index, thus illustrating the magnitude of the impact of individual variables on the so-called US stock exchange fear index. The study analysed 80 macroeconomic variables over the period from January 2009 to June 2019 to check which of them cause the greatest market volatility. The study was based on correlation analysis and econometric modeling. The results allowed the authors to formulate conclusions identifying the most important macroeconomic parameters that affect investors' perception of the market through the pricing of options on the S&P 500 index, and to filter the variables most important for predicting changes in the VIX level. In the authors' view, the added value of the article is to establish the relationship between macro variables and market volatility as illustrated by the VIX index, which has not been explored in previous studies. The analyses carried out are part of the research trend on market information efficiency and broaden knowledge in the area of capital investments.
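A hedged sketch of the study's two-stage design (a correlation screen followed by econometric modeling) might look as follows. The variable names, the simulated surprises, and the use of pandas and statsmodels are illustrative assumptions, not the authors' data or code.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 126  # monthly observations, January 2009 - June 2019

# Hypothetical surprises for a few of the 80 macro series screened in
# the study; the column names are placeholders, not the authors' list.
macro = pd.DataFrame(
    rng.normal(size=(n, 4)),
    columns=["nfp_surprise", "cpi_surprise", "ism_surprise", "claims_surprise"],
)
# Simulated change in the VIX around the announcements
dvix = (0.6 * macro["nfp_surprise"] - 0.4 * macro["ism_surprise"]
        + rng.normal(scale=0.8, size=n))

# Stage 1: correlation screen across all candidate variables
corr = macro.corrwith(dvix).abs().sort_values(ascending=False)
print(corr)

# Stage 2: econometric model on the variables that pass the screen
top = corr.index[:2].tolist()
model = sm.OLS(dvix, sm.add_constant(macro[top])).fit()
print(model.summary().tables[1])
```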


Author(s):  
B. Lencova ◽  
G. Wisselink

Recent progress in computer technology enables the calculation of lens fields and focal properties on commonly available computers such as IBM ATs. If we add to this the use of graphics, we greatly increase the applicability of design programs for electron lenses. Most programs for field computation are based on the finite element method (FEM). They are written in Fortran 77, so that they are easily transferred from PCs to larger machines.

The design process has recently been made significantly more user friendly by adding input programs written in Turbo Pascal, which allows a flexible implementation of computer graphics. The input programs offer not only menu-driven input and modification of numerical data, but also graphics editing of the data. The input programs create files which are subsequently read by the Fortran programs.

From the main menu of our magnetic lens design program, further options are chosen by using function keys or numbers. Some options (lens initialization and setting, fine mesh, current densities, etc.) open other menus where computation parameters can be set or numerical data can be entered with the help of a simple line editor. The "draw lens" option enables graphical editing of the mesh (see fig. 1). The geometry of the electron lens is specified in terms of coordinates and indices of a coarse quadrilateral mesh. In this coarse mesh, the fine mesh with smoothly changing step size is calculated by an automeshing procedure. The options shown in fig. 1 allow modification of the number of coarse mesh lines, change of coordinates of mesh points or lines, and specification of lens parts. Interactive and graphical modification of the fine mesh can be called from the fine mesh menu. Finally, the lens computation can be called.

Our FEM program allows up to 8000 mesh points on an AT computer. Another menu allows the display of computed results stored in output files and graphical display of axial flux density, flux density in magnetic parts, and the flux lines in magnetic lenses (see fig. 2). A series of several lens excitations with user-specified or default magnetization curves can be calculated and displayed in one session.
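The automeshing step described above, which generates a fine mesh with smoothly changing step size between coarse mesh lines, can be sketched in one dimension as geometric grading. The coordinates and grading ratios below are made-up examples, and the original program is written in Fortran 77 and Turbo Pascal rather than Python.

```python
import numpy as np

def grade_interval(a, b, n, ratio):
    """Subdivide [a, b] into n steps whose sizes grow geometrically by
    `ratio`, producing a smoothly changing mesh step between two
    coarse mesh lines."""
    if ratio == 1.0:
        return np.linspace(a, b, n + 1)
    steps = ratio ** np.arange(n)
    return a + (b - a) * np.concatenate(([0.0], np.cumsum(steps))) / steps.sum()

# Coarse mesh lines along the lens axis (illustrative coordinates in mm)
coarse_z = [0.0, 5.0, 8.0, 20.0]
# (number of fine steps, grading ratio) assumed for each coarse interval
subdivisions = [(6, 1.2), (8, 1.0), (10, 0.85)]

# Drop each interval's right endpoint to avoid duplicates, then close
# the mesh with the final coarse coordinate.
fine_z = np.concatenate(
    [grade_interval(a, b, k, r)[:-1]
     for (a, b), (k, r) in zip(zip(coarse_z, coarse_z[1:]), subdivisions)]
    + [[coarse_z[-1]]]
)
print(fine_z)
```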


2015 ◽  
Vol 25 (1) ◽  
pp. 50-60
Author(s):  
Anu Subramanian

ASHA's focus on evidence-based practice (EBP) includes the family/stakeholder perspective as an important tenet in clinical decision making. The common factors model for treatment effectiveness postulates that the clinician-client alliance positively impacts therapeutic outcomes and may be the most important factor for success. One strategy to improve the alliance between a client and clinician is the use of outcome questionnaires. In the current study, eight parents of toddlers who attended therapy sessions at a university clinic responded to a session outcome questionnaire that included both rating-scale and descriptive questions. Six graduate students completed a survey that included a question about the utility of the questionnaire. Results indicated that the descriptive questions added value and information compared to using only the rating scale. Students' responses varied as to whether the questionnaire increased their comfort with parents. Information gathered from the questionnaire allowed for specific feedback to graduate students to change behaviors and created opportunities for general discussions regarding effective therapy techniques. In addition, the responses generated conversations between client and clinician focused on the clients' concerns. Involving the stakeholder in identifying both effective and ineffective aspects of therapy has advantages for clinical practice and education.


2003 ◽  
Author(s):  
John L. Caccavale

2011 ◽  
pp. 39-50
Author(s):  
V. Lushin

The author analyzes factors that led to a deeper fall in output and profitability in the real sector of the Russian economy, in comparison with other segments, during the acute phase of the financial crisis. It is argued that certain contradictions in the government's anti-recession policy, together with the activities of the financial sector and natural monopolies, pump out the value added created in manufacturing and agriculture, aggravate symptoms of the "Dutch disease", etc. It is shown that this may threaten the balanced development of the Russian economy, and a set of measures is suggested to minimize these tendencies and create a basis for the state modernization policy.


2019 ◽  
Vol 10 (1) ◽  
pp. 1-27
Author(s):  
Aniek Wijayanti

Business process analysis can be used to eliminate or reduce waste costs caused by non-value-added activities in a process. This research aims to evaluate the activities carried out in the natural material procurement process at PT XYZ, calculate the effectiveness of the process cycle, find ways to improve process management, and calculate the cost reduction that can be achieved through activity management. The research followed a case study approach. The researcher obtained data through in-depth interviews with staff directly involved in the process, observation, and documentation of natural material procurement. The results show that the process cycle effectiveness of natural material procurement in the factory reached 87.1% for sand and 72% for crushed stone. This indicates that the process still carries activities with no added value and still contains ineffective costs. Through the Business Process Mechanism, these non-value-added activities can be managed so that the process cycle becomes more efficient and cost effectiveness is achieved. After activity management is implemented, the calculated process cycle effectiveness is 100%, meaning that the cost of the natural material procurement process has become effective. The estimated cost reduction resulting from activity management is Rp249.026.635,90 per year.
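As a worked check of the reported figures: process cycle effectiveness (PCE) is value-added time divided by total cycle time. The sketch below uses placeholder durations normalized to 100 time units so the ratios reproduce the reported 87.1% and 72%, and shows why removing all non-value-added time drives PCE to 100%.

```python
# Process cycle effectiveness (PCE) = value-added time / total cycle time.
# The durations are placeholders normalized to a total of 100 time units
# so that the ratios reproduce the reported 87.1% and 72%.

def pce(value_added: float, total: float) -> float:
    return value_added / total

materials = {
    "sand":          {"value_added": 87.1, "total": 100.0},
    "crushed stone": {"value_added": 72.0, "total": 100.0},
}

for name, d in materials.items():
    before = pce(d["value_added"], d["total"])
    # Activity management eliminates the non-value-added time, so the
    # total cycle time shrinks to the value-added time alone.
    after = pce(d["value_added"], d["value_added"])
    print(f"{name:13s} PCE before: {before:.1%}   after: {after:.0%}")
```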

