Averaging Across Asset Allocation Models

2015 ◽  
Vol 235 (1) ◽  
pp. 61-81 ◽  
Author(s):  
Peter Schanbacher

Summary: Combination of asset allocation models is rewarding if (i) the applied risk function is concave and (ii) there is no dominating model. We show that most common risk functions are either globally concave or at least concave in typical applications. In a comprehensive empirical study using standard asset allocation models, we find that no model dominates consistently: the ranking of the models depends on the data set and the risk function, and even changes over time. We find that a simple average of all asset allocation models can outperform each individual model. Our contribution is twofold. We present a theory of why the combined model is expected to dominate most individual models, and we show empirically that model combinations perform exceptionally well in asset allocation.
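The combination rule the abstract describes is a plain average of each model's portfolio weights. A minimal sketch, with illustrative model outputs that are not from the paper's data:

```python
# Minimal sketch of combining asset allocation models by simple averaging.
# The three model outputs below are hypothetical, not the paper's results.

def average_allocation(model_weights):
    """Average portfolio weights across models, asset by asset."""
    n_models = len(model_weights)
    n_assets = len(model_weights[0])
    return [sum(w[i] for w in model_weights) / n_models for i in range(n_assets)]

# Three hypothetical models allocating across the same three assets:
models = [
    [0.60, 0.30, 0.10],   # e.g. mean-variance
    [0.33, 0.34, 0.33],   # e.g. 1/N rule
    [0.20, 0.50, 0.30],   # e.g. minimum-variance
]
combined = average_allocation(models)
print(combined)  # averaged weights; since each model's weights sum to 1, so do these
```

Because averaging is a convex combination, the combined weights remain a valid fully invested portfolio whenever each input model's weights are.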

2009 ◽  
Vol 25 (1) ◽  
pp. 1-12 ◽  
Author(s):  
C. Jessica E. Metcalf ◽  
James S. Clark ◽  
Deborah A. Clark

Abstract: Estimation of tree growth is generally based on repeated diameter measurements. A buttress at the height of measurement will lead to overestimates of tree diameter. Because buttresses grow up the trunk through time, it has become common practice to raise the height of measurement so that measurements remain above the buttress. However, tapering of the trunk means that raising the measurement height biases diameter estimates downward, by up to 10% per m of height. This bias could affect inference concerning species differences and climate effects on tree demography and on biomass accumulation. Here we introduce a hierarchical state-space method that allows formal integration of diameter data taken at different heights and can include individual variation, temporal effects, or other covariates. We illustrate our approach using species from Barro Colorado Island, Panama, and La Selva, Costa Rica. Results include trends that are consistent with some of those previously reported for climate responses and changes over time, but differ in relative magnitude. By including the full data set and accounting for bias and variation among individuals and over time, our approach allows for quantification of climate responses and of the uncertainty associated with measurements and the underlying growth process.
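The taper bias can be illustrated with a deliberately simple closed-form rescaling. This is not the paper's hierarchical state-space model; the linear `taper_rate` of 0.10 per metre is a stand-in reflecting the reported "up to 10% per m" downward bias, and the reference height of 1.3 m is the conventional breast height:

```python
# Illustrative taper adjustment, NOT the paper's hierarchical state-space model.
# taper_rate = 0.10 mirrors the reported "up to 10% per m" downward bias;
# h_ref = 1.3 m is the conventional breast-height reference (assumption).

def adjust_for_taper(d_measured, h_measured, h_ref=1.3, taper_rate=0.10):
    """Rescale a diameter measured at h_measured (m) back to reference height h_ref."""
    extra_height = max(h_measured - h_ref, 0.0)
    return d_measured / (1.0 - taper_rate * extra_height)

# A 50 cm diameter read 1 m above the standard 1.3 m measurement height:
print(adjust_for_taper(50.0, 2.3))  # about 55.6 cm, compensating the downward bias
```

The state-space approach in the paper instead treats true diameter as a latent variable, which also propagates the measurement uncertainty rather than applying a fixed correction.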


2015 ◽  
Vol 28 (2) ◽  
pp. 189-210 ◽  
Author(s):  
GREGORY SHAFFER

Abstract: The New Legal Realist approach to international law builds from a jurisprudential tradition that asks how actors use and apply law, in order to understand how law obtains meaning, is practised, and changes over time. The article addresses the jurisprudential roots of the New Legal Realism, its core attributes, and six important components in the current transnational context. In the pragmatist tradition, the New Legal Realism is both empirical and problem-centred, attending to both context and legal normativity. What is new is the rise of transnational activity, which has enlarged the scope of transnational problem-solving through international law in radically new ways across areas of law, and the growth of empirical study of these phenomena. The article concludes by addressing the potential risks of the New Legal Realist approach in terms of scientism and relativism, and responds to them.


2020 ◽  
Vol 34 (02) ◽  
pp. 2236-2243 ◽  
Author(s):  
Weiran Shen ◽  
Binghui Peng ◽  
Hanpeng Liu ◽  
Michael Zhang ◽  
Ruohan Qian ◽  
...  

In many social systems in which individuals and organizations interact, there are no simple laws governing the environment, and agents' payoffs are often influenced by other agents' actions. We examine such a social system in the setting of sponsored search auctions and tackle the search engine's dynamic pricing problem by combining tools from mechanism design and AI. In this setting, the environment not only changes over time but also behaves strategically. Over repeated interactions with bidders, the search engine can dynamically change the reserve prices and determine the strategy that maximizes its profit. We first train a buyer behavior model, using a real bidding data set from a major search engine, that predicts bids given the information disclosed by the search engine and the bidders' performance data from previous rounds. We then formulate the dynamic pricing problem as a Markov decision process (MDP) and apply a reinforcement learning algorithm that optimizes reserve prices over time. Experiments demonstrate that our model outperforms static optimization strategies, including those currently in use, as well as several other dynamic ones.
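The core loop of learning a reserve price from repeated auctions can be sketched in a toy form. This is not the paper's pipeline: a uniform random bidder stands in for the learned buyer-behavior model, and a stateless bandit-style value update stands in for the full MDP formulation:

```python
import random

# Toy sketch of reserve-price optimization over repeated auctions.
# Assumptions: bids are drawn uniformly from [0, 1] (a stand-in for the
# learned buyer model), and a stateless epsilon-greedy value estimate
# stands in for the paper's MDP / reinforcement learning algorithm.

random.seed(0)
RESERVES = [0.1, 0.3, 0.5, 0.7, 0.9]          # discretized reserve prices (actions)

def simulated_round(reserve):
    """One auction round: a bid is drawn; profit equals the reserve if it clears."""
    bid = random.uniform(0.0, 1.0)
    return reserve if bid >= reserve else 0.0

counts = {r: 0 for r in RESERVES}
values = {r: 0.0 for r in RESERVES}           # running mean profit per reserve
epsilon = 0.2                                  # exploration rate
for _ in range(20000):
    if random.random() < epsilon:
        r = random.choice(RESERVES)            # explore
    else:
        r = max(values, key=values.get)        # exploit current best estimate
    profit = simulated_round(r)
    counts[r] += 1
    values[r] += (profit - values[r]) / counts[r]   # incremental sample mean

best = max(values, key=values.get)
print(best)   # for uniform bids, expected profit r*(1-r) peaks at r = 0.5
```

The strategic, time-varying behavior described in the abstract is exactly what this toy omits: a static bid distribution makes the problem a bandit, whereas reacting bidders make it a genuine MDP.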


2014 ◽  
Vol 19 (2) ◽  
pp. 221-244 ◽  
Author(s):  
Jack Hoeksema

Abstract: This paper presents the results of a corpus study of Dutch complement PPs. On the basis of a collection of 3,400 occurrences in negative sentences, the four major word order patterns (regular position, scrambling order, topicalization, and extraposition) are studied, in both main and subordinate clauses, and linked to properties of the prepositional phrases, in particular weight and definiteness. Greater weight corresponds to a higher likelihood of extraposition, and definiteness to a higher likelihood of scrambling and topicalization. This corresponds well with earlier studies of word order variation in Dutch, but had not been established for the class of complement PPs. Among definite phrases, PPs with so-called R-pronouns, such as hieraan ‘here-on’ and daarvan ‘thereof’, showed especially high preferences for topicalization and scrambling. Negative sentences were selected for this study to avoid cases where regular order and scrambling order could not be distinguished due to a lack of adverbial elements in the middle field. The data set is temporally stratified, which made it possible to study changes over time; the most robust finding was a continuous retreat of the scrambling order throughout the period 1700-2014.


2020 ◽  
Vol 13 (10) ◽  
pp. 108 ◽  
Author(s):  
Tobias F. Rötheli

This study assesses the accuracy of forecasts by industry branch. Such an investigation provides a perspective on the relative benefits of forecasting in different industries. Forecast accuracy is assessed by econometrically analyzing expectations data on firms’ production drawn from surveys covering manufacturing. Such data are available for only a few countries and historical periods. We study U.S. data covering the 1980s and German data for the period from 1991 to 2018. We first present rankings of industries by forecast accuracy for both countries. The historical gap between the two countries’ data sets is then put to use to assess the stability of, and the dynamics in, the relevance of forecasting in different branches of industry. We identify several industries that, across time and place, are among the most accurate forecasters (e.g., electric machinery) and the least accurate (e.g., the food industry). By contrast, forecasting performance in some industries appears to undergo noticeable changes over time: the reported evidence suggests that forecasting has lost some of its potential in the printing and textile industries while gaining over time in the nonelectric machinery and metals industries. The findings can help management make decisions regarding the allocation of resources to forecasting.
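A ranking of industries by forecast accuracy can be sketched with a simple error metric. Mean absolute error is used here as an illustrative choice, not necessarily the study's econometric measure, and the industry names and numbers are made up:

```python
from statistics import mean

# Sketch of ranking industries by forecast accuracy, here measured as mean
# absolute error (MAE) between expected and realized production changes.
# The metric, industries, and numbers are illustrative, not the study's data.

def rank_by_accuracy(records):
    """records: {industry: [(forecast, actual), ...]} -> industries, most accurate first."""
    mae = {ind: mean(abs(f - a) for f, a in obs) for ind, obs in records.items()}
    return sorted(mae, key=mae.get)

records = {
    "electric machinery": [(1.0, 1.2), (0.5, 0.4), (-0.3, -0.2)],
    "food":               [(1.0, 2.5), (0.5, -0.8), (-0.3, 1.1)],
}
print(rank_by_accuracy(records))  # industry with the smaller error ranks first
```

Comparing such rankings across the two countries' sample periods is what the abstract refers to as putting the historical gap between the data sets to use.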


2019 ◽  
Vol 28 (1) ◽  
pp. 87-111 ◽  
Author(s):  
Emma Rodman

Word vectorization is an emerging text-as-data method that shows great promise for automating the analysis of semantics—here, the cultural meanings of words—in large volumes of text. Yet successes with this method have largely been confined to massive corpora where the meanings of words are presumed to be fixed. In political science applications, however, many corpora are comparatively small and many interesting questions hinge on the recognition that meaning changes over time. Together, these two facts raise vexing methodological challenges. Can word vectors trace the changing cultural meanings of words in typical small corpora use cases? I test four time-sensitive implementations of word vectors (word2vec) against a gold standard developed from a modest data set of 161 years of newspaper coverage. I find that one implementation method clearly outperforms the others in matching human assessments of how public dialogues around equality in America have changed over time. In addition, I suggest best practices for using word2vec to study small corpora for time series questions, including bootstrap resampling of documents and pretraining of vectors. I close by showing that word2vec allows granular analysis of the changing meaning of words, an advance over other common text-as-data methods for semantic research questions.
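The bootstrap-resampling practice recommended above can be sketched in miniature: resample documents with replacement, fit an embedding on each resample, and inspect the spread of a similarity of interest. A real analysis would train word2vec on each resample; here a trivial co-occurrence count vector stands in for the embedding, and the three-sentence corpus is made up:

```python
import math
import random
from collections import Counter

# Sketch of bootstrap resampling for small-corpus word vectors. A trivial
# co-occurrence vector stands in for word2vec, and the toy corpus is
# illustrative; only the resampling-and-spread logic is the point.

def cooc_vector(word, docs, window=2):
    """Count words co-occurring with `word` within +/-window tokens (toy embedding)."""
    counts = Counter()
    for doc in docs:
        toks = doc.split()
        for i, t in enumerate(toks):
            if t == word:
                for j in range(max(0, i - window), min(len(toks), i + window + 1)):
                    if j != i:
                        counts[toks[j]] += 1
    return counts

def cosine(u, v):
    """Cosine similarity between two sparse count vectors."""
    dot = sum(u[k] * v[k] for k in set(u) | set(v))
    nu = math.sqrt(sum(x * x for x in u.values()))
    nv = math.sqrt(sum(x * x for x in v.values()))
    return dot / (nu * nv) if nu and nv else 0.0

random.seed(1)
corpus = [
    "equality means equal rights before the law",
    "equality of opportunity shapes public debate",
    "rights and equality appear together in coverage",
]
sims = []
for _ in range(100):                            # bootstrap replicates
    resample = random.choices(corpus, k=len(corpus))
    sims.append(cosine(cooc_vector("equality", resample),
                       cooc_vector("rights", resample)))
print(min(sims), max(sims))                     # spread reflects sampling uncertainty
```

The spread across replicates is the quantity of interest: in a small corpus, a similarity estimate from a single training run can be dominated by which documents happen to be present.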


Author(s):  
José Ignacio Nazif-Munoz ◽  
Rose Chabot

Abstract: Sexual and reproductive health and rights policies (SRHRPs) and their association with reproductive and non-reproductive behavior require precise theoretical and methodological frames. Studying the case of Colombia, we advance a comprehensive framework that simultaneously considers multiple SRHRP conceptualizations and their impacts over time on induced pregnancy terminations (IPTs). With a mixed-methods approach, we first map the evolution of SRHRPs and then analyze their direct and indirect effects on IPTs, using the government’s provision of contraceptive methods, female use of contraceptive methods, and conversations with health professionals in a mediation approach. We build a unique data set from more than 2,100 policy documents, and then use data on 81,760 women (aged 20–40 years) from four waves (2000–2015) of Colombia’s Demographic and Health Surveys. We find that SRHRPs are directly associated with an 18% reduction in reported IPTs. Associations between these variables are explained by the increased use of modern contraceptive methods (6%) and the government’s provision of those methods (13%). Studies interested in the impact of SRHRPs need to consider not only the direct effects of legal changes on abortion outcomes but also how changes over time may operate through different sub-programs embedded in these policies, such as access to contraceptive methods and family planning. This will add further nuance to how SRHRPs are both layered and implemented.


VASA ◽  
2015 ◽  
Vol 44 (5) ◽  
pp. 355-362 ◽  
Author(s):  
Marie Urban ◽  
Alban Fouasson-Chailloux ◽  
Isabelle Signolet ◽  
Christophe Colas Ribas ◽  
Mathieu Feuilloy ◽  
...  

Abstract: Background: We aimed to estimate the agreement between the Medicap® (photo-optical) and Radiometer® (electro-chemical) sensors during exercise transcutaneous oxygen pressure (tcpO2) tests. Our hypothesis was that although absolute starting values (tcpO2rest: mean over 2 minutes) might differ, tcpO2 changes over time and the minimal value of the decrease from rest of oxygen pressure (DROPmin) at exercise would be concordant between the two systems. Patients and methods: Forty-seven patients with arterial claudication (65 ± 7 years) performed a treadmill test with 5 probes each of the electro-chemical and photo-optical devices simultaneously: one of each system on the chest, on each buttock, and on each calf. Results: Seventeen Medicap® probes disconnected during the tests. tcpO2rest and DROPmin values were higher with Medicap® than with Radiometer®, by 13.7 ± 17.1 mm Hg and 3.4 ± 11.7 mm Hg, respectively. Despite the differences in absolute starting values, changes over time were similar between the two systems. The concordance between the two systems was approximately 70% for the classification of test results from DROPmin. Conclusions: Photo-optical sensors are promising alternatives to electro-chemical sensors for exercise oximetry, provided that miniaturisation and weight reduction of the new sensors are possible.
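The DROP index compared above can be sketched as follows. The definition used here, limb tcpO2 change from rest minus chest tcpO2 change from rest, is the one commonly used in exercise oximetry; the time series below are made-up numbers, not the study's measurements:

```python
# Sketch of the DROP index: DROP(t) = (limb tcpO2 change from rest) minus
# (chest tcpO2 change from rest); DROPmin is its minimum during exercise.
# This formulation is the common exercise-oximetry definition (assumption);
# the series below are hypothetical, not data from the study.

def drop_min(limb_tcpo2, chest_tcpo2, n_rest=2):
    """Return DROPmin given synchronized limb and chest tcpO2 series (mm Hg).

    The first n_rest samples of each series are averaged as the resting value.
    """
    limb_rest = sum(limb_tcpo2[:n_rest]) / n_rest
    chest_rest = sum(chest_tcpo2[:n_rest]) / n_rest
    drops = [(l - limb_rest) - (c - chest_rest)
             for l, c in zip(limb_tcpo2, chest_tcpo2)]
    return min(drops)

limb  = [60.0, 60.0, 52.0, 45.0, 50.0]   # mm Hg, calf probe (hypothetical)
chest = [65.0, 65.0, 63.0, 62.0, 63.0]   # mm Hg, chest reference (hypothetical)
print(drop_min(limb, chest))             # → -12.0 mm Hg
```

Referencing the chest probe subtracts systemic tcpO2 drift, so DROPmin isolates the limb-specific fall that the two sensor systems are being compared on.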


2007 ◽  
Author(s):  
Miranda Olff ◽  
Mirjam Nijdam ◽  
Kristin Samuelson ◽  
Julia Golier ◽  
Mariel Meewisse ◽  
...  
