A new geometrical method for portfolio optimization

2021 ◽  
Vol 8 (3) ◽  
pp. 400-409
Author(s):  
F. Butin

Risk aversion plays a central role in investors’ decisions when building a portfolio. Within this portfolio optimization framework, we determine the minimum-risk portfolio using a new geometrical method. For this purpose, we develop an algorithm that computes the Euclidean distance from any point to a standard simplex. With this approach, we can treat the case of portfolio optimization without short-selling in its entirety, and we also recover, in geometrical terms, the well-known results on portfolio optimization when short-selling is allowed. We then apply our results to determine which convex combination of the CAC 40 stocks carries the lowest risk. We not only obtain a risk far below that of the index, but also a rate of return almost three times that of the index.
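The paper’s geometrical construction is not spelled out in the abstract; as a hedged illustration of the same task (finding the closest long-only weight vector), the sketch below uses the standard sorting-based algorithm for Euclidean projection onto the probability simplex. The function name and the numerical inputs are assumptions made for the example, not the authors’ method.

```python
import numpy as np

def project_to_simplex(y):
    """Euclidean projection of y onto the standard simplex
    {w : w_i >= 0, sum(w) = 1}, via the classical sorting method."""
    y = np.asarray(y, dtype=float)
    u = np.sort(y)[::-1]                # sort descending
    css = np.cumsum(u)                  # running sums of the sorted values
    ks = np.arange(1, len(y) + 1)
    cond = u + (1.0 - css) / ks > 0     # largest k keeping weights positive
    rho = ks[cond][-1]
    tau = (css[rho - 1] - 1.0) / rho    # shift so the weights sum to 1
    return np.maximum(y - tau, 0.0)

# Example: project an unconstrained (short-selling) weight vector
# back onto long-only weights (illustrative numbers only).
w = project_to_simplex(np.array([0.7, 0.5, -0.3]))
print(w, w.sum())  # -> [0.6, 0.4, 0.0], summing to 1
```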

2015 ◽  
Vol 2015 ◽  
pp. 1-11 ◽  
Author(s):  
Da-hai Dai ◽  
Jing-ke Zhang ◽  
Xue-song Wang ◽  
Shun-ping Xiao

This paper presents a new approach to superresolution ISAR imaging based on a scattering model called the coherent polarized geometrical theory of diffraction (CP-GTD), which is better matched to the physical scattering mechanism. The algorithm is essentially a joint processing of polarization and superresolution. It estimates the number, position, frequency dependence, span, and normalized scattering matrix of the scattering centers simultaneously across all polarization channels, rather than extracting the parameters from each channel separately, and it outperforms the channel-by-channel approach because the fully polarized information is exploited. The superiority of CP-GTD is verified by experimental results on both simulated and real data.
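The CP-GTD parameterization itself is not given in the abstract; as a rough, single-channel illustration under the standard GTD scattering-center model, each center contributes an amplitude, a range delay, and a frequency-dependence exponent. The sketch below synthesizes such a frequency response; the function, symbols, and numerical values are assumptions for illustration, not the paper’s model.

```python
import numpy as np

def gtd_response(freqs, amps, ranges, alphas, fc):
    """Frequency response of point scattering centers under the standard
    GTD model: sum_i A_i * (1j*f/fc)**alpha_i * exp(-1j*4*pi*f*R_i/c).
    Single polarization channel; illustrative only."""
    c = 3e8
    f = freqs[:, None]
    return np.sum(amps * (1j * f / fc) ** alphas
                  * np.exp(-1j * 4 * np.pi * f * ranges / c), axis=1)

# Two hypothetical scattering centers observed over an X-band sweep.
freqs = np.linspace(9e9, 10e9, 256)
s = gtd_response(freqs,
                 amps=np.array([1.0, 0.6]),
                 ranges=np.array([0.0, 0.45]),
                 alphas=np.array([0.5, -0.5]),
                 fc=9.5e9)
```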


2013 ◽  
Vol 760-762 ◽  
pp. 2151-2155
Author(s):  
Zhong Jie Zhang ◽  
Jian Huang ◽  
Zhi Jia Wang

Time synchronization is a key technique that synchronizes the logical processes (LPs) of parallel and discrete event simulation (PDES). After describing the shortcomings of existing time-synchronization algorithms, this paper proposes a new approach based on time dams. It introduces the concept of the Time Dams (TD) algorithm and presents its realization. The performance of the TD algorithm is then compared with that of the Time Warp (TW) algorithm in experiments using the PHOLD model. The results show that TD performs better than TW under most conditions.


2011 ◽  
Vol 101-102 ◽  
pp. 496-499
Author(s):  
You Cheng Tong ◽  
Yang Zhang ◽  
Jun Zhou Yao

To develop an automated segmentation system for jacquard fabric images, this paper proposes a new approach based on the MRMRF algorithm with a variable weighting parameter. First, the variable weighting parameter, which differs from the constant one used in the traditional MRMRF, is described; it yields a more accurate feature vector. The next step is MAP estimation and the model for texture segmentation. During this iterative process, the weighting parameter starts large enough to learn accurate feature-energy parameters; it then decreases as the iterations proceed and stops decreasing after a given number of iterations, as sketched below. Finally, experimental results show that the new approach outperforms the traditional method with a constant weighting parameter.
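The exact decay rule is not given in the abstract; the snippet below is only a hedged sketch of the kind of iteration-dependent weighting schedule described (large at first, decreasing, then held constant), with the schedule shape and all constants chosen arbitrarily for illustration.

```python
def weighting_parameter(iteration, w0=10.0, decay=0.8, w_min=1.0, freeze_at=20):
    """Illustrative variable weighting schedule: starts at w0, decays
    geometrically per iteration, and stops decreasing once the floor
    w_min or the freeze iteration is reached (all values assumed)."""
    k = min(iteration, freeze_at)
    return max(w0 * decay ** k, w_min)

# Example: weights over the first iterations of the MAP estimation loop.
print([round(weighting_parameter(i), 3) for i in range(6)])
```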


2001 ◽  
Vol 32 (3) ◽  
pp. 133-141 ◽  
Author(s):  
Gerrit Antonides ◽  
Sophia R. Wunderink

Summary: Different shapes of individual subjective discount functions were compared using real measures of willingness to accept future monetary outcomes in an experiment. The two-parameter hyperbolic discount function described the data better than three alternative one-parameter discount functions. However, the hyperbolic discount functions did not explain the common difference effect better than the classical discount function. Discount functions were also estimated from survey data of Dutch households who reported their willingness to postpone positive and negative amounts. Future positive amounts were discounted more than future negative amounts and smaller amounts were discounted more than larger amounts. Furthermore, younger people discounted more than older people. Finally, discount functions were used in explaining consumers' willingness to pay for an energy-saving durable good. In this case, the two-parameter discount model could not be estimated and the one-parameter models did not differ significantly in explaining the data.
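The abstract does not state the functional forms; for reference, the forms usually compared in this literature (and presumably the ones meant here) are the classical exponential discount function with a single rate parameter and the two-parameter generalized hyperbolic function of Loewenstein and Prelec. Treat the block below as a hedged reading of the abstract, not the authors’ exact specification.

```latex
% Classical (exponential) discounting, one parameter r:
D(t) = e^{-r t}
% Two-parameter (generalized) hyperbolic discounting:
D(t) = (1 + \alpha t)^{-\beta/\alpha}, \qquad \alpha, \beta > 0
% As \alpha \to 0, the hyperbolic form reduces to the exponential with r = \beta.
```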


2008 ◽  
Vol 67 (1) ◽  
pp. 51-60 ◽  
Author(s):  
Stefano Passini

The relation between authoritarianism and social dominance orientation was analyzed, with authoritarianism measured using a three-dimensional scale. The implicit multidimensional structure (authoritarian submission, conventionalism, authoritarian aggression) of Altemeyer’s (1981, 1988) conceptualization of authoritarianism is inconsistent with its one-dimensional methodological operationalization. The dimensionality of authoritarianism was investigated using confirmatory factor analysis in a sample of 713 university students. As hypothesized, the three-factor model fit the data significantly better than the one-factor model. Regression analyses revealed that only authoritarian aggression was related to social dominance orientation. That is, only intolerance of deviance was related to high social dominance, whereas submissiveness was not.


Author(s):  
J. E. Smyth

During the early 1940s, journalists observed that after years of men controlling women’s fashion, Hollywood had become “a fashion center in which women designers are getting to be a big power.” In a town where “the working girl is queen,” it was women who really knew how to dress working women. Edith Head’s name dominates Hollywood costume design. Though a relatively poor sketch artist who refused to sew in public, Head understood what the average woman wanted to wear and knew better than anyone how to craft her image as the-one-and-only Edith Head. However, she was one of many women who designed Hollywood glamour in the studio era. This chapter juxtaposes Head’s career with that of a younger, fiercely independent designer who would quickly upstage Head as a creative force. In many senses, Dorothy Jeakins’s postwar career ascent indicated the waning of the Hollywood system and the powerful relationship between female designers, stars, and fans.


2021 ◽  
Vol 11 (10) ◽  
pp. 4575
Author(s):  
Eduardo Fernández ◽  
Nelson Rangel-Valdez ◽  
Laura Cruz-Reyes ◽  
Claudia Gomez-Santillan

This paper addresses group multi-objective optimization from a new perspective. For each point in the feasible decision set, the satisfaction or dissatisfaction of each group member is determined by a multi-criteria ordinal classification approach, based on comparing solutions with a limiting boundary between the classes “unsatisfactory” and “satisfactory”. The whole group’s satisfaction can be maximized by finding solutions as close as possible to the ideal consensus. The group moderator is in charge of making the final decision, finding the best compromise between collective satisfaction and dissatisfaction. Imperfect information on the values of objective functions, on required and available resources, and on decision-model parameters is handled by using interval numbers. Two different kinds of multi-criteria decision models are considered: (i) an interval outranking approach and (ii) an interval weighted-sum value function. The proposal is more general than other approaches to group multi-objective optimization since (a) some (even all) objective values may not be the same for different decision makers (DMs); (b) each group member may consider their own set of objective functions and constraints; (c) objective values may be imprecise or uncertain; (d) imperfect information on resource availability and requirements can be handled; (e) each group member may have their own perception of the availability of resources and the resource requirements per activity. An important application of the new approach is collective multi-objective project portfolio optimization. This is illustrated by solving a real-size group many-objective project portfolio optimization problem using evolutionary computation tools.
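The paper’s interval models are not detailed in the abstract; the sketch below only illustrates the general idea of an interval weighted-sum value function, paired with a common possibility-degree rule for comparing two intervals. The class names, weights, scores, and comparison rule are assumptions made for the example.

```python
from dataclasses import dataclass

@dataclass
class Interval:
    lo: float
    hi: float

def weighted_sum(weights, scores):
    """Interval weighted sum: multiply each interval score by a
    non-negative weight and add the intervals endpoint-wise."""
    lo = sum(w * s.lo for w, s in zip(weights, scores))
    hi = sum(w * s.hi for w, s in zip(weights, scores))
    return Interval(lo, hi)

def possibility_geq(a, b):
    """Possibility degree that interval a >= interval b
    (a common rule from the interval-comparison literature)."""
    width = (a.hi - a.lo) + (b.hi - b.lo)
    if width == 0:
        return 1.0 if a.lo >= b.lo else 0.0
    return min(1.0, max(0.0, (a.hi - b.lo) / width))

# One DM's imprecise scores for two candidate solutions on three objectives.
weights = [0.5, 0.3, 0.2]
sol_x = [Interval(0.6, 0.8), Interval(0.4, 0.5), Interval(0.7, 0.9)]
sol_y = [Interval(0.5, 0.7), Interval(0.6, 0.7), Interval(0.3, 0.6)]
vx, vy = weighted_sum(weights, sol_x), weighted_sum(weights, sol_y)
print(possibility_geq(vx, vy))  # degree to which x is at least as good as y
```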


2020 ◽  
pp. 1-16
Author(s):  
Meriem Khelifa ◽  
Dalila Boughaci ◽  
Esma Aïmeur

The Traveling Tournament Problem (TTP) is concerned with finding a double round-robin tournament schedule that minimizes the total distance traveled by the teams. It has attracted significant interest recently, since a favorable TTP schedule can result in substantial savings for the league. This paper proposes an original evolutionary algorithm for TTP. We first propose a quick and effective constructive algorithm that builds a Double Round Robin Tournament (DRRT) schedule with low travel cost. We then describe an enhanced genetic algorithm with a new crossover operator to improve the travel cost of the generated schedules. A new heuristic for efficiently ordering the scheduled rounds is also proposed; it significantly improves the quality of the schedules. The overall method is evaluated on publicly available standard benchmarks and compared with other techniques for TTP and the Unconstrained Traveling Tournament Problem (UTTP). The computational experiments show that the proposed approach builds solutions comparable to other state-of-the-art approaches, and better than the current best solutions on UTTP. Further, our method provides new valuable solutions to some unsolved UTTP instances and outperforms prior methods on all US National League (NL) instances.
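The paper’s constructive algorithm is not described in the abstract; as a hedged illustration of the underlying scheduling object, the snippet below builds a plain double round-robin schedule with the classical circle (polygon) method and mirrors it for the return legs. It ignores travel cost, which is precisely what the paper’s constructive and genetic components are designed to optimize.

```python
def double_round_robin(n_teams):
    """Build a double round robin: every pair meets twice (home and away).
    Uses the classical circle method for the first half, then mirrors it."""
    assert n_teams % 2 == 0, "circle method assumes an even number of teams"
    teams = list(range(n_teams))
    first_half = []
    for _ in range(n_teams - 1):
        round_games = [(teams[i], teams[n_teams - 1 - i])
                       for i in range(n_teams // 2)]
        first_half.append(round_games)
        # rotate every team except the first (fixed) one
        teams = [teams[0]] + [teams[-1]] + teams[1:-1]
    # mirror each round with home/away swapped for the second half
    second_half = [[(away, home) for (home, away) in rnd] for rnd in first_half]
    return first_half + second_half

schedule = double_round_robin(4)
for r, games in enumerate(schedule, 1):
    print(f"round {r}: {games}")  # (home, away) pairs per round
```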


Author(s):  
Ajay Andrew Gupta

The widespread proliferation of and interest in bracket pools that accompany the National Collegiate Athletic Association Division I Men’s Basketball Tournament have created a need to produce a set of predicted winners for each tournament game by people without expert knowledge of college basketball. Previous research has addressed bracket prediction to some degree, but not nearly on the level of the popular interest in the topic. This paper reviews relevant previous research and then introduces a rating system for teams using game data from the season prior to the tournament. The ratings from this system are used within a novel, four-predictor probability model to produce sets of bracket predictions for each tournament from 2009 to 2014. This dual-proportion probability model is built around the constraint that the two teams in a given game have a combined 100% probability of winning it. The paper also performs Monte Carlo simulation to investigate whether modifications to an expected value-based prediction system, such as the one introduced here, are necessary to achieve the maximum bracket score within a defined group. The findings are that selecting one high-probability “upset” team for one to three late-round games is likely to outperform other strategies, including one with no modifications to the expected value, as long as the upset choice overlaps a large minority of competing brackets while leaving the bracket some distinguishing characteristics in late rounds.
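The paper’s dual-proportion model and scoring setup are not reproduced here; the sketch below only illustrates the kind of Monte Carlo bracket simulation described, using made-up team ratings and the simple normalization p(i beats j) = r_i / (r_i + r_j) so that the two probabilities in each game sum to 100%.

```python
import random

def win_prob(r_i, r_j):
    """Pairwise win probability under the two-team 100% constraint."""
    return r_i / (r_i + r_j)

def simulate_bracket(ratings, rng):
    """Play a single-elimination bracket once; teams are listed in
    bracket order, so adjacent entries meet in round one."""
    teams = list(range(len(ratings)))
    rounds = []
    while len(teams) > 1:
        winners = []
        for a, b in zip(teams[0::2], teams[1::2]):
            p = win_prob(ratings[a], ratings[b])
            winners.append(a if rng.random() < p else b)
        rounds.append(winners)
        teams = winners
    return rounds  # winners per round; rounds[-1][0] is the champion

# Monte Carlo estimate of each team's title probability (toy ratings).
ratings = [9.0, 3.0, 6.0, 5.0, 8.0, 2.0, 4.0, 7.0]
rng, trials = random.Random(0), 20_000
champs = [0] * len(ratings)
for _ in range(trials):
    champs[simulate_bracket(ratings, rng)[-1][0]] += 1
print([round(c / trials, 3) for c in champs])
```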


2000 ◽  
Vol 11 (3) ◽  
pp. 261-264 ◽  
Author(s):  
Tricia S. Clement ◽  
Thomas R. Zentall

We tested the hypothesis that pigeons could use a cognitively efficient coding strategy by training them on a conditional discrimination (delayed symbolic matching) in which one alternative was correct following the presentation of one sample (one-to-one), whereas the other alternative was correct following the presentation of any one of four other samples (many-to-one). When retention intervals of different durations were inserted between the offset of the sample and the onset of the choice stimuli, divergent retention functions were found. With increasing retention interval, matching accuracy on trials involving any of the many-to-one samples was increasingly better than matching accuracy on trials involving the one-to-one sample. Furthermore, following this test, pigeons treated a novel sample as if it had been one of the many-to-one samples. The data suggest that rather than learning each of the five sample-comparison associations independently, the pigeons developed a cognitively efficient single-code/default coding strategy.

