RiD: Uma Nova Abordagem para o Cálculo do Risco de Insolvência (RiD: A New Approach to Calculating Insolvency Risk)

2014 ◽  
Vol 12 (2) ◽  
pp. 229
Author(s):  
Marco Aurélio Dos Santos Sanfins ◽  
Danilo Soares Monte-Mor

Given the recent international crises and the increasing number of defaults, several researchers have attempted to develop metrics that calculate the probability of insolvency with higher accuracy. Commonly used approaches, however, consider neither the credit risk nor the severity of the distance between receivables and obligations across different periods. In this paper we mathematically present an approach that allows us to estimate insolvency risk by considering not only future receivables and obligations, but also the severity of the distance between them and the quality of the respective receivables. Using Monte Carlo simulations and hypothetical examples, we show that our metric estimates insolvency risk with high accuracy. Moreover, our results suggest that in the absence of a smooth distribution of receivables and obligations, there is a non-null insolvency risk even when the present value of the receivables exceeds that of the obligations.
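As a rough illustration of the idea in this abstract, the sketch below estimates insolvency risk by Monte Carlo from per-period receivables and obligations, with a single default probability standing in for credit quality. The function and all figures are hypothetical simplifications, not the paper's RiD metric.

```python
import random

def insolvency_risk(receivables, obligations, default_prob,
                    n_sims=10_000, seed=42):
    """Monte Carlo estimate of the probability that the cumulative cash
    balance ever turns negative. Each period's receivable is collected
    with probability (1 - default_prob), a crude stand-in for credit risk."""
    rng = random.Random(seed)
    insolvent = 0
    for _ in range(n_sims):
        balance = 0.0
        for r, o in zip(receivables, obligations):
            if rng.random() >= default_prob:  # receivable collected
                balance += r
            balance -= o                      # obligation always due
            if balance < 0:                   # shortfall in this period
                insolvent += 1
                break
    return insolvent / n_sims

# Both schedules collect 300 against obligations of 240, but only the
# smooth schedule avoids a timing-driven shortfall.
risk_smooth = insolvency_risk([100, 100, 100], [80, 80, 80], default_prob=0.05)
risk_lumpy = insolvency_risk([0, 0, 300], [80, 80, 80], default_prob=0.05)
```

With a zero default probability the contrast is exact: the smooth schedule carries no insolvency risk while the back-loaded one is insolvent with certainty, echoing the abstract's point that a non-smooth distribution yields a non-null risk even when total receivables exceed total obligations.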

Author(s):  
Sébastien Fouques ◽  
Ole Andreas Hermundstad

The paper is concerned with the launch of free-fall lifeboats (FFLs). It proposes a method that complies with the DNV-OS-E406 standard for selecting characteristic launches from Monte Carlo simulations for further structural load assessment with CFD and FEM. Proxy variables, derived from kinematic parameters and intended to predict pressure load indicators, are computed with the VARUNA launch simulator developed by MARINTEK. The statistical distributions of the proxy variables obtained from the Monte Carlo simulations are used to identify critical scenarios, and characteristic launches can then be selected at a chosen probability level. The feasibility of the proposed method is documented in the paper for several types of pressure loads. Existing model test data from various FFL launch campaigns in calm water and in waves are used to compute the proxy variables as would be done in the VARUNA simulator. Scatter diagrams showing the correlation with actually measured pressure load indicators are then established to assess the quality of the chosen proxy variables.
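The selection step can be sketched as a quantile pick over the simulated proxy distribution. This is a deliberate simplification under assumed inputs; the actual DNV-OS-E406 procedure and the VARUNA proxy definitions are more involved, and the sample below is hypothetical.

```python
import random

def characteristic_launch(proxy_values, prob_level=0.99):
    """Return the index of the launch whose proxy value sits at the
    chosen probability level of the Monte Carlo sample."""
    ranked = sorted(range(len(proxy_values)), key=proxy_values.__getitem__)
    idx = min(int(prob_level * len(proxy_values)), len(proxy_values) - 1)
    return ranked[idx]

# Hypothetical proxy sample for 1000 simulated launches.
rng = random.Random(0)
proxies = [rng.gauss(10.0, 2.0) for _ in range(1000)]
selected = characteristic_launch(proxies, prob_level=0.99)
```

The selected launch can then be re-run in CFD/FEM, the rationale being that one representative case at a stated probability level is far cheaper than load analysis of every simulated launch.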


2019 ◽  
Vol 11 (15) ◽  
pp. 4193 ◽  
Author(s):  
Elariane ◽  
Dubé

Smart cities are considered an engine of economic and social growth. Most countries have started to convert their existing cities into smart cities, or to construct new ones, in order to improve the quality of life of their inhabitants. The problem these countries face when applying the smart-city concept, however, is cost, especially in the residential sector. Despite the high initial and even operating costs of adopting different technologies in smart housing, the benefits can exceed those costs within the lifespan of the project. This article sheds light on the economics of smart housing. The study aims to evaluate the net present value (NPV) of a smart economic housing model in order to check the viability and feasibility of such projects. Calculating the NPV with Monte Carlo simulation provides an interesting methodological framework for evaluating the robustness of the results, as well as a simple way to test their statistical significance. This analysis helps to evaluate the potential profitability of smart housing solutions. The research concludes by demonstrating the feasibility of this type of project.
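A minimal sketch of Monte Carlo NPV evaluation along these lines is shown below. The cost, savings, and rate figures are illustrative assumptions, not the article's data.

```python
import random

def npv(cashflows, rate):
    """Discounted sum; cashflows[0] occurs at time 0."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

def simulate_npv(initial_cost, mean_saving, sd_saving, years, rate,
                 n_sims=10_000, seed=1):
    """Monte Carlo NPV of an up-front investment followed by uncertain
    yearly savings (normally distributed here for simplicity)."""
    rng = random.Random(seed)
    results = []
    for _ in range(n_sims):
        flows = [-initial_cost] + [rng.gauss(mean_saving, sd_saving)
                                   for _ in range(years)]
        results.append(npv(flows, rate))
    return results

# Hypothetical figures: a 50,000 premium for smart-housing technology,
# ~4,000/year in operating savings over a 25-year lifespan, 4% discount rate.
npvs = simulate_npv(50_000, 4_000, 1_000, years=25, rate=0.04)
p_viable = sum(v > 0 for v in npvs) / len(npvs)  # share of profitable runs
```

The full distribution of simulated NPVs, rather than a single point estimate, is what lets the robustness and statistical significance of the viability conclusion be assessed.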


1996 ◽  
Vol 07 (03) ◽  
pp. 295-303 ◽  
Author(s):  
P. D. CODDINGTON

Large-scale Monte Carlo simulations require high-quality random number generators to ensure correct results. The contrapositive of this statement is also true: the quality of random number generators can be tested by using them in large-scale Monte Carlo simulations. We have tested many commonly used random number generators with high-precision Monte Carlo simulations of the 2-d Ising model using the Metropolis, Swendsen-Wang, and Wolff algorithms. This work is being extended to the testing of random number generators for parallel computers. The results of these tests are presented, along with recommendations for random number generators for high-performance computers, particularly for lattice Monte Carlo simulations.
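A toy version of the testing principle can be sketched as follows: run a simulation whose exact answer is known and check how far the generator's estimate lands, in units of the expected statistical error. Here the known quantity is pi rather than the 2-d Ising observables the paper uses, so this is only an illustration of the method, not a reproduction of it.

```python
import math, random

def mc_pi(rand, n):
    """Monte Carlo estimate of pi from a user-supplied rand() in [0, 1)."""
    hits = sum(rand() ** 2 + rand() ** 2 < 1.0 for _ in range(n))
    return 4.0 * hits / n

def generator_deviation(rand, n=200_000):
    """Deviation of the estimate from the exact value, in standard errors.
    A sound generator should land within a few sigma; a systematic bias
    many sigma wide flags a defective generator."""
    est = mc_pi(rand, n)
    p = math.pi / 4                                # hit probability
    sigma = 4 * math.sqrt(p * (1 - p) / n)         # std error of the estimator
    return abs(est - math.pi) / sigma

dev = generator_deviation(random.Random(123).random)
```

The Ising-model versions of this test are far more sensitive because cluster algorithms such as Swendsen-Wang and Wolff consume random numbers in patterns that expose correlations a simple hit-or-miss test never sees.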


2012 ◽  
Vol 1471 ◽  
Author(s):  
Pierre-Emmanuel Berche ◽  
Saoussen Djedai ◽  
Etienne Talbot

ABSTRACT Monte Carlo simulations are used to perform atomic-scale modelling of the magnetic properties of epitaxial exchange-coupled DyFe2/YFe2 superlattices. These samples, which have been studied extensively in experiments, consist of a hard ferrimagnet (DyFe2) and a soft ferrimagnet (YFe2) that are antiferromagnetically coupled. Depending on the layers and on the temperature, the field dependence of the magnetization depth profile is complex. In this work, we reproduce by Monte Carlo simulation the hysteresis loops for the net and compound-specific magnetizations at different temperatures, and assess the quality of the results by direct comparison with experimental hysteresis loops.
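A minimal Metropolis sketch of a field-swept hysteresis loop is given below, using a plain Ising chain as a toy stand-in for the hard/soft superlattice model (all parameters are assumptions). Limiting the sweeps per field step keeps the system in metastable states, which is what opens the loop.

```python
import math, random

def hysteresis_loop(n_spins=100, J=1.0, T=0.5, h_max=3.0, steps=41,
                    sweeps_per_field=5, seed=7):
    """Metropolis simulation of a ferromagnetic Ising chain under a
    swept external field; returns (field, magnetization) pairs."""
    rng = random.Random(seed)
    spins = [1] * n_spins

    def sweep(h):
        for _ in range(n_spins):
            i = rng.randrange(n_spins)
            nb = spins[(i - 1) % n_spins] + spins[(i + 1) % n_spins]
            dE = 2 * spins[i] * (J * nb + h)   # energy cost of flipping spin i
            if dE <= 0 or rng.random() < math.exp(-dE / T):
                spins[i] = -spins[i]

    down = [h_max - 2 * h_max * k / (steps - 1) for k in range(steps)]
    fields = down + down[::-1]                 # sweep down, then back up
    loop = []
    for h in fields:
        for _ in range(sweeps_per_field):
            sweep(h)
        loop.append((h, sum(spins) / n_spins))
    return loop

loop = hysteresis_loop()
```

In the paper's setting the single exchange constant is replaced by intra-layer and antiferromagnetic inter-layer couplings, which is what produces the compound-specific loops compared against experiment.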


2012 ◽  
Vol 23 (2) ◽  
pp. 199-207 ◽  
Author(s):  
Michael L. Nieswiadomy

Abstract This note analyzes the risk and reward of investing the present value of a 40-year worklife of lost earnings ($10,000 per year), discounted using the rates of return on various portfolios. Eight portfolios are examined: 100% in Treasury bills; 100% in intermediate-term government bonds; 100% in corporate bonds; four mixtures of the S&P 500, intermediate-term government bonds, and Treasury bills; and 100% in the S&P 500. The rates of return on the portfolios and the growth rate in hourly earnings are randomly drawn from the years 1965–2010. The results of 10,000 Monte Carlo simulations indicate that a 40-year portfolio will face “ruin” roughly 51% to 52% of the time for all portfolios. However, the portfolios differ greatly in the median year of ruin (if ruin occurs), ranging from a high of the 38th year for a 100% Treasury bills portfolio to the 22nd year for a 100% S&P 500 portfolio. The percentage of the time that the award greatly enriches the plaintiff (an ending balance over $1,000,000) also varies widely: a 100% S&P 500 portfolio enriches the plaintiff 36.8% of the time; a portfolio of 30% S&P 500, 30% intermediate government bonds, and 40% Treasury bills enriches 14.5% of the time; while a 100% Treasury bills portfolio will virtually never enrich.
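The note's experiment can be sketched for a single hypothetical portfolio: fund the present value of the losses, withdraw each year, and grow the remainder at a random return. Returns are i.i.d. normal here with assumed parameters, whereas the note resamples actual 1965–2010 years and also randomizes earnings growth.

```python
import random

def simulate_award(annual_loss=10_000, years=40, mean_return=0.05,
                   sd_return=0.12, discount_rate=0.05,
                   n_sims=10_000, seed=3):
    """Fund the discounted value of `years` of lost earnings, then
    withdraw the loss each year and apply a random annual return.
    Returns the funded present value and the Monte Carlo ruin probability."""
    rng = random.Random(seed)
    pv = sum(annual_loss / (1 + discount_rate) ** t
             for t in range(1, years + 1))
    ruins = 0
    for _ in range(n_sims):
        balance = pv
        for year in range(1, years + 1):
            balance -= annual_loss                 # withdraw this year's loss
            if balance < 0:                        # award exhausted early: ruin
                ruins += 1
                break
            balance *= 1 + rng.gauss(mean_return, sd_return)
    return pv, ruins / n_sims

pv, p_ruin = simulate_award()
```

Because the award is funded at exactly the discounted value of the withdrawals, any return volatility pushes a large share of paths into ruin even when the mean return matches the discount rate, which is the mechanism behind the note's roughly 51–52% ruin rates.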


1994 ◽  
Vol 66 (7) ◽  
pp. 937-943 ◽  
Author(s):  
Hans R. Keller ◽  
Juergen Roettele ◽  
Hermann Bartels

2014 ◽  
Vol 38 (5) ◽  
pp. 471-479 ◽  
Author(s):  
Alexander M. Schoemann ◽  
Patrick Miller ◽  
Sunthud Pornprasertmanit ◽  
Wei Wu

Planned missing data designs allow researchers to increase the amount and quality of data collected in a single study. Unfortunately, the effect of planned missing data designs on power is not straightforward: under certain conditions a planned missing design will increase power, whereas in other situations it will decrease power. Thus, when designing a study utilizing planned missing data, researchers need to perform a power analysis. In this article, we describe methods for power analysis and sample size determination for planned missing data designs using Monte Carlo simulations. We also describe a new, more efficient method of Monte Carlo power analysis, software that can be used in these approaches, and several examples of popular planned missing data designs.
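The core Monte Carlo power recipe can be sketched with a far simpler design than the article's examples: a two-group mean comparison in which a planned fraction of outcomes is missing completely at random, analyzed by a normal-approximation z test on the complete cases. All sample sizes and effect sizes below are assumptions.

```python
import math, random

def power_two_group(n_per_group=64, effect=0.5, p_missing=0.25,
                    alpha=0.05, n_sims=2_000, seed=11):
    """Monte Carlo power: simulate many datasets under the design,
    analyze each one, and report the fraction of significant results."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_sims):
        # planned missingness: each outcome is skipped with prob p_missing
        g0 = [x for x in (rng.gauss(0.0, 1.0) for _ in range(n_per_group))
              if rng.random() > p_missing]
        g1 = [x for x in (rng.gauss(effect, 1.0) for _ in range(n_per_group))
              if rng.random() > p_missing]
        m0, m1 = sum(g0) / len(g0), sum(g1) / len(g1)
        v0 = sum((x - m0) ** 2 for x in g0) / (len(g0) - 1)
        v1 = sum((x - m1) ** 2 for x in g1) / (len(g1) - 1)
        z = (m1 - m0) / math.sqrt(v0 / len(g0) + v1 / len(g1))
        p = math.erfc(abs(z) / math.sqrt(2))   # two-sided normal p-value
        hits += p < alpha
    return hits / n_sims

power = power_two_group()
```

Raising `n_per_group` until the returned power clears a target such as 0.80 is the sample-size-determination loop; realistic planned missing designs swap in the intended analysis model (e.g., a latent variable model) in place of the z test.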


2014 ◽  
Vol 12 ◽  
pp. 75-81 ◽  
Author(s):  
C. Brugger ◽  
S. Weithoffer ◽  
C. de Schryver ◽  
U. Wasenmüller ◽  
N. Wehn

Abstract. Powerful compute clusters and multi-core systems have become widely available in research and industry. This boost in available computational power tempts people to run compute-intensive tasks on those clusters, whether for speed or for accuracy. Monte Carlo simulations in particular, with their inherent parallelism, promise very high speedups. Nevertheless, the quality of a Monte Carlo simulation depends strongly on the quality of the employed random numbers. In this work we present a comprehensive analysis of state-of-the-art pseudo-random number generators, such as the MT19937 and the WELL generators, used for parallel stream generation in different settings. These random number generators can be realized in hardware as well as in software and help to accelerate the analysis (or simulation) of communication systems. We show that it is possible to generate high-quality parallel random number streams with both generators, as long as certain configuration constraints are met. We furthermore show that distributed simulations with these generator types remain viable even at very high degrees of parallelism.
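One classic parallel-stream scheme of the kind such analyses examine is leapfrogging: stream i of k takes every k-th draw from a common base sequence. The toy stdlib sketch below illustrates the idea only; the MT19937/WELL hardware and software configurations studied in the paper differ.

```python
import random

class LeapfrogStream:
    """Stream `stream_id` of `n_streams` over one shared base sequence:
    it owns draws stream_id, stream_id + k, stream_id + 2k, ..."""

    def __init__(self, seed, stream_id, n_streams):
        self._base = random.Random(seed)
        self._skip = n_streams - 1
        for _ in range(stream_id):      # advance to this stream's first draw
            self._base.random()

    def random(self):
        value = self._base.random()
        for _ in range(self._skip):     # discard draws owned by other streams
            self._base.random()
        return value

# Four logical streams over one base sequence, three draws each.
streams = [LeapfrogStream(2024, i, 4) for i in range(4)]
parallel = [s.random() for s in streams for _ in range(3)]
```

By construction the union of the streams is exactly the base sequence, so the parallel run inherits the base generator's statistical quality; the cost is the discarded draws, which production implementations avoid with efficient jump-ahead.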

