Structural Integrity Assessment of Free-Fall Lifeboats by Combining Fast Monte-Carlo Simulations With CFD by Means of Proxy Load Variables

Author(s):  
Sébastien Fouques ◽  
Ole Andreas Hermundstad

The paper is concerned with the launch of free-fall lifeboats (FFL). It proposes a method that complies with the DNV-OS-E406 standard for selecting characteristic launches from Monte Carlo simulations for further structural load assessment with CFD and FEM. Proxy variables, derived from kinematic parameters and aimed at predicting pressure load indicators, are computed with the VARUNA launch simulator developed by MARINTEK. The statistical distributions of the proxy variables obtained from the Monte Carlo simulations are used to identify critical scenarios, and characteristic launches can then be selected at a chosen probability level. The feasibility of the proposed method is documented in the paper for several types of pressure loads. Existing model test data from various FFL-launch campaigns in calm water and in waves are used to compute the proxy variables as would be done in the VARUNA simulator. Scatter diagrams showing the correlation with actual measured pressure load indicators are then established to assess the quality of the chosen proxy variables.
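
Below is a minimal sketch of the selection step described above, under stated assumptions: the launch-condition sampling and the proxy function are invented placeholders rather than the VARUNA model, and the 99% level is only an example probability level.

```python
import numpy as np

rng = np.random.default_rng(42)

def proxy_from_launch(hs, phase):
    """Placeholder proxy variable (e.g. a water-entry velocity measure)
    derived from kinematic launch parameters; not the real VARUNA model."""
    return 20.0 + 2.0 * hs * np.cos(phase) + rng.normal(scale=0.5)

# Monte Carlo sampling of launch conditions (illustrative ranges only).
n = 10_000
hs = rng.uniform(2.0, 8.0, n)            # significant wave height [m]
phase = rng.uniform(0.0, 2 * np.pi, n)   # wave phase at release

proxies = np.array([proxy_from_launch(h, p) for h, p in zip(hs, phase)])

# Characteristic launch: the sample closest to a chosen probability level
# of the empirical proxy distribution (here the 99th percentile).
level = np.quantile(proxies, 0.99)
idx = int(np.argmin(np.abs(proxies - level)))
print(f"characteristic launch: Hs={hs[idx]:.2f} m, proxy={proxies[idx]:.2f} "
      f"(target {level:.2f})")
```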

1996 ◽  
Vol 07 (03) ◽  
pp. 295-303 ◽  
Author(s):  
P. D. CODDINGTON

Large-scale Monte Carlo simulations require high-quality random number generators to ensure correct results. The contrapositive of this statement is also true: the quality of random number generators can be tested by using them in large-scale Monte Carlo simulations. We have tested many commonly used random number generators with high-precision Monte Carlo simulations of the 2-d Ising model using the Metropolis, Swendsen-Wang, and Wolff algorithms. This work is being extended to the testing of random number generators for parallel computers. The results of these tests are presented, along with recommendations for random number generators for high-performance computers, particularly for lattice Monte Carlo simulations.
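
For concreteness, here is a minimal NumPy sketch of the kind of test described: Metropolis sweeps of the 2-d Ising model whose observables can be compared across generators, seeds, or exact finite-lattice results. It uses NumPy's default PCG64 generator as a stand-in for the generators studied in the paper and is far slower than a production lattice code.

```python
import numpy as np

def metropolis_ising(L=32, T=2.269185, sweeps=2000, rng=None):
    """Single-spin-flip Metropolis sweeps of the 2-d Ising model (periodic BC).
    Returns the mean energy per spin over the second half of the run."""
    if rng is None:
        rng = np.random.default_rng()
    spins = rng.choice([-1, 1], size=(L, L))
    energies = []
    for sweep in range(sweeps):
        for _ in range(L * L):
            i, j = rng.integers(0, L, size=2)
            nb = (spins[(i + 1) % L, j] + spins[(i - 1) % L, j]
                  + spins[i, (j + 1) % L] + spins[i, (j - 1) % L])
            dE = 2 * spins[i, j] * nb
            if dE <= 0 or rng.random() < np.exp(-dE / T):
                spins[i, j] *= -1
        if sweep >= sweeps // 2:
            e = -np.sum(spins * (np.roll(spins, 1, 0) + np.roll(spins, 1, 1)))
            energies.append(e / (L * L))
    return np.mean(energies)

# Run the same simulation with different generators (or seeds) and compare
# the estimates with each other and with exact finite-lattice results.
print(metropolis_ising(L=16, sweeps=500, rng=np.random.default_rng(0)))
```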


2012 ◽  
Vol 1471 ◽  
Author(s):  
Pierre-Emmanuel Berche ◽  
Saoussen Djedai ◽  
Etienne Talbot

Monte Carlo simulations are used to perform atomic-scale modelling of the magnetic properties of epitaxial exchange-coupled DyFe2/YFe2 superlattices. These samples, which have been studied extensively in experiments, consist of a hard ferrimagnet (DyFe2) and a soft ferrimagnet (YFe2) that are antiferromagnetically coupled. Depending on the layers and on the temperature, the field dependence of the magnetization depth profile is complex. In this work, we reproduce by Monte Carlo simulations the hysteresis loops for the net and compound-specific magnetizations at different temperatures, and assess the quality of the results by direct comparison to experimental hysteresis loops.


1994 ◽  
Vol 66 (7) ◽  
pp. 937-943 ◽  
Author(s):  
Hans R. Keller ◽  
Juergen Roettele ◽  
Hermann Bartels

2006 ◽  
Vol 321-323 ◽  
pp. 336-339 ◽  
Author(s):  
Mi Ra Cho ◽  
Ki Bong Kim ◽  
Sung Ho Joh ◽  
Tae Ho Kang

The impact-echo method, which evaluates the integrity of concrete and masonry structures nondestructively, is well suited to practical applications and provides high-quality structural integrity assessment. However, for multi-layered systems in which each layer has a different stiffness, the impact-echo method may lack reliability in thickness evaluation, which calls for an improvement of the method. This study was first dedicated to understanding stress-wave propagation in the impact-echo test; second, the reliability of the impact-echo method was investigated through numerical simulation of the impact-echo test. The investigation covered influencing factors such as the stiffness contrast between layers and the receiver location. Finally, the research in this paper led to the development of the phase-difference response (PDR) method, based on the frequency response between two receivers deployed in a line with an impact source.
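
As an illustration of the two-receiver idea behind the PDR method, here is a hedged sketch of a cross-spectrum phase computation; the formulation and the synthetic signals are assumptions made for the example, not the authors' exact procedure.

```python
import numpy as np

def phase_difference_response(x1, x2, dt):
    """Phase of the cross-spectrum between two receiver signals.
    A generic estimate for illustration, not necessarily the exact
    PDR formulation developed in the paper."""
    X1 = np.fft.rfft(x1)
    X2 = np.fft.rfft(x2)
    freq = np.fft.rfftfreq(len(x1), dt)
    phase = np.unwrap(np.angle(X2 * np.conj(X1)))
    return freq, phase

# Synthetic example: receiver 2 sees the same pulse delayed by 0.2 ms.
dt = 1e-5
t = np.arange(0.0, 0.02, dt)
pulse = lambda t0: np.exp(-((t - t0) / 5e-4) ** 2)
freq, phase = phase_difference_response(pulse(0.002), pulse(0.0022), dt)

# The phase falls linearly with frequency: phase = -2*pi*f*delay.
slope = np.polyfit(freq[1:30], phase[1:30], 1)[0]
print(f"estimated inter-receiver delay: {-slope / (2 * np.pi) * 1e3:.3f} ms")
```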


2014 ◽  
Vol 38 (5) ◽  
pp. 471-479 ◽  
Author(s):  
Alexander M. Schoemann ◽  
Patrick Miller ◽  
Sunthud Pornprasertmanit ◽  
Wei Wu

Planned missing data designs allow researchers to increase the amount and quality of data collected in a single study. Unfortunately, the effect of planned missing data designs on power is not straightforward. Under certain conditions a planned missing design will increase power, whereas in other situations it will decrease power. Thus, when designing a study utilizing planned missing data, researchers need to perform a power analysis. In this article, we describe methods for power analysis and sample size determination for planned missing data designs using Monte Carlo simulations. We also describe a new, more efficient method of Monte Carlo power analysis, software that can be used in these approaches, and several examples of popular planned missing data designs.
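
As one illustration, here is a minimal Monte Carlo power analysis for a simple planned missing setting; the design, the complete-case analysis, and all parameter values are assumptions made for the sketch rather than any of the specific designs or software discussed in the article.

```python
import numpy as np
from scipy import stats

def mc_power(n=200, beta=0.3, miss_rate=0.5, reps=2000, alpha=0.05, seed=0):
    """Monte Carlo power estimate for a regression slope when the predictor
    is, by design, measured on only a random subset of participants.
    Complete-case analysis is used for brevity; planned missing designs are
    usually analysed with FIML or multiple imputation instead."""
    rng = np.random.default_rng(seed)
    hits = 0
    for _ in range(reps):
        x = rng.normal(size=n)
        y = beta * x + rng.normal(size=n)
        observed = rng.random(n) > miss_rate      # planned missingness on x
        result = stats.linregress(x[observed], y[observed])
        hits += result.pvalue < alpha
    return hits / reps

print(mc_power())   # estimated power for these illustrative settings
```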


2014 ◽  
Vol 12 ◽  
pp. 75-81 ◽  
Author(s):  
C. Brugger ◽  
S. Weithoffer ◽  
C. de Schryver ◽  
U. Wasenmüller ◽  
N. Wehn

Powerful compute clusters and multi-core systems have become widely available in research and industry. This boost in utilizable computational power tempts people to run compute-intensive tasks on those clusters, either for speed or for accuracy reasons. Monte Carlo simulations in particular, with their inherent parallelism, promise very high speedups. Nevertheless, the quality of Monte Carlo simulations strongly depends on the quality of the employed random numbers. In this work we present a comprehensive analysis of state-of-the-art pseudo-random number generators such as the MT19937 and the WELL generator used for parallel stream generation in different settings. These random number generators can be realized in hardware as well as in software and help to accelerate the analysis (or simulation) of communications systems. We show that it is possible to generate high-quality parallel random number streams with both generators, as long as some configuration constraints are met. We furthermore show that distributed simulations with these generator types are viable even at very high degrees of parallelism.
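
As a small illustration of parallel stream generation, the sketch below uses NumPy's MT19937 bit generator with its jump-ahead facility; NumPy does not ship the WELL generator, and the pi estimate is only a stand-in workload, not the communications-system simulations analysed in the paper.

```python
import numpy as np

# Independent parallel streams from a single MT19937 state via jump-ahead.
# MT19937.jumped(i) advances the state by i * 2**128 draws, giving
# non-overlapping substreams for parallel workers.
base = np.random.MT19937(seed=12345)
streams = [np.random.Generator(base.jumped(i)) for i in range(4)]

def estimate_pi(gen, n=1_000_000):
    """Stand-in workload: Monte Carlo estimate of pi from one stream."""
    x, y = gen.random(n), gen.random(n)
    return 4.0 * np.mean(x * x + y * y <= 1.0)

# Each worker draws only from its own stream; results stay reproducible.
print([round(estimate_pi(g), 4) for g in streams])
```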


Author(s):  
Anders Veldt Soborg ◽  
Peter Friis Hansen

The assessment of a ship's intact stability is traditionally based on a semi-empirical deterministic concept that evaluates the characteristics of the ship's calm-water restoring lever arm curves. Today the ship is considered safe with respect to dynamic stability if its calm-water lever arm curves exhibit sufficient characteristics with respect to the slope at zero heel (GM value), the maximum lever arm, the positive range of stability, and the area below the lever arm curve. The rule-based requirements on calm-water lever arm curves are entirely based on experience obtained from vessels in operation and recorded accidents in the past. The rules therefore leave little room for evaluating and improving the safety of a ship's dynamic stability. A few studies have evaluated the probability of ship stability loss in waves using Monte Carlo simulations. However, since this probability may be on the order of 10⁻⁴ per ship year, such brute-force Monte Carlo simulations are not always feasible due to the required computational resources. Previous studies of the dynamic stability of ships in waves have typically focused on the capsizing event. In this study the objective is to establish a procedure that can identify the "critical wave patterns" that will most likely lead to the occurrence of a considered adverse event. Examples of such adverse events are stability loss, loss of maneuverability, cargo damage, and seasickness. The adverse events related to dynamic stability are considered as functions of the roll angle, the roll velocity, and the roll acceleration. This study therefore describes how the considered adverse events can be combined into a single utility function whose scale expresses different magnitudes of the criticality (or assessed consequences) of the adverse events. It is illustrated how the distribution of the exceedance probability may be established by estimating the out-crossing rate of the "safe set" defined by the utility function. This out-crossing rate is established using the so-called Madsen's formula. A by-product of this analysis is a set of short wave time series that, at different exceedance levels, may be used in a codified evaluation of a vessel's intact stability in waves.
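
A brute-force numerical counterpart of the out-crossing idea is sketched below: counting threshold up-crossings in a synthetic roll signal and converting the rate to an exceedance probability under a Poisson assumption. The signal, the threshold, and the crossing count are illustrative assumptions; they do not reproduce Madsen's analytical formula or the utility-function formulation of the study.

```python
import numpy as np

def mean_outcrossing_rate(u, threshold, dt):
    """Empirical mean up-crossing rate of `threshold` by the series u(t).
    A brute-force stand-in for the analytical out-crossing estimate."""
    up = (u[:-1] < threshold) & (u[1:] >= threshold)
    return np.count_nonzero(up) / (len(u) * dt)

# Synthetic narrow-band signal as a placeholder for simulated roll motion.
rng = np.random.default_rng(7)
dt, T = 0.1, 3600.0
t = np.arange(0.0, T, dt)
roll = sum(a * np.cos(w * t + rng.uniform(0, 2 * np.pi))
           for a, w in zip([4.0, 2.0, 1.0], [0.55, 0.62, 0.70]))   # degrees

nu = mean_outcrossing_rate(roll, threshold=6.0, dt=dt)
print(f"estimated out-crossing rate: {nu:.4f} per second")
# Under a Poisson assumption, the exceedance probability over a duration D
# is approximately 1 - exp(-nu * D).
```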


2014 ◽  
Vol 12 (2) ◽  
pp. 229
Author(s):  
Marco Aurélio Dos Santos Sanfins ◽  
Danilo Soares Monte-Mor

Given the recent international crises and the increasing number of defaults, several researchers have attempted to develop metrics that calculate the probability of insolvency with higher accuracy. The approaches commonly used, however, consider neither the credit risk nor the severity of the distance between receivables and obligations across different periods. In this paper we mathematically present an approach that allows us to estimate the insolvency risk by considering not only future receivables and obligations, but also the severity of the distance between them and the quality of the respective receivables. Using Monte Carlo simulations and hypothetical examples, we show that our metric is able to estimate the insolvency risk with high accuracy. Moreover, our results suggest that in the absence of a smooth distribution between receivables and obligations, there is a non-null insolvency risk even when the present value of the receivables is larger than the present value of the obligations.
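
The following sketch illustrates the general idea in its simplest form: simulate per-period receivables subject to default and check whether the running balance ever goes negative. The cash-flow profile and default probability are invented for illustration; this is not the metric proposed in the paper.

```python
import numpy as np

def insolvency_probability(receivables, obligations, default_prob,
                           n_sims=100_000, seed=0):
    """Probability that the cumulative cash position goes negative in any
    period, when each period's receivable is lost with `default_prob`.
    A generic illustration, not the specific metric proposed in the paper."""
    rng = np.random.default_rng(seed)
    receivables = np.asarray(receivables, dtype=float)
    obligations = np.asarray(obligations, dtype=float)
    paid = rng.random((n_sims, len(receivables))) > default_prob
    balance = np.cumsum(paid * receivables - obligations, axis=1)
    return np.mean((balance < 0).any(axis=1))

# Receivables exceed obligations in total, yet a default in any single
# period can push the running balance negative.
print(insolvency_probability(receivables=[60, 60, 120, 120],
                             obligations=[50, 50, 80, 80],
                             default_prob=0.05))
```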

