Influence of Ice Floe Shape and Distribution on Ship Resistance

Author(s):  
Marc Cahay ◽  
Gabriel Fabiano de Lima

Abstract Many ice basin tests have been performed on ships to assess the ice resistance of the hull in an ice floe field. During these tests, many parameters are studied, the most important being the transit speed, the thickness and the concentration of the ice. Given the cost and time required to carry out these basin test campaigns, it is imperative to keep the number of tests to the strict minimum, whilst still making it possible to draw conclusions about the sizing of the vessel. Hence, the influence of ice floe shape and distribution in the field is generally not considered. One way to carry out sensitivity studies on these parameters is to use numerical simulations in addition to basin tests. Few advanced numerical design tools are available on the market, especially ones able to cope with any kind of structure geometry and a large variety of ice interaction and failure mechanisms. In 2012 TechnipFMC, Cervval and Bureau Veritas initiated a common development program to offer a new tool for the design of offshore structures interacting with ice, combining a variety of models and approaches such as analytical, numerical and empirical. This numerical tool, called Ice-MAS (www.ice-mas.com), uses multi-agent technology and can combine, in a common framework, multiple phenomena of various natures and heterogeneous scales (e.g. drag, friction, ice-sheet bending failure, local crushing and rubble stack-up). The study presented in this paper compares simulation results for different ice floe fields, varying not only the concentration, the maximum floe size and their distribution, but also the way the ice floes and their shapes are generated.
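The floe-field generation the abstract refers to can be illustrated with a minimal sketch: drop random polygonal floes into a rectangular domain until a target areal concentration is reached. This is an assumption-laden toy (overlaps between floes are ignored, floes may cross the domain boundary), not the Ice-MAS generator; all function names here are hypothetical.

```python
import math
import random

def make_floe(cx, cy, radius, n_vertices=8):
    """Build one polygonal floe by jittering points on a circle."""
    angles = sorted(random.uniform(0, 2 * math.pi) for _ in range(n_vertices))
    return [(cx + radius * random.uniform(0.7, 1.0) * math.cos(a),
             cy + radius * random.uniform(0.7, 1.0) * math.sin(a))
            for a in angles]

def polygon_area(poly):
    """Shoelace formula for the area of a simple polygon."""
    n = len(poly)
    s = sum(poly[i][0] * poly[(i + 1) % n][1]
            - poly[(i + 1) % n][0] * poly[i][1] for i in range(n))
    return abs(s) / 2.0

def generate_field(width, height, target_concentration, max_radius):
    """Drop random floes until the target areal concentration is reached.

    Overlaps are ignored in this sketch, so the covered area is an
    overestimate at high concentrations.
    """
    floes, covered = [], 0.0
    domain_area = width * height
    while covered / domain_area < target_concentration:
        r = random.uniform(0.3 * max_radius, max_radius)  # crude size distribution
        floe = make_floe(random.uniform(0, width),
                         random.uniform(0, height), r)
        floes.append(floe)
        covered += polygon_area(floe)
    return floes

field = generate_field(1000.0, 1000.0, target_concentration=0.5, max_radius=30.0)
```

Varying `target_concentration`, `max_radius`, the size distribution, or `make_floe` itself (e.g. breaking a sheet into tessellated fragments instead of dropping independent polygons) is exactly the kind of sensitivity study the paper performs numerically.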

Author(s):  
Marc Cahay ◽  
Brian A. Roberts ◽  
Sami Sadouni ◽  
Pierre-Antoine Béal ◽  
Cyril Septseault ◽  
...  

In 2012 TechnipFMC, Cervval and Bureau Veritas initiated a common development program to offer a new tool for the design of offshore structures interacting with ice, combining a variety of models and approaches. This numerical tool, called Ice-MAS (www.ice-mas.com), uses multi-agent technology and can combine, in a common framework, multiple phenomena of various natures and heterogeneous scales (e.g. drag, friction, ice-sheet bending failure, local crushing and rubble stack-up). It can simulate the loading of a drifting ice sheet (with or without ridges) on predefined structures such as conical, cylindrical, sloping and vertical walls, or artificial islands, as well as on more complex geometries defined by a user input file, such as semi-submersible floaters with pontoons and columns, allowing detailed results to be obtained for the different parts of the structure. This paper presents the overall functionalities of Ice-MAS and the different possibilities for modelling a semi-submersible floater. It focuses on the results obtained for different geometries subjected to ice-sheet loading at different incidence angles. The issues related to the anchoring of the platform are addressed in a simplified way.


Author(s):  
Marc Cahay ◽  
Brian A. Roberts ◽  
Kenton Pike ◽  
Pierre-Antoine Béal ◽  
Cyril Septseault ◽  
...  

In 2012 TechnipFMC, Cervval and Bureau Veritas initiated a common development program to offer a new tool for the design of offshore structures interacting with ice, combining a variety of models and approaches. This numerical tool, called Ice-MAS (www.ice-mas.com), uses multi-agent technology and can combine, in a common framework, multiple phenomena of various natures and heterogeneous scales (e.g. drag, friction, ice-sheet bending failure, local crushing and rubble stack-up). The current development phase consists of determining the forces generated by an iceberg during an impact on an offshore structure. This paper provides an overview of the latest Ice-MAS developments. It introduces the main functionalities of the simulation tool and the different options for modelling an offshore structure. It then focuses on the modelling approach used for the iceberg, the calculation of the different hydrodynamic coefficients and their variability with the separation distance from the structure. The model used to compute the impact load is detailed, including the local crushing behavior, which is simulated by a pressure-area correlation.
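The pressure-area correlation mentioned for local crushing can be sketched generically: crushing pressure decreases as the contact patch grows, commonly modelled as p = c * A^(-e). The coefficients below are illustrative placeholders, not the values calibrated in Ice-MAS.

```python
def crushing_force(contact_area, c=7.4, e=0.7):
    """Local ice crushing force from a generic pressure-area relation.

    p = c * A**(-e), with p in MPa and A in m^2. The coefficients are
    illustrative only; a real analysis uses values fitted to test data.
    """
    pressure = c * contact_area ** (-e)    # MPa
    return pressure * contact_area * 1e6   # force in newtons

# Pressure drops as the contact patch grows, so force grows sublinearly:
f_small = crushing_force(1.0)   # 7.4 MN over 1 m^2
f_large = crushing_force(10.0)  # well below 10x f_small
```

The sublinear growth of force with contact area is why the time history of the contact patch during an iceberg impact matters as much as the peak area itself.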


2018 ◽  
Vol 146 (7) ◽  
pp. 2247-2270 ◽  
Author(s):  
Sergey Frolov ◽  
Douglas R. Allen ◽  
Craig H. Bishop ◽  
Rolf Langland ◽  
Karl W. Hoppel ◽  
...  

Abstract The local ensemble tangent linear model (LETLM) provides an alternative method for creating the tangent linear model (TLM) and adjoint of a nonlinear model that promises to be easier to maintain and more computationally scalable than earlier methods. In this paper, we compare the ability of the LETLM to predict the difference between two nonlinear trajectories of the Navy’s global weather prediction model at low resolution (2.5° at the equator) with that of the TLM currently used in the Navy’s four-dimensional variational (4DVar) data assimilation scheme. When compared to the pair of nonlinear trajectories, the traditional TLM and the LETLM have improved skill relative to persistence everywhere in the atmosphere, except for temperature in the planetary boundary layer. In addition, the LETLM was, on average, more accurate than the traditional TLM (error reductions of about 20% in the troposphere and 10% overall). Sensitivity studies showed that the LETLM was most sensitive to the number of ensemble members, with the performance gradually improving with increased ensemble size up to the maximum size attempted (400). Inclusion of physics in the LETLM ensemble leads to a significantly improved representation of the boundary layer winds (error reductions of up to 50%), in addition to improved winds and temperature in the free troposphere and in the upper stratosphere/lower mesosphere. The computational cost of the LETLM was dominated by the cost of ensemble propagation. However, the LETLM can be precomputed before the 4DVar data assimilation algorithm is executed, leading to a significant computational advantage.
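The core idea behind an ensemble tangent linear model can be sketched in a few lines: fit a linear propagator, by least squares, to ensemble perturbations at two consecutive times. This toy omits the localization that gives the LETLM its name and its scalability, and uses a known linear "truth" only to check the fit; the function name is hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def build_etlm(X0, X1):
    """Least-squares linear propagator mapping perturbations at t0 to t1.

    X0, X1: (state_dim, n_members) matrices of ensemble perturbations
    (member minus ensemble mean) at two consecutive times.
    Minimizes ||L @ X0 - X1||_F, giving L = X1 @ pinv(X0).
    """
    return X1 @ np.linalg.pinv(X0)

n, K = 4, 40                       # state dimension, ensemble size
M = rng.standard_normal((n, n)) * 0.3   # toy "true" tangent linear dynamics
X0 = rng.standard_normal((n, K))        # ensemble perturbations at t0
X1 = M @ X0                             # perturbations propagated to t1

L = build_etlm(X0, X1)

# With K > n the fit recovers the true dynamics, so an unseen
# perturbation is propagated correctly:
dx = rng.standard_normal(n)
print(np.allclose(L @ dx, M @ dx))  # → True
```

The abstract's sensitivity to ensemble size has a direct analogue here: with fewer members than effective degrees of freedom, the least-squares fit is underdetermined, which is what localization (fitting small local patches of the state) mitigates.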


TAPPI Journal ◽  
2012 ◽  
Vol 11 (7) ◽  
pp. 29-35 ◽  
Author(s):  
PETER W. HART ◽  
DALE E. NUTTER

During the last several years, the increasing cost and decreasing availability of mixed southern hardwoods have resulted in financial and production difficulties for southern U.S. mills that use a significant percentage of hardwood kraft pulp. Traditionally, in the United States, hardwoods are not plantation grown because of the growth time required to produce a quality tree suitable for pulping. One potential method of mitigating the cost and supply issues associated with the use of native hardwoods is to grow eucalyptus in plantations for the sole purpose of producing hardwood pulp. However, most of the eucalyptus species used in pulping elsewhere in the world are not capable of surviving in the southern U.S. climate. This study examines the potential of seven different cold-tolerant eucalyptus species to be used as replacements for, or supplements to, mixed southern hardwoods. The laboratory pulping and bleaching aspects of these seven species are discussed, along with pertinent mill operational data. Selected mill trial data also are reviewed.


2015 ◽  
Vol 8 (2/3) ◽  
pp. 180-205 ◽  
Author(s):  
Alireza Jahani ◽  
Masrah Azrifah Azmi Murad ◽  
Md. Nasir bin Sulaiman ◽  
Mohd. Hasan Selamat

Purpose – The purpose of this paper is to propose an approach that integrates three complementary perspectives: multi-agent systems, fuzzy logic and case-based reasoning. Unsatisfied customers, information overload and high uncertainty are the main challenges faced by today’s supply chains. In addition, few existing agent-based approaches are tied to real-world supply chain functions like supplier selection, and those that are tend to be static and do not adequately take qualitative and quantitative factors into consideration. Therefore, an agent-based framework is needed to address these issues. Design/methodology/approach – The proposed approach integrates the three complementary perspectives, multi-agent systems, fuzzy logic and case-based reasoning, in a common framework. These perspectives have rarely been used together in previous studies. Furthermore, an exploratory case study in an office furniture company is undertaken to illustrate the value of the framework. Findings – The proposed agent-based framework evaluates supply offers based on customers’ preferences, recommends alternative products in the case of stock-out and provides a collaborative environment among the agents representing different supply chain entities. The proposed fuzzy case-based reasoning (F-CBR) approach reduces information overload by organizing information into relevant cases, which lessens the overall search across cases. In addition, its fuzzy aspect addresses the high uncertainty of supply chains, especially when different customers’ orders carry different preferences. Research limitations/implications – The present study does not include the functions of inventory management and negotiation between agents. Furthermore, only the case description and case retrieval phases of the case-based reasoning approach are investigated; the remaining phases, such as case retaining, case reusing and case revising, are outside the scope of this paper.
Originality/value – This framework balances the interests of different supply chain structural elements where each of them is represented by a specific agent for better collaboration, decision-making and problem-solving in a multi-agent environment. In addition, the supplier selection and order gathering mechanisms are developed based on customers’ orders.
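The fuzzy case retrieval step described above can be illustrated with a minimal sketch: score each stored case against a customer order using fuzzy membership functions centred on the order's preferred values, then retrieve the best match. The attribute names, membership shape, and weights below are invented for illustration; they are not the paper's F-CBR design.

```python
def triangular(x, a, b, c):
    """Triangular fuzzy membership function peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def similarity(order, case, weights):
    """Weighted fuzzy similarity between an order and a stored case.

    Each case attribute is scored by its membership in a fuzzy set
    centred on the order's preferred value (±50% support, an arbitrary
    choice for this sketch).
    """
    score = 0.0
    for attr, w in weights.items():
        v = order[attr]
        score += w * triangular(case[attr], 0.5 * v, v, 1.5 * v)
    return score / sum(weights.values())

def retrieve(order, cases, weights):
    """Case retrieval: return the stored case most similar to the order."""
    return max(cases, key=lambda c: similarity(order, c, weights))

cases = [
    {"price": 100, "lead_time": 5, "quality": 0.9},
    {"price": 80, "lead_time": 10, "quality": 0.7},
]
order = {"price": 95, "lead_time": 6, "quality": 0.85}
weights = {"price": 0.5, "lead_time": 0.3, "quality": 0.2}
best = retrieve(order, cases, weights)  # the first case matches best
```

Because membership degrades gradually rather than cutting off at a threshold, near-miss cases still score partial credit, which is how the fuzzy aspect absorbs the order-to-order variation in preferences the abstract mentions.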


1992 ◽  
Vol 22 (7) ◽  
pp. 980-983 ◽  
Author(s):  
Richard G. Oderwald ◽  
Elizabeth Jones

Formulas are derived for determining the total number of sample points and the number of volume points required for a point double sample, with a ratio-of-means estimator, to replace a point sample and achieve the same variance. A minimum ratio of the cost of measuring volume to the cost of measuring basal area at a point is determined, above which the point double sample will be less costly, in terms of the time required to measure points, than the point sample.
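The estimator being analyzed can be demonstrated with a small simulation: measure the cheap variable (basal area) at every point, the expensive one (volume) only on a subsample, and scale the overall basal-area mean by the volume/basal-area ratio observed on the subsample. This is a generic illustration of ratio-of-means double sampling, not the paper's sample-size derivation; the simulated stand parameters are arbitrary.

```python
import random

random.seed(1)

# Simulated stand: basal area x (cheap to measure) and volume y
# (expensive) are strongly correlated, as double sampling assumes.
N = 2000                                  # points where basal area is measured
n = 400                                   # subsample where volume is also measured
x = [random.gauss(30, 6) for _ in range(N)]
y = [2.5 * xi + random.gauss(0, 5) for xi in x]

xbar_all = sum(x) / N                     # basal-area mean, all points
xbar_sub = sum(x[:n]) / n                 # basal-area mean, subsample
ybar_sub = sum(y[:n]) / n                 # volume mean, subsample

R = ybar_sub / xbar_sub                   # ratio of means on the subsample
y_hat = R * xbar_all                      # double-sample estimate of mean volume

true_mean = sum(y) / N                    # what a full point sample would measure
```

The double sample measures volume at only 400 of 2000 points yet tracks the full-sample mean closely, because most of the variation in y is carried by the cheap covariate x; whether that trade pays off depends on the volume-to-basal-area cost ratio the paper bounds.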


2017 ◽  
Author(s):  
Sonny S Bleicher

Landscapes of Fear (LOF), the spatially explicit distribution of perceived predation risk as seen by a population, is increasingly cited in the ecological literature and has become a frequently used buzzword. With this increase in popularity, it has become necessary to clarify the definition of the term, suggest its boundaries and propose a common framework for its use. The LOF, as a progeny of the “ecology of fear” conceptual framework, defines fear as the strategic manifestation of the cost-benefit analysis of food and safety tradeoffs. In addition to direct predation risk, the LOF is affected by individuals’ energetic state and by inter- and intra-specific competition, and is constrained by the evolutionary history of each species. Herein, based on current applications of the LOF conceptual framework, I suggest that future research in this framework be directed towards: (1) finding applied management uses as a trait defining a population’s habitat use and habitat suitability; (2) studying the multi-dimensional distribution of risk assessment through time and space; (3) studying variability between individuals within a population; and (4) measuring the eco-neurological implications of risk as a feature of environmental heterogeneity.


Quantum ◽  
2018 ◽  
Vol 2 ◽  
pp. 78 ◽  
Author(s):  
M. B. Hastings

We give a quantum algorithm to exactly solve certain problems in combinatorial optimization, including weighted MAX-2-SAT as well as problems where the objective function is a weighted sum of products of Ising variables, all terms of the same degree D; this problem is called weighted MAX-ED-LIN2. We require that the optimal solution be unique for odd D and doubly degenerate for even D; however, we expect that the algorithm still works without this condition, and we show how to reduce to the case without this assumption at the cost of an additional overhead. While the time required is still exponential, the algorithm provably outperforms Grover's algorithm assuming a mild condition on the number of low-energy states of the target Hamiltonian. The detailed analysis of the runtime reveals a tradeoff between the number of such states and the algorithm's speed: fewer such states allow a greater speedup. This leads to a natural hybrid algorithm that finds either an exact or an approximate solution.
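For reference, the classical baseline that both Grover's algorithm and this algorithm are measured against is exhaustive search over all 2^n assignments. A minimal sketch of that baseline for weighted MAX-2-SAT (the clause encoding and function name are choices made for this example, not from the paper):

```python
from itertools import product

def max2sat_bruteforce(n_vars, clauses):
    """Exhaustive classical baseline for weighted MAX-2-SAT: O(2^n) time.

    clauses: list of (weight, (lit1, lit2)) where a literal is +i or -i
    for variable i (1-indexed), negative meaning negated.
    Returns (best_weight, best_assignment).
    """
    def lit_true(lit, assignment):
        val = assignment[abs(lit) - 1]
        return val if lit > 0 else not val

    best_w, best_a = -1, None
    for assignment in product([False, True], repeat=n_vars):
        w = sum(wt for wt, (l1, l2) in clauses
                if lit_true(l1, assignment) or lit_true(l2, assignment))
        if w > best_w:
            best_w, best_a = w, assignment
    return best_w, best_a

# (x1 v x2, w=3), (!x1 v x3, w=2), (!x2 v !x3, w=4), (x2 v x3, w=1)
clauses = [(3, (1, 2)), (2, (-1, 3)), (4, (-2, -3)), (1, (2, 3))]
best_w, best_a = max2sat_bruteforce(3, clauses)
print(best_w)  # → 10 (all four clauses satisfiable here)
```

Grover's algorithm searches the same 2^n space in roughly 2^(n/2) queries; the abstract's claim is a provably better exponent than that, under the stated condition on the number of low-energy states.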


1997 ◽  
Vol 13 (01) ◽  
pp. 57-73
Author(s):  
Michael Wade ◽  
Philip C. Koenig ◽  
Zbigniew J. Karaszewski ◽  
John Gallagher ◽  
John Dougherty ◽  
...  

The NAVSEA Mid-Term Sealift Ship Technology Development Program (MTSSTDP) has been tasked by the Office of the Chief of Naval Operations in charge of Strategic Sealift (N-42) to investigate technologies and design concepts that would improve performance and reduce the cost of future ships useful to the Navy for military sealift. A major area of concentration has been design-for-production, or ship producibility. This area of the program required the Navy to obtain extensive industrial involvement from both domestic and international sources. A thorough description is presented of the plans, objectives and accomplishments of the five producibility-related tasks. Topics covered are: generic build strategy, product work breakdown structure, production-oriented cost estimating, engine room arrangement modeling, and global standards development. In addition, other aspects of the program covered are the use of industry-led teams, the implementation of integrated product approaches, and the application of risk-based technologies.


1973 ◽  
Vol 10 (04) ◽  
pp. 334-363
Author(s):  
Peter M. Kimon ◽  
Ronald K. Kiss ◽  
Joseph D. Porricelli

This paper presents the results of a study of very large crude carriers (VLCCs) to determine the cost and effectiveness of variations in the capacity of segregated ballast and variations in arrangement expected to reduce oil pollution due to operational and accidental causes. The arrangements considered include double bottoms, double sides, double skin, and alternating cargo and ballast wing tanks. The paper concentrates on a series of 250,000-dwt tankers, but does consider the influence of size by including results for tankers of 120,000 dwt and 477,000 dwt. The degree of effectiveness is estimated for both operational and accidental pollution based on the best available data. Sensitivity studies are provided to check general conclusions. Finally, estimates of the cost of preventing one cubic meter of oil pollution with each design are presented. A discussion of operating factors subject to a reduction in performance as a result of design features is given.

