Downtime Technique Using Artificial Intelligence: A Case Study for an Exposed Berthing Facility

Author(s):  
Ghassan El Chahal ◽  
Peter A. B. Morel ◽  
Sindhu Mole ◽  
Nadjib Saadali

Abstract Hydrodynamic modelling has improved significantly over the last decade; however, the coupling of these hydrodynamic models with methods for estimating berth downtime due to environmental conditions (wind, waves, currents) remains less developed. The large number of environmental inputs (wind speed, wind direction, wave height, wave period, wave direction, current speed, current direction) and mooring outputs (vessel motions in six degrees of freedom, mooring line and fender forces) to be considered in a downtime study requires major simplifications, as the full wind-wave time series cannot be calculated. Downtime assessment is therefore normally simplified by calculating a limited number of wind-wave combinations. A new approach is developed and presented in this paper in which the downtime is calculated for the long-term environmental time series using artificial neural networks. Neural networks are among the most capable artificial intelligence tools for solving very complex problems such as the present case. The approach has been applied to a bulk terminal project located in the Arabian Sea. The terminal has no shelter and is fully exposed to wind and swell waves. The wave climate for a 15-year period is established at the project site using the spectral wave modelling software MIKE 21 SW. A large set of combined environmental conditions is selected for the dynamic mooring analysis of the vessel at berth, using the time domain mooring analysis software MIKE 21 MA. An in-house Matlab program based on a neural network is developed to calculate the downtime for the long-term environmental time series from the dynamic vessel response for the set of selected environmental combinations. This approach provides a more accurate downtime estimate, which is important for the operability of such exposed facilities.
The downtime tool is also tested for different sets of environmental combinations and mooring layouts in order to assess the sensitivity of the downtime estimate to these parameters. To the authors' knowledge, this is the first published work applying artificial intelligence techniques to downtime studies.
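The surrogate-model idea at the core of this approach can be sketched in a few lines: a small network is fitted to the dynamic mooring results for the sampled environmental combinations, then evaluated over the full metocean time series to count exceedances of an operability limit. Everything in the sketch below (the linear motion response, the two input variables, the 2.5 m limit, the hand-rolled network) is illustrative only, not the paper's actual data or Matlab implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative stand-in for the dynamic mooring results: an assumed
# peak surge motion for each environmental combination (Hs, wind speed).
def peak_motion(hs, wind):
    return 0.8 * hs + 0.05 * wind

# Training set: the "selected environmental combinations" that would be
# run in the time-domain mooring model.
hs = rng.uniform(0.5, 4.0, 200)
wind = rng.uniform(0.0, 20.0, 200)
X = np.column_stack([hs, wind])
y = peak_motion(hs, wind)

# Normalise inputs, then fit a one-hidden-layer tanh network by
# full-batch gradient descent (stand-in for the neural-network tool).
mu, sd = X.mean(0), X.std(0)
Xn = (X - mu) / sd
W1 = rng.normal(0, 0.5, (2, 8)); b1 = np.zeros(8)
W2 = rng.normal(0, 0.5, (8, 1)); b2 = np.zeros(1)
lr = 0.05
for _ in range(3000):
    h = np.tanh(Xn @ W1 + b1)
    pred = (h @ W2 + b2).ravel()
    err = pred - y
    gpred = (2 / len(y)) * err[:, None]     # d(MSE)/d(pred)
    gW2 = h.T @ gpred; gb2 = gpred.sum(0)
    gh = gpred @ W2.T * (1 - h ** 2)        # back through tanh
    gW1 = Xn.T @ gh; gb1 = gh.sum(0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

def predict(X):
    Xn = (X - mu) / sd
    return (np.tanh(Xn @ W1 + b1) @ W2 + b2).ravel()

# Long-term environmental time series (a 15-year hourly record would be
# ~131,000 rows; a short synthetic series is used here).
ts = np.column_stack([rng.uniform(0.5, 4.0, 5000),
                      rng.uniform(0.0, 20.0, 5000)])

MOTION_LIMIT = 2.5  # hypothetical operability criterion, metres
downtime_fraction = float(np.mean(predict(ts) > MOTION_LIMIT))
```

The payoff is in the last two lines: once trained on the limited set of mooring runs, the surrogate is cheap enough to evaluate the operability criterion at every record of the long-term time series.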

2021 ◽  
Author(s):  
Felipe M. Moreno ◽  
Eduardo A. Tannuri

Abstract The methodology described in this paper reduces a large set of combined wind, wave, and current conditions to a smaller set that still adequately represents the desired site for ship maneuvering simulations. This is achieved by running fast-time simulations for the entire set of environmental conditions and recording the vessel's drifting time series while it is controlled by an automatic pilot based on a line-of-sight algorithm. The cases are then grouped according to how similar the vessel's drifting time series are, and one environmental condition is selected to represent each group found by the cluster analysis. The dissimilarity between the time series is measured using Dynamic Time Warping, and the cluster analysis combines the Partitioning Around Medoids algorithm with the Silhouette Method. Validation is performed through maneuvering simulations conducted with a Second Deck Officer.
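The pipeline described above, DTW dissimilarities between drifting time series followed by Partitioning Around Medoids on the distance matrix, can be sketched as follows. The series, the cluster count, and the minimal PAM loop are all illustrative (the silhouette step for choosing the number of clusters is omitted for brevity):

```python
import numpy as np

def dtw(a, b):
    # Classic dynamic-programming DTW distance between two 1-D series.
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

# Synthetic drifting time series: two behaviour families, each with
# time shifts that DTW should absorb.
t = np.linspace(0, 2 * np.pi, 60)
series = [np.sin(t + s) for s in (0.0, 0.4, 0.8)] + \
         [0.05 * (t + s) ** 2 for s in (0.0, 0.3, 0.6)]

# Pairwise DTW dissimilarity matrix.
n = len(series)
D = np.zeros((n, n))
for i in range(n):
    for j in range(i + 1, n):
        D[i, j] = D[j, i] = dtw(series[i], series[j])

def pam(D, k, iters=20):
    # Minimal Partitioning Around Medoids on a precomputed
    # distance matrix: assign to nearest medoid, then move each
    # medoid to the member minimising within-cluster distance.
    medoids = list(range(k))
    for _ in range(iters):
        labels = np.argmin(D[:, medoids], axis=1)
        new = []
        for c in range(k):
            members = np.where(labels == c)[0]
            costs = D[np.ix_(members, members)].sum(axis=1)
            new.append(members[np.argmin(costs)])
        if new == medoids:
            break
        medoids = new
    return medoids, np.argmin(D[:, medoids], axis=1)

medoids, labels = pam(D, k=2)
# Each medoid series corresponds to the representative environmental
# condition kept for the real-time maneuvering simulations.
```

A design point worth noting: PAM (unlike k-means) needs only the distance matrix and always returns an actual member as the cluster centre, which is exactly what this selection problem requires, since the representative must be a real environmental condition that can be simulated.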


Author(s):  
Ghassan El Chahal

Abstract Downtime related to excessive vessel motions and/or mooring line forces caused by environmental conditions is an important parameter for designing new terminals and ports, in addition to planning offshore operations such as dredging, structure installation, etc. The present paper addresses an advanced approach using artificial intelligence to estimate the downtime in a more realistic manner. This approach is compared with conventional methods used in the industry and is applied to a number of terminal projects presented in this paper. Operations at marine terminals are generally protected by a breakwater. These breakwaters, whether rubble mound or caisson type, are expensive, as they are constructed in water depths of 20 m or more. Conventional downtime methods generally yield a higher value, which in turn requires a longer breakwater once the downtime criterion is exceeded. The advanced approach for estimating downtime helps to optimize the terminal/breakwater layout and subsequently saves on CAPEX. This approach estimates the downtime for the long-term environmental time series using an in-house Matlab program developed using a neural network. In order to estimate the downtime, a set of specialized studies is conducted first, illustrating the breadth and depth of port engineering. First, the wave climate for a long-term period is established at the project site using the spectral wave model MIKE 21 SW. The wave conditions are then transformed to the breakwater leeside (terminal side) using the Boussinesq wave model MIKE 21 BW. A dynamic mooring study for the vessels at the terminal is carried out using the time domain mooring analysis software MIKE 21 MA. Finally, an in-house Matlab program calculates the downtime for the metocean time series based on the dynamic vessel response for the large set of selected environmental combinations.
To the author's knowledge, this is the first published work highlighting the limitations of conventional methods and the importance of implementing advanced techniques. This leads to new thinking about how terminals are to be designed in the future.


2016 ◽  
Vol 9 (1) ◽  
pp. 53-62 ◽  
Author(s):  
R. D. García ◽  
O. E. García ◽  
E. Cuevas ◽  
V. E. Cachorro ◽  
A. Barreto ◽  
...  

Abstract. This paper presents the reconstruction of a 73-year time series of the aerosol optical depth (AOD) at 500 nm at the subtropical high-mountain Izaña Atmospheric Observatory (IZO) located in Tenerife (Canary Islands, Spain). For this purpose, we have combined AOD estimates from artificial neural networks (ANNs) from 1941 to 2001 and AOD measurements directly obtained with a Precision Filter Radiometer (PFR) between 2003 and 2013. The analysis is limited to summer months (July–August–September), when the largest aerosol load is observed at IZO (Saharan mineral dust particles). The ANN AOD time series has been comprehensively validated against coincident AOD measurements performed with a solar spectrometer Mark-I (1984–2009) and AERONET (AErosol RObotic NETwork) CIMEL photometers (2004–2009) at IZO, obtaining rather good agreement on a daily basis: a Pearson coefficient, R, of 0.97 between AERONET and ANN AOD, and 0.93 between Mark-I and ANN AOD estimates. In addition, we have analysed the long-term consistency between the ANN AOD time series and long-term meteorological records identifying Saharan mineral dust events at IZO (synoptical observations and local wind records). Both analyses provide consistent results, with correlations  >  85 %. Therefore, we can conclude that the reconstructed AOD time series captures the AOD variations and dust-laden Saharan air mass outbreaks well on short-term and long-term timescales, and it is thus suitable for use in climate analysis.
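The core of such a validation, restricting both records to coincident days and computing the Pearson coefficient between them, is simple to sketch. The dates and AOD values below are invented for illustration, not IZO, AERONET, or Mark-I data:

```python
import numpy as np

# Hypothetical daily records: a reconstructed (ANN) AOD series and a
# reference photometer series that overlap on only some dates.
ann = {"2005-07-01": 0.12, "2005-07-02": 0.45, "2005-07-03": 0.30,
       "2005-08-10": 0.80, "2005-08-11": 0.15, "2005-09-05": 0.55}
ref = {"2005-07-02": 0.43, "2005-07-03": 0.33, "2005-08-10": 0.78,
       "2005-08-11": 0.18, "2005-09-05": 0.52, "2005-09-06": 0.40}

# Validation is performed on coincident days only.
common = sorted(ann.keys() & ref.keys())
x = np.array([ann[d] for d in common])
y = np.array([ref[d] for d in common])

# Daily-basis Pearson correlation coefficient between the two records.
r = float(np.corrcoef(x, y)[0, 1])
```

Restricting to the intersection of dates before correlating is the step that matters here; comparing the full, unaligned series would mix non-coincident conditions into the statistic.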


Author(s):  
Chris Reed

Using artificial intelligence (AI) technology to replace human decision-making will inevitably create new risks whose consequences are unforeseeable. This naturally leads to calls for regulation, but I argue that it is too early to attempt a general system of AI regulation. Instead, we should work incrementally within the existing legal and regulatory schemes which allocate responsibility, and therefore liability, to persons. Where AI clearly creates risks which current law and regulation cannot deal with adequately, then new regulation will be needed. But in most cases, the current system can work effectively if the producers of AI technology can provide sufficient transparency in explaining how AI decisions are made. Transparency ex post can often be achieved through retrospective analysis of the technology's operations, and will be sufficient if the main goal is to compensate victims of incorrect decisions. Ex ante transparency is more challenging, and can limit the use of some AI technologies such as neural networks. It should only be demanded by regulation where the AI presents risks to fundamental rights, or where society needs reassuring that the technology can safely be used. Masterly inactivity in regulation is likely to achieve a better long-term solution than a rush to regulate in ignorance. This article is part of a discussion meeting issue ‘The growing ubiquity of algorithms in society: implications, impacts and innovations'.


Author(s):  
Jessica A. F. Thompson

Much of the controversy evoked by the use of deep neural networks as models of biological neural systems amounts to debates over what constitutes scientific progress in neuroscience. In order to discuss what constitutes scientific progress, one must have a goal in mind (progress towards what?). One such long-term goal is to produce scientific explanations of intelligent capacities (e.g., object recognition, relational reasoning). I argue that the most pressing philosophical questions at the intersection of neuroscience and artificial intelligence are ultimately concerned with defining the phenomena to be explained and with what constitute valid explanations of such phenomena. I propose that a foundation in the philosophy of scientific explanation and understanding can scaffold future discussions about how an integrated science of intelligence might progress. Towards this vision, I review relevant theories of scientific explanation and discuss strategies for unifying the scientific goals of neuroscience and AI.


2015 ◽  
Vol 8 (9) ◽  
pp. 9075-9103 ◽  
Author(s):  
R. D. García ◽  
O. E. García ◽  
E. Cuevas ◽  
V. E. Cachorro ◽  
A. Barreto ◽  
...  

Abstract. This paper presents the reconstruction of the 73-year time series of the aerosol optical depth (AOD) at 500 nm at the subtropical high-mountain Izaña Atmospheric Observatory (IZO) located in Tenerife (Canary Islands, Spain). For this purpose, we have combined AOD estimates from artificial neural networks (ANNs) from 1941 to 2001 and AOD measurements directly obtained with a Precision Filter Radiometer (PFR) between 2003 and 2013. The analysis is limited to summer months (July–August–September), when the largest aerosol load is observed at IZO (Saharan mineral dust particles). The ANN AOD time series has been comprehensively validated against coincident AOD measurements performed with a solar spectrometer Mark-I (1984–2009) and AERONET (AErosol RObotic NETwork) CIMEL photometers (2004–2009) at IZO, obtaining rather good agreement on a daily basis: a Pearson coefficient, R, of 0.97 between AERONET and ANN AOD, and 0.93 between Mark-I and ANN AOD estimates. In addition, we have analyzed the long-term consistency between the ANN AOD time series and long-term meteorological records identifying Saharan mineral dust events at IZO (synoptical observations and local wind records). Both analyses provide consistent results, with correlations larger than 85 %. Therefore, we can conclude that the reconstructed AOD time series captures the AOD variations and dust-laden Saharan air mass outbreaks well on short-term and long-term timescales, and it is thus suitable for use in climate analysis.


2019 ◽  
Vol 61 ◽  
pp. 01030 ◽  
Author(s):  
Marek Vochozka ◽  
Jaromír Vrbka

The exchange rate is one of the most closely monitored economic variables, whether from the perspective of individual citizens, economists, financial institutions or entrepreneurs. In the long run it reflects the condition of the economy, while in the short and medium term it has a significant impact on the economy. A currency's time series maps past developments and current status, and can also be used to predict future developments. This article analyzes the time series of the EUR to Yuan exchange rate using artificial intelligence. It aims to evaluate this development and to predict the future development of the EUR to Yuan rate.
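The standard first step in any such analysis is to reframe the exchange-rate series as a supervised learning problem: a window of lagged values predicts the next value. The sketch below uses a synthetic rate series and a linear autoregression fitted by least squares as the simplest stand-in for the article's neural model; the rate values, the 10-day window, and the train/test split point are all invented for illustration:

```python
import numpy as np

# Synthetic EUR/CNY-like daily rate (the real series would be loaded
# from market data): a smooth trend plus a seasonal swing.
t = np.arange(400)
rate = 7.8 + 0.001 * t + 0.05 * np.sin(2 * np.pi * t / 30)

def make_windows(series, lag):
    # Frame the series as supervised learning:
    # row j holds series[j .. j+lag-1], target is series[j+lag].
    X = np.column_stack([series[i:len(series) - lag + i]
                         for i in range(lag)])
    y = series[lag:]
    return X, y

LAG = 10
X, y = make_windows(rate, LAG)
split = 300  # first 300 windows for fitting, the rest held out

# Linear autoregression with intercept, fitted by least squares; a
# neural predictor would take the same inputs and the same
# one-step-ahead target.
A = np.column_stack([np.ones(split), X[:split]])
coef, *_ = np.linalg.lstsq(A, y[:split], rcond=None)

# One-step-ahead predictions on the held-out tail.
pred = np.column_stack([np.ones(len(X) - split), X[split:]]) @ coef
mae = float(np.mean(np.abs(pred - y[split:])))
```

Because the synthetic signal is a trend plus a single sinusoid, a lag-10 linear model can track it almost exactly; the point of the sketch is the windowing step, which carries over unchanged to any nonlinear model.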


Author(s):  
Christian Hillbrand

The motivation for this chapter is the observation that many companies build their strategy upon poorly validated hypotheses about cause and effect among certain business variables. However, the soundness of these cause-and-effect relations, as well as knowledge of the approximate shape of the functional dependencies underlying these associations, turns out to be the biggest issue for the quality of the results of decision-supporting procedures. Since it is sufficiently clear that mere correlation of time series is not suitable to prove the causality of two business concepts, there seems to be a rather dogmatic perception of the inadmissibility of empirical validation mechanisms for causal models within the field of strategic management as well as management science. However, one can find proven causality techniques in other sciences such as econometrics, mechanics, neuroscience, and philosophy. Therefore, this chapter presents an approach which applies a combination of well-established statistical causal proofing methods to strategy models in order to validate them. These validated causal strategy models are then used as the basis for approximating the functional form of causal dependencies by means of artificial neural networks. This in turn can be employed to build an approximate simulation or forecasting model of the strategic system.

