Decommissioning Offshore Structures by Extraction of Foundation Monopiles Applying a Vibratory Hammer

2021 ◽  
Author(s):  
Rob van Dorp ◽  
Peter Middendorp ◽  
Marcel Bielefeld ◽  
Gerald Verbeek

Abstract The vibratory hammer is one of the tools for the extraction of offshore foundation piles as well as monopiles for the decommissioning of offshore structures. In addition to the standard application, where a pile is driven downward to be installed, a vibratory hammer can also be applied to extract piles. For an efficient and commercially attractive application of vibratory hammers for this purpose, the extraction process needs to be modeled during the planning phase to ensure that the appropriate equipment is used. This paper describes how pile driving simulation software can be used to model the extraction process. This is further illustrated through a case study covering the extraction phases of the first (onshore) and second (offshore) parts of the Delft Offshore Turbine Project. A monopile with a diameter of 4.0 m was extracted approximately 6 months after installation onshore, and then extracted several times offshore shortly after installation in the second phase. The paper presents not only the actual extraction predictions, but also the monitoring data obtained during extraction and the results of the post-analysis.
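The go/no-go question behind such an extraction prediction can be illustrated with a simple static force balance: the crane must overcome the pile's submerged weight plus the shaft friction that remains once vibration has degraded it. This is an illustrative sketch only, not the pile driving simulation software used in the paper, and all numbers are hypothetical:

```python
# Illustrative static check for vibratory monopile extraction.
# All values are hypothetical, not taken from the Delft Offshore Turbine case.

def required_pull_kN(pile_weight_kN, shaft_friction_kN, friction_degradation):
    """Crane line pull needed once vibration degrades shaft friction.

    friction_degradation: fraction of static shaft friction remaining
    during vibration (e.g. 0.15 means an 85% reduction).
    """
    return pile_weight_kN + shaft_friction_kN * friction_degradation

# Hypothetical 4.0 m monopile some months after installation:
pile_weight = 2500.0       # kN, submerged pile weight
static_friction = 12000.0  # kN, fully set-up shaft friction
degradation = 0.15         # remaining fraction under vibration

pull = required_pull_kN(pile_weight, static_friction, degradation)
crane_capacity = 6000.0    # kN, available line pull

print(f"required pull: {pull:.0f} kN, feasible: {pull <= crane_capacity}")
```

In practice the simulation software models the dynamic soil response rather than a single degradation factor, but the same feasibility comparison drives the equipment selection.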

TAPPI Journal ◽  
2012 ◽  
Vol 11 (8) ◽  
pp. 17-24 ◽  
Author(s):  
HAKIM GHEZZAZ ◽  
LUC PELLETIER ◽  
PAUL R. STUART

The evaluation and process risk assessment of (a) lignin precipitation from black liquor, and (b) the near-neutral hemicellulose pre-extraction for recovery boiler debottlenecking in an existing pulp mill is presented in Part I of this paper, which was published in the July 2012 issue of TAPPI Journal. In Part II, the economic assessment of the two biorefinery process options is presented and interpreted. A mill process model was developed using WinGEMS software and used for calculating the mass and energy balances. Investment costs, operating costs, and profitability of the two biorefinery options were calculated using standard cost estimation methods. The results show that the two biorefinery options are profitable for the case study mill and effective at process debottlenecking. The after-tax internal rate of return (IRR) of the lignin precipitation process option was estimated to be 95%, while that of the hemicellulose pre-extraction process option was 28%. Sensitivity analysis showed that the after-tax IRR of the lignin precipitation process remains higher than that of the hemicellulose pre-extraction process option for all changes in the selected sensitivity parameters. If we consider the after-tax IRR, as well as capital cost, as selection criteria, the results show that for the case study mill, the lignin precipitation process is more promising than the near-neutral hemicellulose pre-extraction process. However, the comparison between the two biorefinery options should include long-term evaluation criteria. The potential of high value-added products that could be produced from lignin in the case of the lignin precipitation process, or from ethanol and acetic acid in the case of the hemicellulose pre-extraction process, should also be considered in the selection of the most promising process option.
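The after-tax IRR used as the selection criterion is the discount rate at which the net present value of the project's cash flows is zero. A minimal sketch of that calculation, using bisection and a hypothetical cash-flow series (not the mill data from the study):

```python
# After-tax IRR from a yearly cash-flow series, solved by bisection.
# The cash flows below are hypothetical, not taken from the case study mill.

def npv(rate, cash_flows):
    """Net present value of year-indexed cash flows at a discount rate."""
    return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cash_flows))

def irr(cash_flows, lo=0.0, hi=10.0, tol=1e-9):
    """Internal rate of return: the rate at which NPV = 0 (bisection)."""
    for _ in range(200):
        mid = (lo + hi) / 2.0
        if npv(mid, cash_flows) > 0.0:
            lo = mid   # NPV still positive: the root lies at a higher rate
        else:
            hi = mid
        if hi - lo < tol:
            break
    return (lo + hi) / 2.0

# Hypothetical project: -100 investment, then 60/year after tax for 5 years.
flows = [-100.0] + [60.0] * 5
print(f"after-tax IRR: {irr(flows):.1%}")
```

Bisection assumes a single sign change in the cash-flow series (one investment followed by returns), which holds for conventional project profiles like these.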


2021 ◽  
Vol 11 (13) ◽  
pp. 5826
Author(s):  
Evangelos Axiotis ◽  
Andreas Kontogiannis ◽  
Eleftherios Kalpoutzakis ◽  
George Giannakopoulos

Ethnopharmacology experts face several challenges when identifying and retrieving documents and resources related to their scientific focus. The volume of sources that need to be monitored, the variety of formats utilized, and the different quality of language use across sources present some of what we call “big data” challenges in the analysis of this data. This study aims to understand if and how experts can be supported effectively through intelligent tools in the task of ethnopharmacological literature research. To this end, we utilize a real case study of ethnopharmacology research aimed at the southern Balkans and the coastal zone of Asia Minor. Thus, we propose a methodology for more efficient research in ethnopharmacology. Our work follows an “expert–apprentice” paradigm in an automatic URL extraction process, through crawling, where the apprentice is a machine learning (ML) algorithm, utilizing a combination of active learning (AL) and reinforcement learning (RL), and the expert is the human researcher. ML-powered research improved the effectiveness and efficiency of the domain expert by 3.1 and 5.14 times, respectively, fetching a total number of 420 relevant ethnopharmacological documents in only 7 h versus an estimated 36 h of human-expert effort. Therefore, utilizing artificial intelligence (AI) tools to support the researcher can boost the efficiency and effectiveness of the identification and retrieval of appropriate documents.
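The expert-apprentice interaction described above can be sketched as an uncertainty-sampling loop: the apprentice queries the expert only on the URLs it is least sure about and updates its relevance model from each label. This toy sketch is illustrative only; the study's actual system combines active learning with reinforcement learning inside a crawler, and the URL scheme and scoring below are invented for the example:

```python
# Toy "expert-apprentice" active-learning loop for URL relevance.
# Illustrative only; not the AL+RL crawler described in the study.
import random

random.seed(0)

def oracle(url):
    """The human expert: labels a URL as relevant (1) or not (0)."""
    return 1 if "ethno" in url or "pharma" in url else 0

def score(url, pos, neg):
    """Apprentice's relevance estimate from tokens seen in labeled URLs."""
    tokens = set(url.split("/"))
    p, n = len(tokens & pos), len(tokens & neg)
    return 0.5 if p + n == 0 else p / (p + n)

urls = ([f"site{i}/ethno" for i in range(5)]
        + [f"site{i}/pharma" for i in range(5)]
        + [f"site{i}/sports" for i in range(10)])
random.shuffle(urls)

pos, neg, labeled = set(), set(), {}
for _ in range(8):  # expert-label budget
    # Uncertainty sampling: query the URL the apprentice is least sure about.
    pool = [u for u in urls if u not in labeled]
    u = min(pool, key=lambda u: abs(score(u, pos, neg) - 0.5))
    labeled[u] = oracle(u)
    (pos if labeled[u] else neg).update(u.split("/"))

relevant = [u for u in urls if score(u, pos, neg) > 0.5]
print(f"expert labels used: {len(labeled)}, urls kept: {len(relevant)}")
```

The efficiency gain reported in the abstract comes from exactly this asymmetry: the expert supplies a handful of labels while the apprentice filters the full stream of candidate documents.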


2020 ◽  
Vol 12 (24) ◽  
pp. 10686
Author(s):  
Mona Abouhamad ◽  
Metwally Abu-Hamd

The objective of this paper is to apply the life cycle assessment (LCA) methodology to assess the environmental impacts of light steel-framed buildings fabricated from cold-formed steel (CFS) sections. The assessment covers all phases of the building's life span, from material production through construction and use to the end of building life, in addition to loads and benefits from reuse/recycling after building disposal. The life cycle inventory and environmental impact indicators are estimated using the Athena Impact Estimator for Buildings. The input data related to the building materials are extracted from a building information model of the building, while the operating energy in the use phase is calculated using energy simulation software. The Athena Impact Estimator calculates the following mid-point environmental measures: global warming potential (GWP), acidification potential, human health potential, ozone depletion potential, smog potential, eutrophication potential, primary and non-renewable energy (PE) consumption, and fossil fuel consumption. The LCA methodology was applied to a case study of a university building. Results of the case study related to GWP and PE were as follows. The building foundations were responsible for 29% of the embodied GWP and 20% of the embodied PE, while the CFS skeleton was responsible for 30% of the embodied GWP and 49% of the embodied PE. The production stage was responsible for 90% of the embodied GWP and embodied PE. When benefits associated with recycling/reuse were included in the analysis according to Module D of EN 15978, the embodied GWP was reduced by 15.4% while the embodied PE was reduced by 6.22%. Compared with conventional construction systems, the CFS framing systems had much lower embodied GWP and PE.
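The Module D accounting of EN 15978 can be illustrated with a short calculation: recycling/reuse credits are reported separately and then netted against the embodied (modules A-C) total. The absolute values below are hypothetical and chosen only so the percentages match those reported in the abstract:

```python
# Net embodied impact with Module D recycling/reuse credits (EN 15978).
# Absolute values are hypothetical; only the percentages follow the abstract.

embodied_gwp = 1000.0        # kg CO2-eq, modules A-C (hypothetical total)
credit_gwp = 154.0           # kg CO2-eq, Module D recycling/reuse credit
embodied_pe = 10000.0        # MJ, modules A-C (hypothetical total)
credit_pe = 622.0            # MJ, Module D credit

net_gwp = embodied_gwp - credit_gwp
net_pe = embodied_pe - credit_pe

print(f"GWP reduced by {credit_gwp / embodied_gwp:.1%} "
      f"to {net_gwp} kg CO2-eq")
print(f"PE  reduced by {credit_pe / embodied_pe:.2%} to {net_pe} MJ")
```

Keeping the Module D credit as a separate line item, rather than folding it into modules A-C, is what allows the 15.4% and 6.22% reductions to be reported transparently.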


2011 ◽  
Vol 421 ◽  
pp. 250-253
Author(s):  
Hu Zhu ◽  
Xiao Guang Yang

To lay the foundation for automating line heating forming, this paper proposes a method for heating path generation and simulation for ship plate steel based on the STL model. The line heating path is generated by slicing the STL model of the steel plate with a series of planes. Models of the heating equipment are built, and the heating process of the ship plate steel can be simulated by loading these equipment models into a simulation system developed with VC++ and OpenGL. The case study shows that the method effectively overcomes the inconvenience of manual heating, that the whole heating process can be observed through the simulation so that it can be reasonably monitored, and that the heating path generation and simulation software runs stably and reliably.
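The slicing step can be sketched as follows: each triangle of the STL mesh is intersected with a cutting plane (here z = c), and the resulting segments are chained into one heating path. This is a minimal geometric illustration, not the authors' VC++/OpenGL implementation:

```python
# Intersect STL triangles with a horizontal plane z = c to obtain
# heating-path segments (minimal sketch, not the paper's system).

def edge_plane_point(p, q, c):
    """Intersection of edge p-q with the plane z = c, or None if no crossing."""
    (x1, y1, z1), (x2, y2, z2) = p, q
    if (z1 - c) * (z2 - c) > 0 or z1 == z2:
        return None  # both endpoints on the same side, or edge in-plane
    t = (c - z1) / (z2 - z1)
    return (x1 + t * (x2 - x1), y1 + t * (y2 - y1), c)

def slice_triangle(tri, c):
    """Return the segment where a triangle crosses the plane, if any."""
    pts = []
    for i in range(3):
        pt = edge_plane_point(tri[i], tri[(i + 1) % 3], c)
        if pt is not None and pt not in pts:  # dedupe shared vertices
            pts.append(pt)
    return tuple(pts) if len(pts) == 2 else None

# One triangle spanning the plane z = 0.5:
tri = ((0.0, 0.0, 0.0), (1.0, 0.0, 1.0), (0.0, 1.0, 1.0))
print(slice_triangle(tri, 0.5))
```

Sweeping a series of such planes across the plate model and collecting the segments per plane yields the family of heating lines described in the paper.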


2018 ◽  
Vol 204 ◽  
pp. 02009 ◽  
Author(s):  
Dani Yuniawan ◽  
P.P Aang Fajar ◽  
Samsudin Hariyanto ◽  
Romi Setiawan

Currently, the Mergan four-way intersection is one of the most congested intersections in Malang City, East Java, Indonesia. This research applies a simulation method to provide several options for managing the traffic queue at the Mergan intersection. The simulation proceeds through several phases, from problem identification through verification and validation to scenario simulation. Arena Simulation software v.14 was chosen as the tool for modeling the traffic queue line. The research yields several solutions through the Traffic Light 2 simulation scenario, under which the simulated traffic flow system runs with fewer queued vehicles.
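The effect a signal-timing scenario has on queue length can be illustrated with a minimal fixed-cycle queue model: vehicles accumulate during red and discharge at the saturation flow rate during green. This is an illustrative sketch only; the study uses Arena Simulation v.14 with calibrated field data, and all rates below are hypothetical:

```python
# Minimal fixed-cycle traffic-light queue model (illustrative only;
# the study models the Mergan intersection in Arena Simulation v.14).

def simulate_queue(arrival_rate, service_rate, green_s, red_s, cycles):
    """Queue length (vehicles) at the end of each signal cycle.

    Rates are vehicles per second; service_rate is the saturation
    discharge rate while the light is green.
    """
    queue, history = 0.0, []
    for _ in range(cycles):
        queue += arrival_rate * red_s                  # arrivals during red
        served = min(queue + arrival_rate * green_s,   # arrivals continue
                     service_rate * green_s)           # capped by saturation flow
        queue = queue + arrival_rate * green_s - served
        history.append(queue)
    return history

# Hypothetical base timing: 30 s green, 60 s red -> queue grows every cycle.
base = simulate_queue(0.3, 0.5, green_s=30, red_s=60, cycles=5)
# Hypothetical revised timing: 60 s green, 30 s red -> queue clears.
revised = simulate_queue(0.3, 0.5, green_s=60, red_s=30, cycles=5)
print(base, revised)
```

The contrast between the two runs mirrors the kind of comparison a scenario study makes: the same demand, served or not depending on how green time is allocated.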


2018 ◽  
Vol 203 ◽  
pp. 03005
Author(s):  
Idzham Fauzi Mohd Ariff ◽  
Mardhiyah Bakir

A dynamic simulation model was developed, calibrated and validated for a petrochemical plant in Terengganu, Malaysia. Calibration and validation of the model were conducted based on plant monitoring data spanning 3 years, resulting in a model accuracy (RMSD) for effluent chemical oxygen demand (COD), ammoniacal nitrogen (NH3-N), and total suspended solids (TSS) of ±11.7 mg/L, ±0.52 mg/L, and ±3.27 mg/L respectively. The simulation model has since been used for troubleshooting during plant upsets, planning of plant turnarounds and developing upgrade options. A case study is presented where the simulation model was used to assist in troubleshooting and rectification of a plant upset where ingress of a surfactant compound resulted in high effluent TSS and COD. The model was successfully used in the incident troubleshooting activities and provided critical insights that assisted the plant operators to quickly respond and bring the system back to a normal, stable condition.
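The accuracy measure quoted above, RMSD, is the root-mean-square deviation between simulated and observed effluent values. A minimal sketch of that calculation with hypothetical COD readings (not the plant's actual data):

```python
# RMSD between simulated and observed effluent quality.
# The series below are hypothetical, not the Terengganu plant data.
import math

def rmsd(simulated, observed):
    """Root-mean-square deviation, the model-accuracy measure quoted above."""
    assert len(simulated) == len(observed)
    return math.sqrt(sum((s - o) ** 2 for s, o in zip(simulated, observed))
                     / len(simulated))

# Hypothetical effluent COD series (mg/L):
sim = [110.0, 95.0, 102.0, 120.0]
obs = [100.0, 98.0, 110.0, 115.0]
print(f"COD RMSD: {rmsd(sim, obs):.2f} mg/L")
```

Computed over three years of monitoring data per determinand, this single number summarizes how closely the calibrated model tracks each effluent parameter.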


Author(s):  
Paula T. Nascimento ◽  
Marco A. P. Rosas ◽  
Leonardo Brandão ◽  
Fernando Castanheira

The present study compares the progressive collapse approach with the traditional temperature screening method for determining passive fire protection (PFP) requirements on topside offshore structures. Evaluating the consequences of fire scenarios on the global integrity and stability of topside modules can yield a substantial reduction in the required amount of PFP, and consequently significant cost savings for operators, compared to the traditional approach. In the case study presented in this paper, PFP allocation is reduced by 79%.


Author(s):  
Vanessa Tobias ◽  

In fisheries monitoring, catch is assumed to be a product of fishing intensity, catchability, and availability, where availability is defined as the number or biomass of fish present and catchability refers to the relationship between catch rate and the true population. Ecological monitoring programs use catch per unit of effort (CPUE) to standardize catch and monitor changes in fish populations; however, CPUE is proportional to the portion of the population that is vulnerable to the type of gear used in sampling, which is not necessarily the entire population. Programs often deal with this problem by assuming that catchability is constant, but if catchability is not constant, it is not possible to separate the effects of catchability and population size using monitoring data alone. This study uses individual-based simulation to separate the effects of changing environmental conditions on catchability and availability in environmental monitoring data. The simulation combines a module for sampling conditions with a module for individual fish behavior to estimate the proportion of available fish that would escape from the sample. The method is applied to the case study of the well-monitored fish species Delta Smelt (Hypomesus transpacificus) in the San Francisco Estuary, where it has been hypothesized that changing water clarity may affect catchability for long-term monitoring studies. Results of this study indicate that given constraints on Delta Smelt swimming ability, it is unlikely that the apparent declines in Delta Smelt abundance are the result of changing water clarity affecting catchability.
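The core of such an individual-based approach can be sketched as a gear-avoidance model: a fish escapes the sample if it detects the approaching net early enough to swim clear, so clearer water (a longer detection distance) lowers catchability. This is a toy illustration with invented parameters, not the simulation modules used in the study:

```python
# Toy individual-based sketch of gear avoidance: a fish escapes if it
# detects the net early enough to swim clear. Illustrative only; not the
# sampling-conditions or fish-behavior modules used in the study.
import random

def escape_probability(detection_dist_m, burst_speed_ms, tow_speed_ms,
                       net_half_width_m, n_fish=10000, seed=1):
    """Fraction of simulated fish that clear the net mouth before it arrives."""
    rng = random.Random(seed)
    escapes = 0
    for _ in range(n_fish):
        offset = rng.uniform(0.0, net_half_width_m)    # lateral start position
        time_to_net = detection_dist_m / tow_speed_ms  # time until net arrives
        swim_dist = burst_speed_ms * time_to_net       # distance fish can cover
        if offset + swim_dist > net_half_width_m:      # fish clears the edge
            escapes += 1
    return escapes / n_fish

# Longer detection distance (clearer water) -> more escapes:
turbid = escape_probability(2.0, 0.15, 1.0, 1.0)
clear = escape_probability(5.0, 0.15, 1.0, 1.0)
print(f"escape fraction, turbid: {turbid:.2f}, clear: {clear:.2f}")
```

The study's conclusion rests on the burst-speed term: if the species' swimming ability caps the achievable escape distance, even large changes in detection distance cannot move catchability enough to explain the observed decline.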


Author(s):  
Erma Susanti ◽  
Khabib Mustofa

Abstract Information extraction is a field of natural language processing that converts unstructured text into structured information. Many kinds of information on the Internet are transmitted in unstructured form via websites, creating the need for a technology that can analyze text and discover relevant knowledge as structured information. The main content of a web page is one example of unstructured information. Various approaches to information extraction have been developed by many researchers, using either manual or automatic methods, but their performance still needs to be improved with respect to extraction accuracy and speed. This research proposes an information extraction approach that combines bootstrapping with Ontology-Based Information Extraction (OBIE). The bootstrapping approach, which uses a small seed of labeled data, is used to minimize human intervention in the extraction process, while an ontology guides the extraction of classes, properties, and instances to provide semantic content for the semantic web. Combining the two approaches is expected to increase both the speed of the extraction process and the accuracy of the extraction results. The information extraction system is applied to a case study using the "LonelyPlanet" dataset. Keywords— Information extraction, ontology, bootstrapping, Ontology-Based Information Extraction, OBIE, performance
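The bootstrapping idea can be sketched in a few lines: seed instances induce extraction patterns from the corpus, and those patterns in turn harvest new instances, with minimal human labeling. This toy sketch uses an invented relation and corpus; the proposed system additionally types the extracted instances against an ontology (OBIE):

```python
# Minimal bootstrapping sketch: seed instances -> patterns -> new instances.
# Toy relation and corpus; illustrative only, not the proposed OBIE system.

corpus = [
    "the capital city of France is Paris",
    "the capital city of Japan is Tokyo",
    "the capital city of Peru is Lima",
]

seeds = {("France", "Paris")}
patterns, instances = set(), set(seeds)

for _ in range(2):  # a couple of bootstrapping rounds
    # 1) Induce patterns from sentences that contain a known instance pair.
    for country, city in list(instances):
        for s in corpus:
            if country in s and city in s:
                patterns.add(s.replace(country, "{X}").replace(city, "{Y}"))
    # 2) Match patterns against the corpus to harvest new instance pairs.
    for p in patterns:
        prefix, rest = p.split("{X}")
        middle, _ = rest.split("{Y}")
        for s in corpus:
            if s.startswith(prefix) and middle in s:
                x, y = s[len(prefix):].split(middle, 1)
                instances.add((x, y))

print(sorted(instances))
```

In the OBIE combination, each harvested pair would additionally be checked and typed against ontology classes and properties before being accepted, which is what supplies the semantic content for the semantic web.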


2019 ◽  
Vol 26 (4) ◽  
pp. 39-46 ◽  
Author(s):  
Ozgur Ozguc

Abstract Offshore structures are exposed to the risk of damage caused by various types of extreme and accidental events, such as fire, explosion, collision, and dropped objects. These events cause structural damage in the impact area, including yielding of materials, local buckling, and in some cases local failure and penetration. The structural response of an FPSO hull subjected to events involving dropped objects is investigated in this study, and non-linear finite element analyses are carried out using the explicit dynamic code LS-DYNA. The scenarios involving dropped objects are based on the impact from the fall of a container and rigid mechanical equipment. Impact analyses of the dropped objects demonstrated that even though some structural members were permanently deformed by the drop loads, no failure occurred according to the plastic strain criteria of the NORSOK standards. The findings and insights derived from the present study may be informative in the safe design of floating offshore structures.
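The load case driving such an analysis is set by the kinetic energy of the falling object at impact, assuming free fall. A minimal sketch of that screening calculation, with hypothetical mass and drop height (not values from the FPSO study):

```python
# Impact energy of a dropped object, the input load case for such analyses.
# Mass and drop height are hypothetical, not taken from the FPSO study.

G = 9.81  # m/s^2, gravitational acceleration

def impact_energy_kJ(mass_kg, drop_height_m):
    """Kinetic energy at impact assuming free fall in air: E = m * g * h."""
    return mass_kg * G * drop_height_m / 1000.0

# Hypothetical 8-tonne container dropped from a 20 m crane lift:
print(f"impact energy: {impact_energy_kJ(8000.0, 20.0):.0f} kJ")
```

The finite element model then resolves how that energy is absorbed: as elastic deformation, permanent plastic deformation, or, if the plastic strain criterion is exceeded, local failure.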

