Real-Time Tsunami Prediction System Based on Seafloor Observatory Data Applied to the Inland Sea, Japan

2018 ◽  
Vol 52 (3) ◽  
pp. 120-127 ◽  
Author(s):  
Narumi Takahashi ◽  
Kentaro Imai ◽  
Kentaro Sueki ◽  
Ryoko Obayashi ◽  
Masanobu Ishibashi ◽  
...  

Abstract: The damage and loss of life caused by tsunamis can be reduced by timely warnings that predict arrival times and maximum heights, supporting evacuations and other mitigating actions. We have developed a real-time tsunami prediction system based on data from the Dense Oceanfloor Network system for Earthquakes and Tsunamis (DONET); the system has been implemented in several local governments along the Pacific coast of Japan. It generates estimates of tsunami arrival times, heights, inundation areas, and worst-case scenarios using selected fault rupture models. The main objective of this paper is to show that the system can be applied to an area with complicated topography, and we report a successful application in Sakaide, a city on the Shikoku coast of the Inland Sea, using a simulated great plate-boundary earthquake in the Nankai Trough. The simulated tsunami propagates to Sakaide along complicated routes between several islands. According to tsunami waveforms calculated for 1,506 cases, tsunamis propagating into the Inland Sea have a relatively uniform frequency content, regardless of the magnitude of the causative event, after passing through the narrow straits of the Inland Sea. At the same time, waves are amplified as they pass between the islands of Shodoshima and Shikoku through interaction with reflected waves. These effects are compatible with this prediction system, and we confirmed that our predicted tsunami is consistent with the final result from a model of a magnitude 9 Nankai Trough earthquake.
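The abstract describes a prediction scheme built on a precomputed bank of fault-rupture cases. The sketch below illustrates the general idea of matching observed offshore amplitudes against such a scenario database to retrieve pre-simulated coastal arrival times and heights; the function names, data layout, and least-squares matching metric are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch: pick the precomputed scenario whose offshore amplitudes
# best match the observations, then report its pre-simulated coastal forecast.
import numpy as np

def select_scenario(observed, scenario_bank):
    """Return the scenario with the smallest least-squares misfit to the
    observed offshore amplitudes at the DONET-like stations."""
    best_case, best_misfit = None, np.inf
    for case in scenario_bank:
        misfit = np.sum((case["offshore_amplitude"] - observed) ** 2)
        if misfit < best_misfit:
            best_case, best_misfit = case, misfit
    return best_case

# Toy bank of two scenarios (amplitudes in metres at three stations).
bank = [
    {"name": "Mw8.6_caseA", "offshore_amplitude": np.array([0.8, 1.1, 0.6]),
     "arrival_min": 112.0, "max_height_m": 2.4},
    {"name": "Mw9.0_caseB", "offshore_amplitude": np.array([1.9, 2.3, 1.5]),
     "arrival_min": 108.0, "max_height_m": 3.8},
]
obs = np.array([1.7, 2.2, 1.4])
best = select_scenario(obs, bank)
print(best["name"], best["arrival_min"], best["max_height_m"])
```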

Geosciences ◽  
2019 ◽  
Vol 9 (2) ◽  
pp. 102
Author(s):  
Tomohiro Kubo ◽  
Wataru Suzuki ◽  
Masahiro Ooi ◽  
Narumi Takahashi ◽  
Kazumi Asao ◽  
...  

We applied a real-time tsunami inundation forecast system to a disaster response plan. We developed a standard operating procedure (SOP) for tsunami disaster response, based on a Plan, Do, Check, Action (PDCA) cycle, to make effective use of the tsunami observation and prediction information provided by a real-time tsunami inundation forecast system during the initial response to a tsunami disaster. In the Plan stage, we ran a workshop on tsunami disaster response to review the current tsunami disaster response plan and develop a timeline plan for a tsunami disaster. In the Do stage, we conducted a tabletop exercise (TTX) for a tsunami disaster using the real-time tsunami prediction system. In the Check stage, we ran a workshop for an after-action review of the TTX. In the Action stage, we incorporated the SOPs for the real-time tsunami prediction system into the tsunami disaster management plan and conducted a second TTX. As a result, we verified the information that a real-time tsunami prediction system should provide in order to be incorporated into the tsunami disaster management plans of real municipalities. We confirmed that the SOP we developed allows government staff to use a real-time tsunami inundation forecast system to respond safely and effectively during a disaster.


2020 ◽  
Author(s):  
Adam Wspanialy ◽  
Sean Toczko ◽  
Nobu Eguchi ◽  
Lena Maeda ◽  
Kan Aoike ◽  
...  

IODP Expedition 358 planned to access and sample the subducting plate boundary at the Nankai Trough, Japan. It commenced on 7 October 2018 and ended on 31 March 2019, marking the final stage of the NanTroSEIZE project. The goal was to drill down to the plate boundary fault, about 5 km below the ocean floor, where magnitude 8 and greater earthquakes occur regularly every 100–150 years. Successful completion would have produced the deepest borehole in the history of scientific ocean drilling and would have greatly deepened our understanding of fault mechanics, earthquake inception, and tsunami generation processes.

Expedition 358 intended to access the plate boundary fault zone system by deepening the previously drilled and suspended Hole C0002P. The original operational objective of Expedition 358 was to reach a total depth of 7267.5 mbrt (approximately 5200 mbsf) in four drilled sections. Previous major riser drilling efforts during IODP Expeditions 338 and 348 advanced the main riser hole at Site C0002 (Hole C0002F/N/P) to 3058.5 m below sea floor (mbsf). Extensive downhole logging data and limited intervals of core were collected during those expeditions.

Given the nature of the drilling operation and the anticipated challenges, JAMSTEC adopted oil and gas industry drilling standards and held two detailed Drilling Well on Paper (DWOP) workshops as part of a very rigorous preparatory stage. A great deal of time was spent selecting new and state-of-the-art drilling and circulating techniques, logging tools, bits, and drilling fluid formulations, including a new mud sealant additive, "FracSeal", to minimize borehole integrity issues as much as possible. The drilling stages saw the implementation of a novel concept of near-real-time geomechanics to continuously monitor and assess borehole integrity.

The challenges arising from side-tracking near the bottom of the previously drilled Hole C0002P (Expedition 348, 2014) proved greater than the multi-disciplinary teams expected, and the overall objectives set for Expedition 358 were not achieved. Nevertheless, despite the significant problems encountered during several attempts, the hole was deepened by 204 m. This is a modest success, and it is believed that, once away from the highly damaged area of Hole C0002P, drilling can produce a high-integrity hole, provided there is excellent communication and exchange of recommendations between the drilling and scientific teams during complex operations, especially in environments such as the Nankai Accretionary Prism.

Despite not achieving the ultimate goal of the expedition, the implemented industry drilling standards, real-time surveillance system, real-time geomechanics, improved and strict communication protocols, and the integration of the scientific and drilling teams demonstrated their value and should become standard practice during future IODP/ICDP operations.


Geosciences ◽  
2020 ◽  
Vol 10 (6) ◽  
pp. 226 ◽  
Author(s):  
Daniel Giles ◽  
Brian McConnell ◽  
Frédéric Dias

Tsunamis are infrequent events that have the potential to be extremely destructive. The last major tsunami to affect the Irish coastline was the 1755 Lisbon event. That event serves as a candidate worst-case scenario for hazard assessment, and its impacts on the Irish coastline are presented here. As there is no general consensus on the 1755 earthquake source, multiple sources highlighted in the literature are investigated. These sources are used to generate the initial conditions, and the resultant tsunami waves are simulated with the massively parallelised Volna-OP2 finite volume tsunami code. The hazard associated with the event is captured at three gradated levels. A reduced, faster-than-real-time tsunami ensemble is produced for the North-East Atlantic at the regional level in 93 s using two Nvidia V100 GPUs. By identifying the most vulnerable sections of the Irish coastline from this regional forecast, locally refined simulations are then carried out, also in a faster-than-real-time setting. As arrival times at the coastline can be on the order of minutes, these faster-than-real-time reduced ensembles are of great benefit for tsunami warning, and Volna-OP2's capabilities in this respect are clearly demonstrated here. Finally, high-resolution inundation simulations, which build upon the ensemble results, are carried out. To date, this study provides the best estimate of the hazard associated with a Lisbon-type tsunami event for the Irish coastline. The inundation mapping shows that, along the vulnerable sections of coastline, inundation is confined to low-lying areas, with maximum run-up heights of 3.4 m.
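The gradated workflow described above (regional ensemble, then locally refined runs at the most exposed sections) can be summarised with a simple aggregation step. The sketch below assumes each candidate 1755-type source has already been simulated (e.g. with a solver such as Volna-OP2) and yields per-coastal-point maximum heights and arrival times; the arrays and thresholds are illustrative placeholders, not the authors' data.

```python
# Aggregate a reduced tsunami ensemble into a hazard envelope and pick the
# coastal points that warrant locally refined, higher-resolution simulations.
import numpy as np

rng = np.random.default_rng(0)
n_sources, n_coastal_points = 5, 100

# Placeholder outputs of the per-source regional simulations.
max_height = rng.uniform(0.2, 3.0, size=(n_sources, n_coastal_points))    # metres
arrival_min = rng.uniform(90.0, 240.0, size=(n_sources, n_coastal_points))  # minutes

# Envelope over the ensemble: worst-case height and earliest arrival.
envelope_height = max_height.max(axis=0)
earliest_arrival = arrival_min.min(axis=0)

# Flag the most vulnerable points (assumed 2 m threshold) for refined runs.
vulnerable = np.flatnonzero(envelope_height > 2.0)
print(f"{vulnerable.size} points exceed 2 m in at least one scenario; "
      f"earliest arrival {earliest_arrival.min():.0f} min")
```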


2016 ◽  
Vol 50 (3) ◽  
pp. 87-91 ◽  
Author(s):  
Morifumi Takaesu ◽  
Hiroki Horikawa ◽  
Kentaro Sueki ◽  
Narumi Takahashi ◽  
Akira Sonoda ◽  
...  

Abstract: Mega-thrust earthquakes are anticipated to occur in the Nankai Trough in southwest Japan. In order to monitor seismicity, crustal deformation, and tsunamis in the earthquake source areas, we deployed the seafloor seismic network DONET (Dense Oceanfloor Network system for Earthquakes and Tsunamis) in 2010 (Kaneda et al., 2015; Kawaguchi et al., 2015). The DONET system consists of 20 stations equipped with multiple types of sensors, including strong-motion seismometers and quartz pressure gauges. The stations are densely distributed at an average spacing of 15–20 km and cover the region from near the trench axis to coastal areas. Observed data are transferred to a land station through a fiber-optic cable and then to the Japan Agency for Marine-Earth Science and Technology (JAMSTEC) data management center through a private network in real time.

After the 2011 earthquake off the Pacific coast of Tohoku, local governments close to the Nankai Trough sought to devise disaster prevention schemes. These local governments requested that JAMSTEC disseminate the DONET data, along with other research capabilities, so that they could exploit this important earthquake information. In order to give local governments access to the DONET data, which are recorded primarily for research purposes, we have developed a web application system, REIS (real-time earthquake information system), that provides seismic waveform data to several local governments close to the Nankai Trough. In the present paper, we introduce the specifications of REIS and its system architecture.
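As a rough illustration of the kind of web service REIS represents, the sketch below exposes recent waveform samples for a station over HTTP. This is not the actual REIS implementation or interface: the Flask framework, route layout, and in-memory buffer are all assumptions made for the example, and a production system would sit behind authentication for the participating local governments.

```python
# Minimal illustrative web endpoint serving recent waveform samples per station.
from flask import Flask, jsonify, abort
import time

app = Flask(__name__)

# Hypothetical in-memory buffer of recent samples for a DONET-like station.
waveform_buffer = {
    "STN01": {"t0": time.time(), "dt": 0.01, "counts": [12, 15, 9, -3, -8]},
}

@app.route("/waveform/<station>")
def waveform(station):
    data = waveform_buffer.get(station)
    if data is None:
        abort(404)                       # unknown station
    return jsonify(station=station, **data)

if __name__ == "__main__":
    app.run(port=8080)                   # real deployments would add access control
```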


2021 ◽  
Vol 11 (1) ◽  
Author(s):  
Sherif M. Hanafy ◽  
Hussein Hoteit ◽  
Jing Li ◽  
Gerard T. Schuster

Abstract: Results are presented for real-time seismic imaging of subsurface fluid flow by parsimonious refraction and surface-wave interferometry. Each subsurface velocity image inverted from the time-lapse seismic data requires only several minutes of recording time, which is less than the time scale of the fluid-induced changes in the rock properties; in this sense, the imaging is real time. The images are P-velocity tomograms inverted from first-arrival times and S-velocity tomograms inverted from dispersion curves. Compared with conventional seismic imaging, parsimonious interferometry reduces the recording time and increases the temporal resolution of time-lapse seismic images by more than an order of magnitude. In our seismic experiment, we recorded 90 sparse data sets over 4.5 h while injecting 12 tons of water into a sand dune. The results show that the percolation of water is mostly along layer boundaries down to a depth of a few meters, which is consistent with our 3D computational fluid-flow simulations and laboratory experiments. The significance of parsimonious interferometry is that it provides more than an order-of-magnitude increase in the temporal resolution of time-lapse seismic imaging. We believe that real-time seismic imaging will have important applications for non-destructive characterization in environmental, biomedical, and subsurface imaging.
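A minimal sketch of the time-lapse principle behind such imaging follows: each monitor tomogram is differenced against a baseline tomogram, and cells with a strong velocity drop are interpreted as water-invaded. The grid, velocities, and threshold are synthetic placeholders, not the field data or the authors' inversion code.

```python
# Time-lapse differencing of velocity tomograms (synthetic illustration).
import numpy as np

nz, nx = 20, 60                        # depth x distance grid cells
baseline = np.full((nz, nx), 800.0)    # m/s, pre-injection P-velocity model

# Monitor tomogram: a slower (water-saturated) zone along a layer boundary.
monitor = baseline.copy()
monitor[5:8, 10:40] -= 120.0           # assumed velocity drop from saturation

dv = monitor - baseline                # time-lapse velocity change (m/s)
invaded = dv < -50.0                   # cells interpreted as water-invaded
print("invaded cells:", int(invaded.sum()))
```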


Author(s):  
D Spallarossa ◽  
M Cattaneo ◽  
D Scafidi ◽  
M Michele ◽  
L Chiaraluce ◽  
...  

Summary: The 2016–17 central Italy earthquake sequence began with the first mainshock near the town of Amatrice on August 24 (MW 6.0) and was followed by two subsequent large events near Visso on October 26 (MW 5.9) and Norcia on October 30 (MW 6.5), plus a cluster of four events with MW > 5.0 within a few hours on January 18, 2017. The affected area had been monitored before the sequence started by the permanent Italian National Seismic Network (RSNC), which was supplemented during the sequence by temporary stations deployed by the National Institute of Geophysics and Volcanology and the British Geological Survey. By the middle of September, there was a dense network of 155 stations with a mean separation in the epicentral area of 6–10 km, comparable to the most likely earthquake depth range in the region. This network configuration was kept stable for an entire year, producing 2.5 TB of continuous waveform recordings. Here we describe how these data were used to develop a large and comprehensive earthquake catalogue using the Complete Automatic Seismic Processor (CASP) procedure. This procedure detected more than 450,000 events in the year following the first mainshock and determined their phase arrival times with an advanced picker engine (RSNI-Picker2), producing a set of about 7 million P-wave and 10 million S-wave arrival times. These were used to locate the events with a non-linear location (NLL) algorithm, a 1D velocity model calibrated for the area, and station corrections, and then to compute their local magnitudes (ML). The procedure was validated by comparing the derived phase picks and earthquake parameters with a hand-picked reference catalogue (hereinafter 'RefCat'). The automated procedure takes less than 12 hours on an Intel Core-i7 workstation to analyse the primary waveform data and to detect and locate the 3,000 events of the most seismically active day of the sequence. This proves the concept that the CASP algorithm can provide effectively real-time data for input into daily operational earthquake forecasts. The results show significant improvements over RefCat, which was obtained for the same period using manual phase picks: the number of detected and located events is higher (from 84,401 to more than 450,000), the magnitude of completeness is lower (from ML 1.4 to 0.6), and the number of phase picks is greater, with an average of 72 picked arrivals for an ML 1.4 event compared with 30 phases in RefCat based on manual picking. These translate into formal uncertainties of ±0.9 km in epicentral location and ±1.5 km in depth for the vast majority of events in the enhanced catalogue. Together, these provide a significant improvement in the resolution of fine structures such as local planar structures and clusters, in particular the identification of shallow events occurring in parts of the crust previously thought to be inactive. The lower completeness magnitude provides a rich data set for the development and testing of techniques for analysing the evolution of seismic sequences, including real-time operational monitoring of the b-value, time-dependent hazard evaluation, and aftershock forecasting.
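One downstream use mentioned above is monitoring the completeness magnitude and b-value of the catalogue. The sketch below uses two standard, generic choices (a maximum-curvature style Mc estimate and the Aki maximum-likelihood b-value with the usual half-bin correction); these are illustrative and not necessarily the estimators applied to this catalogue.

```python
# Estimate completeness magnitude (Mc) and b-value from a magnitude list.
import numpy as np

def completeness_and_b(mags, bin_width=0.1):
    bins = np.arange(mags.min(), mags.max() + bin_width, bin_width)
    counts, edges = np.histogram(mags, bins=bins)
    mc = edges[np.argmax(counts)]                    # maximum-curvature estimate
    m = mags[mags >= mc]
    b = np.log10(np.e) / (m.mean() - (mc - bin_width / 2.0))  # Aki (1965) with binning correction
    return mc, b

# Synthetic Gutenberg-Richter catalogue (b = 1, complete above ~0.55) as a check.
rng = np.random.default_rng(42)
synthetic = np.round(0.55 + rng.exponential(scale=1.0 / np.log(10), size=50_000), 1)
print(completeness_and_b(synthetic))                 # expect Mc near 0.6 and b near 1
```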


2021 ◽  
Vol 11 (9) ◽  
pp. 3896
Author(s):  
Khaled M. Shalghum ◽  
Nor Kamariah Noordin ◽  
Aduwati Sali ◽  
Fazirulhisyam Hashim

Deterministic latency is an urgent requirement of increasingly intelligent real-time applications such as connected vehicles and industrial automation. Time-sensitive networking (TSN) is a new framework introduced to serve these applications. Several functions are defined in the TSN standards to support time-triggered (TT) requirements, such as IEEE 802.1Qbv and IEEE 802.1Qbu for traffic scheduling and frame preemption, respectively. However, enforcing strict timing constraints to support scheduled traffic can fail to meet the needs of unscheduled real-time flows, so more relaxed scheduling algorithms are required. In this paper, we introduce the flexible window-overlapping scheduling (FWOS) algorithm, which optimizes the overlapping among TT windows according to three metrics: the priority of overlapping, the position of overlapping, and the overlapping ratio (OR). An analytical model for the worst-case end-to-end delay (WCD) is derived using the network calculus (NC) approach, considering the relative relationships between window offsets at consecutive nodes, and is evaluated under a realistic vehicular use case. While guaranteeing the latency deadline for TT traffic, the FWOS algorithm defines the maximum allowable OR that maximizes the bandwidth available for unscheduled transmission. Even under a non-overlapping scenario, FWOS yields less pessimistic latency bounds than the latest related works.
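To make the overlapping-ratio metric concrete, the sketch below computes how much of one gate-open TT window overlaps another within a scheduling cycle. Defining OR as overlap length divided by window length, and the window layout itself, are assumptions for illustration rather than the paper's exact formulation.

```python
# Overlapping ratio between two time-triggered (802.1Qbv-style) gate windows.
def overlap_ratio(win_a, win_b):
    """Each window is (offset_us, length_us) within one scheduling cycle."""
    a_start, a_len = win_a
    b_start, b_len = win_b
    overlap = max(0.0, min(a_start + a_len, b_start + b_len) - max(a_start, b_start))
    return overlap / a_len        # fraction of window A occupied by the overlap

# Two 100 us TT windows whose offsets differ by 60 us: OR = 0.4.
print(overlap_ratio((0.0, 100.0), (60.0, 100.0)))
```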


2009 ◽  
Vol 24 (3) ◽  
pp. 812-828 ◽  
Author(s):  
Young-Mi Min ◽  
Vladimir N. Kryjov ◽  
Chung-Kyu Park

Abstract: A probabilistic multimodel ensemble prediction system (PMME) has been developed to provide operational seasonal forecasts at the Asia–Pacific Economic Cooperation (APEC) Climate Center (APCC). The system is based on an uncalibrated multimodel ensemble, with model weights inversely proportional to the errors in forecast probability associated with model sampling errors, and a parametric Gaussian fitting method for estimating tercile-based categorical probabilities. It is shown that the suggested method is the most appropriate for an operational global prediction system that combines a large number of models whose individual ensembles differ substantially in size and whose weights in the forecast and hindcast datasets are inconsistent. Justification for the use of a Gaussian approximation of the precipitation probability distribution function for global forecasts is also provided. PMME retrospective and real-time forecasts are assessed. For the above-normal and below-normal categories, temperature forecasts outperform climatology over a large part of the globe. Precipitation forecasts are clearly more skillful than random guessing in the extratropics and than climatological forecasts in the tropics. The skill of the real-time forecasts lies within the range of the interannual variability of the historical forecasts.
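The parametric Gaussian fitting step can be sketched as follows: fit a normal distribution to each model's ensemble, evaluate the below/near/above-normal probabilities against climatological tercile thresholds, and combine the models with normalised weights. The ensembles, tercile values, and weights below are toy numbers, not APCC's operational settings.

```python
# Gaussian-fitted tercile probabilities combined across a weighted multimodel ensemble.
import numpy as np
from scipy.stats import norm

def tercile_probs(ensemble, lower_tercile, upper_tercile):
    mu, sigma = np.mean(ensemble), np.std(ensemble, ddof=1)
    p_below = norm.cdf(lower_tercile, mu, sigma)
    p_above = 1.0 - norm.cdf(upper_tercile, mu, sigma)
    return np.array([p_below, 1.0 - p_below - p_above, p_above])

# Two toy model ensembles of standardized seasonal anomalies and assumed weights.
models = [np.array([0.1, 0.4, 0.3, 0.6, 0.2]), np.array([-0.2, 0.1, 0.0, 0.3])]
weights = np.array([0.6, 0.4])                     # assumed inverse-error weights (sum to 1)
probs = sum(w * tercile_probs(m, -0.43, 0.43) for w, m in zip(weights, models))
print(probs.round(2))                              # P(below), P(near-normal), P(above)
```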

