Capturing, Preserving, and Digitizing Legacy Seismic Data from the Montserrat Volcano Observatory Analog Seismic Network, July 1995–December 2004

2020 ◽  
Vol 91 (4) ◽  
pp. 2127-2140 ◽  
Author(s):  
Glenn Thompson ◽  
John A. Power ◽  
Jochen Braunmiller ◽  
Andrew B. Lockhart ◽  
Lloyd Lynch ◽  
...  

Abstract An eruption of the Soufrière Hills Volcano (SHV) on the eastern Caribbean island of Montserrat began on 18 July 1995 and continued until February 2010. Within nine days of the eruption onset, an existing four-station analog seismic network (ASN) was expanded to 10 sites. Telemetered data from this network were recorded, processed, and archived locally using a system developed by scientists from the U.S. Geological Survey (USGS) Volcano Disaster Assistance Program (VDAP). In October 1996, a digital seismic network (DSN) was deployed with the ability to capture larger-amplitude signals across a broader frequency range. These two networks operated in parallel until December 2004, with separate telemetry and acquisition systems (the analysis systems were merged in March 2001). Although the DSN provided better-quality data for research, the ASN featured superior real-time monitoring tools and captured valuable data, including the only seismic records from the first 15 months of the eruption. These successes of the ASN have been largely overlooked. This article documents the evolution of the ASN, the VDAP system, the original data captured, and the recovery and conversion of more than 230,000 seismic events from legacy SUDS, Hypo71, and Seislog formats into a Seisan database with waveform data in miniSEED format. No digital catalog existed for these events, but students at the University of South Florida have classified two-thirds of the 40,000 events that were captured between July 1995 and October 1996. Locations and magnitudes were recovered for ∼10,000 of these events. Real-time seismic amplitude measurement, seismic spectral amplitude measurement, and tiltmeter data were also captured. As a result, the ASN seismic dataset is now more discoverable, accessible, and reusable, in accordance with FAIR data principles. These efforts could catalyze new research on the 1995–2010 SHV eruption.
Furthermore, many observatories have data in these same legacy data formats and might benefit from procedures and codes documented here.

2021 ◽  
Vol 11 (11) ◽  
pp. 4874
Author(s):  
Milan Brankovic ◽  
Eduardo Gildin ◽  
Richard L. Gibson ◽  
Mark E. Everett

Seismic data provide essential information in geophysical exploration, both for locating hydrocarbon-rich areas and for fracture monitoring during well stimulation. Because of its high-frequency acquisition rate and dense spatial sampling, distributed acoustic sensing (DAS) has seen increasing application in microseismic monitoring. Given the large volumes of data that must be analyzed in real time, and the impractical memory and storage requirements this implies, fast compression and accurate interpretation methods are necessary for real-time monitoring campaigns using DAS. In response to these developments in data acquisition, we have created shifted-matrix decomposition (SMD), which compresses seismic data by storing it as pairs of singular vectors coupled with shift vectors. This is achieved by shifting the columns of a matrix of seismic data before applying singular value decomposition (SVD) to extract a pair of singular vectors. SMD serves both denoising and compression, as reconstructing seismic data from its compressed form yields a denoised version of the original data. By analyzing the data in its compressed form, we can also run signal detection and velocity estimation. The developed algorithm can therefore simultaneously compress and denoise seismic data while also analyzing the compressed data to estimate signal presence and wave velocities. To demonstrate its efficiency, we compare SMD to local SVD and structure-oriented SVD, similar SVD-based methods used only for denoising seismic data. While the development of SMD was motivated by the increasing use of DAS, SMD can be applied to seismic data obtained from any large number of receivers. For example, here we present initial applications of SMD to readily available marine seismic data.
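The core shift-then-decompose step can be sketched in a few lines of NumPy. This is an illustrative toy, not the authors' implementation: here the per-trace shifts are assumed to be known in advance (SMD estimates them from the data), and the function name `shifted_rank1` is hypothetical.

```python
import numpy as np

def shifted_rank1(data, shifts):
    """Toy sketch of the shift-then-SVD idea: align each trace (column)
    by a known integer sample shift, take the leading rank-1 SVD term,
    then un-shift. `data` is (n_samples, n_traces)."""
    aligned = np.empty_like(data)
    for j, s in enumerate(shifts):
        aligned[:, j] = np.roll(data[:, j], -s)
    U, sing, Vt = np.linalg.svd(aligned, full_matrices=False)
    u, sigma, v = U[:, 0], sing[0], Vt[0, :]   # leading singular pair
    rank1 = sigma * np.outer(u, v)             # aligned reconstruction
    recon = np.empty_like(data)
    for j, s in enumerate(shifts):
        recon[:, j] = np.roll(rank1[:, j], s)  # undo the alignment
    return u, sigma, v, recon
```

Storing `u`, `sigma`, `v`, and the integer shift vector in place of the full n_samples × n_traces matrix is the compression; reconstructing from that rank-1 form discards incoherent energy, which is the denoising.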


2020 ◽  
Author(s):  
Masayoshi Ichiyanagi ◽  
Mikhaylov Valentin ◽  
Dmitry Kostylev ◽  
Yuri Levin ◽  
Hiroaki Takahashi

Abstract The southwestern Kuril trench is seismically active due to the subduction of the Pacific plate. Great earthquakes in this zone have repeatedly caused deadly disasters. Seismic monitoring and hypocenter catalogues provide fundamental information for understanding earthquake occurrence and for disaster mitigation. Real-time hypocenter and magnitude estimates are crucial for tsunami warning systems. However, this region straddles the international border zone between Japan and Russia. The Japan Meteorological Agency and the Russian Academy of Sciences have routinely determined hypocenters and issued earthquake information independently, and waveform data have not yet been exchanged internationally in real time. Here, we evaluated how a hypothetical Japan-Russia joint seismic network could improve hypocenter estimation accuracy. Experiments using numerical and observed data indicated that the joint network extended the distance over which hypocenters can be accurately determined by more than 100 km eastward compared with the Japanese network alone. This suggests that joint seismic data have the potential to improve hypocenter accuracy in this region, which would improve the gathering of disaster information at the moment a tsunami warning is issued.
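The geometric intuition, that stations surrounding a source constrain its location far better than stations on one side only, can be illustrated with a small synthetic grid search. This is a schematic sketch, not the authors' experiment: it assumes a homogeneous medium with straight rays, an epicenter-only (2-D) search, and invented station coordinates.

```python
import numpy as np

V = 6.0  # assumed P-wave velocity in km/s (homogeneous medium)

def travel_times(src, stations):
    # straight-ray travel times (s) from an epicenter to each station
    return np.hypot(stations[:, 0] - src[0], stations[:, 1] - src[1]) / V

def misfit_surface(obs, stations, xs, ys):
    # RMS travel-time residual at each trial epicenter; the unknown
    # origin time is absorbed by demeaning the residuals
    rms = np.empty((len(xs), len(ys)))
    for i, x in enumerate(xs):
        for j, y in enumerate(ys):
            r = obs - travel_times((x, y), stations)
            r = r - r.mean()
            rms[i, j] = np.sqrt(np.mean(r ** 2))
    return rms

# hypothetical geometries: stations west of the source vs. on both sides
west = np.array([[0.0, 0.0], [0.0, 10.0], [0.0, -10.0], [-10.0, 0.0]])
joint = np.vstack([west, [[100.0, 0.0], [100.0, 10.0]]])

true_src = (50.0, 5.0)
xs = np.arange(40.0, 60.5, 0.5)
ys = np.arange(-5.0, 15.5, 0.5)

rms_west = misfit_surface(travel_times(true_src, west), west, xs, ys)
rms_joint = misfit_surface(travel_times(true_src, joint), joint, xs, ys)

# the size of the low-misfit region is a proxy for location uncertainty
loose_west = int((rms_west < 0.02).sum())
loose_joint = int((rms_joint < 0.02).sum())
```

With the one-sided geometry, moving the trial epicenter away from the stations is largely absorbed by the free origin-time term, producing an elongated low-misfit valley; stations on the far side break that trade-off, which is the effect a joint network exploits.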


Author(s):  
Jayne M. Bormann ◽  
Emily A. Morton ◽  
Kenneth D. Smith ◽  
Graham M. Kent ◽  
William S. Honjas ◽  
...  

Abstract The Nevada Seismological Laboratory (NSL) at the University of Nevada, Reno, installed eight temporary seismic stations following the 15 May 2020 Mww 6.5 Monte Cristo Range earthquake. The mainshock and resulting aftershock sequence occurred in an unpopulated and sparsely instrumented region of the Mina deflection in the central Walker Lane, approximately 55 km west of Tonopah, Nevada. The temporary stations supplement NSL’s permanent seismic network, providing azimuthal coverage and near-field recording of the aftershock sequence beginning 1–3 days after the mainshock. We expect the deployment to remain in the field until May 2021. NSL initially attempted to acquire the Monte Cristo Range deployment data in real time via cellular telemetry; however, unreliable cellular coverage forced NSL to convert to microwave telemetry within the first week of the sequence to achieve continuous real-time acquisition. Through 31 August 2020, the temporary deployment has captured near-field records of three ML≥5 aftershocks and 25 ML 4–4.9 events. Here, we present details regarding the Monte Cristo Range deployment, instrumentation, and waveform availability. We combine this information with waveform availability and data access details from NSL’s permanent seismic network and partner regional seismic networks to create a comprehensive summary of Monte Cristo Range sequence data. NSL’s Monte Cristo Range temporary and permanent station waveform data are available in near-real time via the Incorporated Research Institutions for Seismology Data Management Center. Derived earthquake products, including NSL’s earthquake catalog and phase picks, are available via the Advanced National Seismic System Comprehensive Earthquake Catalog. The temporary deployment improved catalog completeness and location quality for the Monte Cristo Range sequence.
We expect these data to be useful for continued study of the Monte Cristo Range sequence and constraining crustal and seismogenic properties of the Mina deflection and central Walker Lane.


2021 ◽  
Author(s):  
Nikolaos Triantafyllis ◽  
Ioannis Venetis ◽  
Ioannis Fountoulakis ◽  
Erion-Vasilis Pikoulis ◽  
Efthimios Sokos ◽  
...  

Automatic Moment Tensor (MT) determination for regional areas is essential for real-time seismological applications such as stress inversion, shakemap generation, and tsunami warning. In recent years, the combination of two powerful tools, SeisComP and ISOLA (Sokos and Zahradník, 2008), paved the way for the release of Scisola (Triantafyllis et al., 2016), an open-source, Python-based software package for automatic, real-time MT calculation of seismic events provided by SeisComP. ISOLA is an extensively used manual MT retrieval utility based on multiple-point-source representation and iterative deconvolution; the full wavefield is taken into consideration, and Green's functions are calculated with the discrete wavenumber method as implemented in the Axitra Fortran code (Cotton and Coutant, 1997). The MT of each subevent is found by least-squares minimization of the misfit between observed and synthetic waveforms, while the position and time of subevents are optimized through a grid search. Scisola monitors SeisComP and passes the event information, the respective waveform data, and the station information to ISOLA for the Green's function and MT computations. Gisola is a highly evolved version of Scisola, oriented toward high-performance computing. Unlike Scisola, the new program applies enhanced algorithms for waveform data filtering based on quality metrics such as signal-to-noise ratio, waveform clipping, data and metadata inconsistency, and long-period ("mouse") disturbances, as well as station evaluation based on comparing each station's daily Power Spectral Density (PSD) against reference metrics for the frequency bands of interest; these checks feature a CPU multiprocessing implementation for faster calculations. For the Green's function computation, Gisola uses a newer version of Axitra, highlighting the power of simultaneous processing in the CPU domain.
Likewise, the inversion code has been substantially improved by exploiting GPU-based multiprocessing (with an automatic fallback to CPU-based multiprocessing when no GPU hardware is present) and by unifying sub-programs to minimize I/O operations. In addition, a fine-grained, adjustable 4D (space-time) source grid search is available for more accurate MT solutions. Moreover, Gisola expands its seismic data input sources by interconnecting to FDSN Web Services. Furthermore, the new software can export results in the well-known QuakeML standard and thus provide clients, such as SeisComP, with MT results attached to the seismic event information. Finally, the operator has full control over all aspects of the calculation through an extensive configuration that can be adapted to regional data. The program can be installed on any computer that runs a Linux OS and has access to FDSN Web Services, and the source code will be open and freely available to the scientific community.

Cotton, F., and O. Coutant (1997). Dynamic stress variations due to shear faults in a plane-layered medium, Geophys. J. Int. 128, 676–688, doi: 10.1111/j.1365-246X.1997.tb05328.x.
Sokos, E. N., and J. Zahradník (2008). ISOLA a FORTRAN code and a MATLAB GUI to perform multiple-point source inversion of seismic data, Comput. Geosci. 34, no. 8, 967–977, doi: 10.1016/j.cageo.2007.07.005.
Triantafyllis, N., E. Sokos, A. Ilias, and J. Zahradník (2016). Scisola: Automatic moment tensor solution for SeisComP3, Seismol. Res. Lett. 87, no. 1, 157–163, doi: 10.1785/0220150065.
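The kind of waveform quality screening described above (signal-to-noise ratio and clipping checks) can be sketched in a few lines of NumPy. This is an illustrative toy, not Gisola's actual code: the window choices, thresholds, and the crude plateau-based clipping heuristic are all assumptions.

```python
import numpy as np

def snr_db(trace, noise_win, signal_win):
    # signal-to-noise ratio from mean power in two sample windows
    return 10.0 * np.log10(np.mean(trace[signal_win] ** 2) /
                           np.mean(trace[noise_win] ** 2))

def is_clipped(trace, frac=0.99, min_run=10):
    # crude clipping heuristic: a long run of samples sitting at (or
    # very near) the trace's absolute maximum suggests a flat top
    near_max = np.abs(trace) >= frac * np.abs(trace).max()
    run = 0
    for flag in near_max:
        run = run + 1 if flag else 0
        if run >= min_run:
            return True
    return False

def select_traces(traces, noise_win, signal_win, snr_min_db=10.0):
    # return indices of the traces that pass both quality checks
    keep = []
    for i, tr in enumerate(traces):
        if is_clipped(tr):
            continue
        if snr_db(tr, noise_win, signal_win) < snr_min_db:
            continue
        keep.append(i)
    return keep
```

In a production system such checks would run per station and per frequency band, alongside the metadata-consistency and PSD-based tests the abstract describes.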


2020 ◽  
Vol 92 (1) ◽  
pp. 77-84 ◽  
Author(s):  
Jean-Marie Saurel ◽  
Jordane Corbeau ◽  
Sébastien Deroussi ◽  
Tristan Didier ◽  
Arnaud Lemarchand ◽  
...  

Abstract Between 2008 and 2014, the Institut de Physique du Globe de Paris (IPGP) and the University of the West Indies, Seismic Research Centre (UWI-SRC) designed and built a regional seismic network across the Lesser Antilles. One of the goals of the network is to provide real-time seismic data to the tsunami warning centers in the framework of the Intergovernmental Coordination Group working toward the establishment of a tsunami and other coastal hazards early warning system (ICG-CARIBE-EWS) for the Caribbean and adjacent regions (McNamara et al., 2016). In an area prone to hurricanes, earthquakes, tsunamis, and volcanic eruptions, we chose techniques and technologies to ensure that our cooperative network could survive and keep providing data during major natural hazards. The Nanometrics very small aperture terminal (VSAT) technology is at the heart of the system. It allows for duplicated data collection at the three observatories (Trinidad, Martinique, and Guadeloupe; Anglade et al., 2015). In 2017, the network design and implementation were put to the test by Saffir–Simpson category 5 hurricanes Irma and Maria, which passed through the northern and central portions of the Lesser Antilles, respectively, mainly impacting the sites operated by the volcanological and seismological observatories of IPGP in Martinique (Observatoire Volcanologique et Sismologique de Martinique [OVSM]) and in Guadeloupe (Observatoire Volcanologique et Sismologique de Guadeloupe [OVSG]). Our concepts proved to be valid, with a data outage of less than 12 hr and only two stations sustaining heavy damage. In this article, we review the strengths and weaknesses of the initial design and discuss steps that can be taken to enhance the ability of our cooperative network to provide timely real-time seismic data to tsunami warning centers under any circumstances.


2020 ◽  
Vol 221 (3) ◽  
pp. 1941-1958 ◽  
Author(s):  
Mariangela Guidarelli ◽  
Peter Klin ◽  
Enrico Priolo

SUMMARY Prompt detection and accurate location of microseismic events are of great importance in seismic monitoring at the local scale and are essential steps in monitoring underground activities, such as oil and gas production, geothermal exploitation, and underground gas storage, for implementing effective control procedures to limit the induced-seismicity hazard. In this study, we describe an automatic and robust earthquake detection and location procedure that exploits high-performance computing and allows the analysis of microseismic events in near real time using the full waveforms recorded by a local seismic network. The implemented technique, called MigraLoc, is based on the space–time migration of continuous waveform data and consists of the following steps: (1) enhancement of P and S arrivals in noisy signals through a characteristic function, by means of time–frequency analysis of the seismic records; (2) blind event location based on a delay-and-sum approach that systematically scans the volume of potential hypocentres; (3) detection notification according to the information content of the hypocentre probability distribution obtained in the previous step. The technique requires that theoretical arrival times be pre-calculated for each station and all potential hypocentres as a solution of the seismic-ray equation in a given 3-D medium. As a test case, we apply MigraLoc to two low-magnitude earthquake swarms recorded by the Collalto Seismic Network in the area of the Veneto Alpine foothills (Italy) in 2014 and 2017, respectively. Thanks to MigraLoc, we increase the number of events reported in the network catalogue by more than 25 per cent. The automatically determined locations prove to be consistent with, and overall more accurate than, those obtained by classical methods using manual arrival-time picks. The proposed method works best with dense networks that provide signals with some degree of coherency.
It shows the following advantages compared to other classical location methods: it works on the continuous stream of data as well as on selected intervals of waveforms; it detects more microevents owing to the increased signal-to-noise ratio of the stacked signal that feeds the characteristic function; it works with any complex 3-D model with no additional effort; it is completely automatic, once calibrated, and it does not need any manual picking.
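The delay-and-sum core of step (2) can be sketched with NumPy: for each candidate source, each station's characteristic function is shifted back by its predicted travel time and the traces are stacked, so the candidate whose moveout aligns the arrivals produces the largest stack. The characteristic functions and travel times below are synthetic placeholders, not MigraLoc's.

```python
import numpy as np

def delay_and_sum(cf, tt, dt):
    """cf: (n_sta, n_samp) characteristic functions; tt: (n_src, n_sta)
    predicted travel times (s). For each candidate source, remove each
    station's predicted delay and stack; return the peak stack value."""
    shifts = np.round(tt / dt).astype(int)
    peaks = np.empty(tt.shape[0])
    for k in range(tt.shape[0]):
        stack = np.zeros(cf.shape[1])
        for s in range(cf.shape[0]):
            stack += np.roll(cf[s], -shifts[k, s])
        peaks[k] = stack.max()
    return peaks

# synthetic example: 3 stations, pulses consistent with candidate 0
dt, n_samp = 0.01, 600
t = np.arange(n_samp) * dt
tt = np.array([[1.0, 2.0, 3.0],    # candidate 0 (the true one here)
               [2.0, 2.0, 2.0]])   # candidate 1
origin = 0.5
cf = np.array([np.exp(-0.5 * ((t - (origin + tti)) / 0.05) ** 2)
               for tti in tt[0]])
peaks = delay_and_sum(cf, tt, dt)
```

For the aligned candidate the three unit pulses stack coherently to a peak of 3, while the wrong candidate leaves them scattered; thresholding that peak against the per-candidate noise level is the essence of the detection step.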


2016 ◽  
Vol 50 (3) ◽  
pp. 87-91 ◽  
Author(s):  
Morifumi Takaesu ◽  
Hiroki Horikawa ◽  
Kentaro Sueki ◽  
Narumi Takahashi ◽  
Akira Sonoda ◽  
...  

Abstract Mega-thrust earthquakes are anticipated to occur in the Nankai Trough in southwest Japan. In order to monitor seismicity, crustal deformation, and tsunamis in earthquake source areas, we deployed the seafloor seismic network DONET (Dense Ocean-floor Network System for Earthquakes and Tsunamis) in 2010 (Kaneda et al., 2015; Kawaguchi et al., 2015). The DONET system consists of 20 stations equipped with multiple types of sensors, including strong-motion seismometers and quartz pressure gauges. These stations are densely distributed at an average spacing of 15–20 km and cover the region from near the trench axis to coastal areas. Observed data are transferred to a land station through a fiber-optic cable and then to the Japan Agency for Marine-Earth Science and Technology (JAMSTEC) data management center through a private network in real time. After the 2011 earthquake off the Pacific coast of Tohoku, each local government close to the Nankai Trough sought to devise a disaster prevention scheme. These local governments requested that JAMSTEC disseminate the DONET data along with other research capabilities so that they could exploit this important earthquake information. In order to provide local governments access to the DONET data, which are recorded primarily for research purposes, we have developed a web application system, REIS (real-time earthquake information system), that provides seismic waveform data to several local governments close to the Nankai Trough. In the present paper, we introduce the specifications of REIS and its system architecture.


2021 ◽  
Vol 1 (1) ◽  
Author(s):  
E. Bertino ◽  
M. R. Jahanshahi ◽  
A. Singla ◽  
R.-T. Wu

Abstract This paper addresses the problem of efficient and effective data collection and analytics for applications such as civil infrastructure monitoring and emergency management. Such a problem requires the development of techniques by which data acquisition devices, such as IoT devices, can: (a) perform local analysis of collected data; and (b) based on the results of such analysis, autonomously decide on further data acquisition. The ability to perform local analysis is critical for reducing transmission costs and latency, as the results of an analysis are usually smaller than the original data. For example, under strict real-time requirements, the analysis results can be transmitted in real time, whereas the actual collected data can be uploaded later. The ability to autonomously decide about further data acquisition enhances scalability and reduces the need for real-time human involvement in data acquisition processes, especially in contexts with critical real-time requirements. The paper focuses on deep neural networks and discusses techniques for supporting transfer learning and pruning, so as to reduce the time needed to train the networks and the size of the networks deployed on IoT devices. We also discuss approaches based on reinforcement learning techniques that enhance the autonomy of IoT devices.
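As a concrete illustration of one of the techniques mentioned, unstructured magnitude pruning can be sketched in NumPy. This is a generic sketch of the idea, not the authors' pipeline; practical deployments typically prune layer by layer and fine-tune afterwards.

```python
import numpy as np

def magnitude_prune(weights, sparsity):
    """Zero out the `sparsity` fraction of weights with the smallest
    magnitudes; the surviving weights are unchanged. Returns a copy."""
    flat = np.abs(weights).ravel()
    k = int(round(sparsity * flat.size))
    pruned = weights.copy()
    if k == 0:
        return pruned
    # magnitude of the k-th smallest weight acts as the cut-off
    threshold = np.partition(flat, k - 1)[k - 1]
    pruned[np.abs(weights) <= threshold] = 0.0
    return pruned
```

Pruned weight matrices can be stored and transmitted in a sparse format, cutting both the bytes sent from the IoT device and the multiply count at inference time.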

