Seismic catalogs
Recently Published Documents

TOTAL DOCUMENTS: 22 (five years: 4)
H-INDEX: 4 (five years: 0)
Author(s): Laura Gulia, Paolo Gasperini

Abstract Artifacts often affect seismic catalogs. Among them, the presence of man-made contamination such as quarry blasts and explosions is a well-known problem. Using a contaminated dataset reduces the statistical significance of results and can lead to erroneous conclusions, so removing such nonnatural events should be the first step for a data analyst. Blasts misclassified as natural earthquakes may artificially alter the seismicity rates and, in turn, the b-value of the Gutenberg-Richter relationship, an essential ingredient of several forecasting models. Modern datasets collect useful information beyond the parameters needed to locate earthquakes in space and time, allowing users to discriminate between natural and nonnatural events. However, selecting these events through webservice queries is neither easy nor straightforward, and part of this supplementary but fundamental information can be lost during download. As a consequence, most statistical seismologists ignore the presence of explosions and quarry blasts in seismic catalogs, assuming either that such events were not located by the seismic networks or that they have already been removed. We show this using the example of the Italian Seismological Instrumental and Parametric Database: what happens when artificial seismicity is mixed with natural seismicity?
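For illustration of the kind of check advocated above, the following is a minimal Python sketch using a synthetic toy catalog with hypothetical column names ("magnitude", "event_type"); the standard Aki (1965) maximum-likelihood b-value estimator is used here, which is not necessarily the authors' procedure.

```python
import numpy as np
import pandas as pd

def b_value_aki(magnitudes, completeness_mag, bin_width=0.1):
    """Aki (1965) maximum-likelihood b-value for events above the completeness magnitude."""
    m = np.asarray(magnitudes, dtype=float)
    m = m[m >= completeness_mag]
    mean_excess = m.mean() - (completeness_mag - bin_width / 2.0)  # Utsu bin-width correction
    return 1.0 / (np.log(10.0) * mean_excess)

# Toy catalog with an event-type column, as exposed by some webservice downloads
rng = np.random.default_rng(42)
n_nat, n_blast = 4500, 500
catalog = pd.DataFrame({
    "magnitude": np.round(np.concatenate([
        rng.exponential(0.45, n_nat) + 2.0,    # natural events, b-value near 1
        rng.exponential(0.20, n_blast) + 2.0,  # blasts: narrower, low-magnitude population
    ]), 1),
    "event_type": ["earthquake"] * n_nat + ["quarry blast"] * n_blast,
})

natural_only = catalog[catalog["event_type"] == "earthquake"]
print("b-value, contaminated catalog:", round(b_value_aki(catalog["magnitude"], 2.0), 2))
print("b-value, natural events only: ", round(b_value_aki(natural_only["magnitude"], 2.0), 2))
```

In this toy example the blasts cluster at low magnitudes, so keeping them inflates the apparent b-value relative to the cleaned catalog.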



2021, Vol 9
Author(s): Mauricio González, José A. Álvarez-Gómez, Íñigo Aniel-Quiroga, Luis Otero, Maitane Olabarrieta, ...

Tsunami hazard can be analyzed from both deterministic and probabilistic points of view. The deterministic approach is based on a “credible” worst-case tsunami, which is often selected from historical events in the region of study. Within the probabilistic approach (PTHA, Probabilistic Tsunami Hazard Analysis), statistical analysis can be carried out in regions where historical records of tsunami heights and runup are available. In areas where such historical records are scarce, synthetic series of events are usually generated using Monte Carlo approaches. Commonly, the sea-level variation and the currents forced by the tidal motion are either disregarded or treated as aleatory uncertainties in the numerical models. However, in zones with macro- and meso-tidal regimes, the effect of the tides on the probability distribution of tsunami hazard can be very important. In this work, we present a PTHA methodology based on the generation of synthetic seismic catalogs and the incorporation of sea-level variation into a Monte Carlo simulation. We apply this methodology to the Bay of Cádiz area in Spain, a zone that was heavily damaged by the 1755 earthquake and tsunami. We build a database of tsunami numerical simulations for different variables: faults, earthquake magnitudes, epicenter locations, and sea levels. From this database we generate a set of scenarios from the synthetic seismic catalogs and tidal conditions, based on the probabilistic distribution of the variables involved. These scenarios cover the entire range of possible tsunami events in the synthetic catalog (earthquakes and sea levels). Each tsunami scenario is propagated using the tsunami numerical model C3, from the source region to the target coast (Cádiz Bay). Finally, we map the maximum values of the selected variables (tsunami intensity measures) for a given probability, producing a set of thematic hazard maps. One thousand different time series of combined tsunamigenic earthquakes and tidal levels were synthetically generated using the Monte Carlo technique, each with a 10,000-year duration. The tsunami characteristics were statistically analyzed to derive thematic maps for return periods of 500, 1,000, 5,000, and 10,000 years, including the maximum wave elevation, the maximum current speed, the maximum Froude number, and the maximum total forces.
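A minimal sketch of the Monte Carlo idea described above: synthetic catalogs combine earthquake magnitudes drawn from a Gutenberg-Richter distribution with random tidal levels, and the pooled intensities are ranked to read off return-period levels. All parameter values, the distributions, and the crude intensity proxy are illustrative assumptions; they are not the values or the C3 simulations used for the Cádiz study.

```python
import numpy as np

rng = np.random.default_rng(0)

N_CATALOGS, YEARS = 1000, 10_000       # 1000 synthetic series, 10,000 years each (as in the study)
RATE = 0.02                            # assumed annual rate of tsunamigenic earthquakes
B_VALUE, M_MIN, M_MAX = 1.0, 6.5, 8.7  # illustrative truncated Gutenberg-Richter parameters
TIDAL_AMPLITUDE = 1.8                  # metres; stand-in for the local tidal regime

def sample_magnitudes(n):
    """Inverse-transform sampling from a truncated Gutenberg-Richter distribution."""
    beta = B_VALUE * np.log(10.0)
    c = 1.0 - np.exp(-beta * (M_MAX - M_MIN))
    return M_MIN - np.log(1.0 - rng.uniform(size=n) * c) / beta

def intensity_proxy(magnitude, tide):
    """Placeholder for the tsunami propagation step: crude maximum wave elevation (m)."""
    return 0.5 * 10 ** (0.7 * (magnitude - M_MIN)) + tide

intensities = []
for _ in range(N_CATALOGS):
    n = rng.poisson(RATE * YEARS)                              # events in one synthetic catalog
    mags = sample_magnitudes(n)
    tides = rng.uniform(-TIDAL_AMPLITUDE, TIDAL_AMPLITUDE, n)  # sea level at tsunami arrival
    intensities.append(intensity_proxy(mags, tides))

pooled = np.sort(np.concatenate(intensities))[::-1]
total_years = N_CATALOGS * YEARS
for rp in (500, 1000, 5000, 10000):
    k = total_years // rp              # expected number of exceedances of the rp-year level
    print(f"{rp:>5}-yr return level: {pooled[k - 1]:.2f} m")
```

In a full PTHA, the intensity proxy is replaced by numerical tsunami simulations per scenario, and the return-period statistics are mapped cell by cell along the target coast.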



2021
Author(s): Simone Mancini, Margarita Segou, Maximilian J. Werner

Artificial intelligence methods are revolutionizing modern seismology by offering unprecedentedly rich seismic catalogs. Recent developments in short-term aftershock forecasting show that Coulomb rate-and-state (CRS) models can achieve operational skill comparable to standard statistical Epidemic-Type Aftershock Sequence (ETAS) models, but only when the near-real-time data quality allows a more detailed representation of sources and receiver fault populations to be incorporated. In this framework, the high-resolution reconstructions of seismicity patterns provided by machine-learning-derived earthquake catalogs represent a unique opportunity to test whether they can be exploited to improve the predictive power of aftershock forecasts.

Here, we present a retrospective forecast experiment on the first year of the 2016-2017 Central Italy seismic cascade, in which seven M5.4+ earthquakes occurred between a few hours and five months after the initial Mw 6.0 event, migrating over a 60 km long normal fault system. As the target dataset, we employ the best available high-density machine-learning catalog recently released for the sequence, which reports ~1 million events in total (~22,000 with M ≥ 2).

First, we develop a CRS model featuring (1) rate-and-state variables optimized on 30 years of pre-sequence regional seismicity, (2) finite-fault slip models for the seven mainshocks of the sequence, (3) spatially heterogeneous receivers informed by pre-existing faults, and (4) receiver fault populations updated with the focal planes gradually revealed by aftershocks. We then test the effect of considering stress perturbations from the M2+ events. Using the same high-precision catalog, we produce a standard ETAS model to benchmark the stress-based counterparts. All models are developed on a 3D spatial grid with 2 km spacing; they are updated daily and seek to forecast the space-time occurrence of M2+ seismicity over a total forecast horizon of one year. We formally rank the forecasts with the statistical scoring metrics introduced by the Collaboratory for the Study of Earthquake Predictability and compare their performance to a generation of CRS and ETAS models previously published for the same sequence by Mancini et al. (2019), who used solely real-time data and a minimum triggering magnitude of M = 3.

We find that considering secondary triggering effects from events down to M = 2 slightly improves model performance. While this result highlights the importance of better seismic catalogs for modelling local triggering mechanisms, it also suggests that, to realize their full potential, future modelling efforts will likely have to incorporate fine-scale rupture characterizations as well (e.g., smaller source fault geometries retrieved from enhanced focal mechanism catalogs) and introduce denser spatial model discretizations.
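The scoring step mentioned above can be illustrated with a generic gridded Poisson log-likelihood comparison of the kind that underlies the Collaboratory's consistency and comparison tests. The sketch below uses entirely synthetic stand-in forecasts and is not the authors' CRS or ETAS implementation, nor the official pyCSEP code.

```python
import numpy as np
from scipy.special import gammaln

def poisson_joint_loglik(forecast_rates, observed_counts):
    """Joint log-likelihood of a gridded Poisson forecast against observed counts."""
    lam = np.clip(forecast_rates, 1e-12, None)   # avoid log(0) in empty cells
    n = observed_counts
    return np.sum(-lam + n * np.log(lam) - gammaln(n + 1))

# Toy gridded forecasts for one daily horizon (expected M2+ events per spatial cell)
rng = np.random.default_rng(1)
true_rates = rng.gamma(shape=0.3, scale=0.5, size=2000)
observed = rng.poisson(true_rates)

etas_like = true_rates * rng.lognormal(0.0, 0.3, true_rates.size)  # stand-in "ETAS" forecast
crs_like = true_rates * rng.lognormal(0.0, 0.6, true_rates.size)   # stand-in "CRS" forecast

ll_etas = poisson_joint_loglik(etas_like, observed)
ll_crs = poisson_joint_loglik(crs_like, observed)
gain = (ll_etas - ll_crs) / max(observed.sum(), 1)  # information gain per earthquake
print(f"LL(ETAS-like) = {ll_etas:.1f}, LL(CRS-like) = {ll_crs:.1f}, IG/eq = {gain:.3f}")
```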



2021, Vol 10 (02), pp. 75-93
Author(s): Eddy Ferdinand Mbossi, Delair Dieudonné Etoundi Ndibi, Pauline Wokwenmendam Nguet, Jean Marcel Abate Essi, Edouard Olivier Biboum Ntomb, ...


2020, Vol 63 (6)
Author(s): Maura Murru, Giuseppe Falcone, Matteo Taroni, Rodolfo Console

We develop an ensemble earthquake rate model that provides spatially variable, time-independent (Poisson) long-term annual occurrence rates of seismic events throughout Italy, in magnitude bins of 0.1 units from Mw ≥ 4.5 and in spatial cells of 0.1° × 0.1°. We weighted the seismic activity rates of smoothed-seismicity and fault-based inputs to build our earthquake rupture forecast, merging them into a single ensemble model. Both inputs adopt a tapered Gutenberg-Richter relation with a single b-value and a single corner magnitude estimated from the earthquake catalogs. The spatially smoothed seismicity was obtained using the classical kernel smoothing method, with the inclusion of magnitude-dependent completeness periods, applied to the Historical (CPTI15) and Instrumental seismic catalogs. For each seismogenic source provided by the Database of Individual Seismogenic Sources (DISS), we computed the annual rate of events above Mw 4.5, assuming that the seismic moments of the earthquakes generated by each fault are distributed according to the tapered Gutenberg-Richter relation with the same parameters as the smoothed seismicity models. Comparing the annual seismic rates of the catalogs with those of the seismogenic sources, we found good agreement between these rates in the Central Apennines, whereas the seismogenic-source rates are higher than those of the catalogs in the northeast and south of Italy. We also tested our model against the strong Italian earthquakes (Mw 5.5+), in order to check whether the total number (N-test) and the spatial distribution (S-test) of these events were compatible with our model, obtaining good results, i.e., high p-values in the tests. The final model will be a branch of the new Italian seismic hazard map.
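For readers unfamiliar with the tapered Gutenberg-Richter relation used above, the sketch below converts an annual rate above the threshold magnitude into rates per 0.1-unit magnitude bin using Kagan's tapered form in seismic moment. The rate, b-value, and corner magnitude are illustrative placeholders, not the parameters of the published model.

```python
import numpy as np

def moment(mw):
    """Seismic moment (N·m) from moment magnitude (Hanks & Kanamori relation)."""
    return 10 ** (1.5 * mw + 9.1)

def tapered_gr_rate_above(mw, rate_at_mmin, mmin, b_value, corner_mw):
    """Annual rate of events with magnitude >= mw under a tapered Gutenberg-Richter law."""
    beta = 2.0 / 3.0 * b_value                    # moment-domain exponent
    m0, mt, mc = moment(mw), moment(mmin), moment(corner_mw)
    return rate_at_mmin * (mt / m0) ** beta * np.exp((mt - m0) / mc)

# Illustrative parameters (not the values of the published model)
RATE_45, MMIN, B, CORNER = 2.5, 4.5, 1.0, 7.5

edges = np.arange(4.5, 8.0 + 1e-9, 0.1)           # 0.1-unit magnitude bin edges
rates_above = tapered_gr_rate_above(edges, RATE_45, MMIN, B, CORNER)
bin_rates = rates_above[:-1] - rates_above[1:]    # annual rate inside each bin

for lo, r in zip(edges[:2], bin_rates[:2]):       # show the first two bins
    print(f"Mw {lo:.1f}-{lo + 0.1:.1f}: {r:.3f} events/yr")
```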



Author(s): Carlo Meletti, Romano Camassi, Viviana Castelli

Abstract In popular opinion, Sardinia is the only nonseismic region of Italy. Most researchers are likely to agree, up to a point. Geology-wise, the Sardinia–Corsica block is among the most stable areas of the Mediterranean. History-wise, up to 2011, only one Mw 5.1 event located offshore Sardinia was listed in Italian seismic catalogs (13 November 1948). Seismic networks record only a few low-energy (Mw < 5) events, mostly located offshore and with little or no effect on land. Seismic hazard in Sardinia is very low. “Low,” yes, but not “totally lacking.” We present the results of a recent reappraisal of Sardinian seismicity. We gathered information on three major earthquakes (1616, 1771, and the 1948–1949 sequence). Another sequence (January–March 1901) was re-evaluated, identifying its previously unknown main event. It was confirmed that some earthquakes (1870, 1906, 1922, and 1924) had low magnitudes and scarce to nil macroseismic effects, whereas some others turned out to be either very doubtful or wholly fictitious (1835, 1838, 1855, and 1898). The seismic hazard of Sardinia can now be reassessed on a sounder basis than before. We hope that our work will help the people of Sardinia improve their awareness of living in a seismic land, albeit one with a low level of seismicity.



2020, Vol 92 (1), pp. 508-516
Author(s): Matteo Taroni, Jacopo Selva

Abstract The estimation of earthquake size distribution parameters is one of the most important steps in any seismic hazard study. The GR_EST toolbox is source code written for OCTAVE/MATLAB (Eaton et al., 2019; MATLAB, 2019) that allows these parameters to be estimated properly, including the estimation of the associated uncertainties. The toolbox contains functions to perform the parameter estimation for both instrumental and historical seismic catalogs, also accounting for time-varying magnitude completeness. Different functional forms for the magnitude–frequency distribution and different strategies for the estimation of its parameters and the corresponding uncertainties are included. To guide seismologists in the use of this toolbox, a set of complete examples is provided as “how-to” use cases.
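As a rough illustration of one ingredient such a toolbox handles, the snippet below computes annual rates per magnitude class while counting each class only over its own completeness window. This is generic Python, not the GR_EST API, and the completeness years and toy catalog are invented for the example.

```python
import numpy as np

# Invented magnitude-dependent completeness: year from which each class is assumed complete
COMPLETENESS = {4.5: 1960, 5.0: 1900, 5.5: 1850, 6.0: 1700}
END_YEAR = 2020

def annual_rates_by_class(mags, years):
    """Annual rate per magnitude class, counted only within that class's completeness window."""
    mags, years = np.asarray(mags), np.asarray(years)
    lows = sorted(COMPLETENESS)
    highs = lows[1:] + [np.inf]
    rates = {}
    for lo, hi in zip(lows, highs):
        start = COMPLETENESS[lo]
        n = np.sum((mags >= lo) & (mags < hi) & (years >= start))
        rates[lo] = n / (END_YEAR - start)
    return rates

# Toy historical-plus-instrumental catalog
rng = np.random.default_rng(7)
mags = np.round(rng.exponential(0.43, 400) + 4.5, 1)
years = rng.integers(1700, END_YEAR, size=400)
print(annual_rates_by_class(mags, years))
```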



2020, Vol 91 (6), pp. 3585-3594
Author(s): Manuel Álvarez-Martí-Aguilar

Abstract In this article, the original accounts of the first 19 earthquakes—occurring before A.D. 881—recorded in Martínez-Solares and Mezcua’s Catálogo sísmico de la Península Ibérica (Martínez-Solares and Mezcua, 2002) are reviewed. Their evolution is traced through references to them in the works of Spanish and Portuguese historians and authors published between the sixteenth and nineteenth centuries, and it is shown how they subsequently made their way into the main Spanish and Portuguese seismic compilations and catalogs. By identifying the first references to news of historical earthquakes in the Iberian Peninsula in the literary sources, the intention is to gain a better understanding of the context in which this information originated over time and to verify its historicity with greater precision. The review performed here shows that the majority of these accounts lack a firm historical basis.



Symmetry, 2020, Vol 12 (5), pp. 778
Author(s): Xuan He, Luyang Wang, Zheng Liu, Yiwen Liu

Some research shows that seismic activities exhibit a space-time symmetry, and they have recently been studied using complex network theory. We study earthquake network similarity using seismic catalogs from the same region over given periods of time. In this paper, we first calculate the distance between feature vectors that represent the topological properties of different networks. Using this method, we obtain a hierarchical clustering of earthquake networks in the same region. We find that similar networks are not those of adjacent years but those separated by decades. To study the periodicity of similar earthquake networks in the same region, we use wavelet analysis to obtain the possible periods at different time scales for the world as a whole, California, and Japan. Some of the possible periods are consistent with results already reported by seismologists. Studying similar seismic activities from a complex-network perspective gives seismologists a new way to investigate the laws of earthquake occurrence and may suggest possible research directions for earthquake prediction.
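A minimal sketch of the clustering step described above: per-period network feature vectors are compared with a pairwise distance and grouped with agglomerative clustering. The feature vectors here are synthetic stand-ins (the actual topological features and distance metric of the paper are not reproduced).

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import pdist

# Synthetic stand-in: one feature vector per year, e.g. (mean degree, clustering
# coefficient, assortativity, ...) of that year's earthquake network
rng = np.random.default_rng(5)
years = np.arange(1980, 2020)
features = rng.normal(size=(years.size, 4)) + np.sin(years[:, None] / 8.0)  # toy decadal cycle

distances = pdist(features, metric="euclidean")  # distance between network feature vectors
tree = linkage(distances, method="average")      # hierarchical clustering of yearly networks
labels = fcluster(tree, t=3, criterion="maxclust")

for cluster_id in np.unique(labels):
    print("cluster", cluster_id, ":", years[labels == cluster_id].tolist())
```

With real catalogs, each row of `features` would be computed from the earthquake network built for that year or period, and the dendrogram (rather than a fixed cluster count) would reveal which periods produce similar networks.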



2020
Author(s): Josipa Majstorović, Piero Poli

Machine learning (ML) algorithms have already found application in standard seismological procedures, such as earthquake detection and localization, phase picking, earthquake early warning systems, etc. They are progressively becoming superior methods, since one can rapidly scan voluminous data and detect earthquakes even when they are buried in highly noisy time series.

Here we make use of ML algorithms to obtain more complete near-fault seismic catalogs and thus better understand the long-term (decades) evolution of seismicity before the occurrence of large earthquakes. We focus on data recorded before the devastating L’Aquila earthquake (6 April 2009 01:32 UTC, Mw 6.3), located right beneath the city of L’Aquila in the Abruzzo region (Central Italy). Before this event only sparse stations were available, limiting the completeness of standard catalogs.

We adapted existing convolutional neural networks (CNNs) for earthquake detection, localisation and characterization using single-station waveforms. The CNN is applied to 29 years of data (1990 to 2019) recorded at the AQU station, located near the city of L’Aquila (Italy). The pre-existing catalog maintained by the Istituto Nazionale di Geofisica e Vulcanologia is used to define labels and to train and test the CNN. We are interested in classifying the continuous three-component waveforms into four categories: noise/earthquake, distance (location), magnitude, and depth, where each category consists of several nodes. Existing seismic catalogs are used to label earthquakes, while noise examples are randomly selected between the catalog events, evenly represented by daytime and night-time periods.

We prefer CNNs over other methods, since we can use seismograms directly with very minor pre-processing (e.g. filtering) and we do not need any prior knowledge of the region.
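The sketch below is a generic single-station 1-D CNN classifier in PyTorch, assuming three-component windows of 3,000 samples and a simple noise/earthquake output head; it is not the authors' adapted architecture, and all layer sizes are illustrative. Additional heads for distance, magnitude, and depth classes could be attached to the same feature extractor.

```python
import torch
import torch.nn as nn

class WaveformCNN(nn.Module):
    """Toy 1-D CNN: classifies a 3-component waveform window into noise vs. earthquake."""
    def __init__(self, n_classes=2):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(3, 16, kernel_size=7, stride=2, padding=3), nn.ReLU(),
            nn.Conv1d(16, 32, kernel_size=7, stride=2, padding=3), nn.ReLU(),
            nn.Conv1d(32, 64, kernel_size=7, stride=2, padding=3), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),               # collapse the time axis to one value per channel
        )
        self.classifier = nn.Linear(64, n_classes)

    def forward(self, x):                          # x: (batch, 3 components, n_samples)
        return self.classifier(self.features(x).squeeze(-1))

# One forward pass on a random batch of minimally pre-processed (e.g. filtered) windows
model = WaveformCNN()
batch = torch.randn(8, 3, 3000)
logits = model(batch)                              # (8, 2): noise vs. earthquake scores
print(logits.shape)
```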


