Application of an ensemble earthquake rate model in Italy, considering seismic catalogs and fault moment release

2020 ◽  
Vol 63 (6) ◽  
Author(s):  
Maura Murru ◽  
Giuseppe Falcone ◽  
Matteo Taroni ◽  
Rodolfo Console

We develop an ensemble earthquake rate model that provides spatially variable, time-independent (Poisson) long-term annual occurrence rates of seismic events throughout Italy, in magnitude bins of 0.1 units from Mw 4.5 upward and in spatial cells of 0.1° × 0.1°. We weighted the seismic activity rates of smoothed-seismicity and fault-based inputs and merged them into a single ensemble earthquake rupture forecast model. Both inputs adopt a tapered Gutenberg-Richter relation with a single b-value and a single corner magnitude estimated from the earthquake catalogs. The spatially smoothed seismicity was obtained using the classical kernel smoothing method, with magnitude-dependent completeness periods, applied to the historical (CPTI15) and instrumental seismic catalogs. For each seismogenic source provided by the Database of Individual Seismogenic Sources (DISS), we computed the annual rate of events above Mw 4.5, assuming that the seismic moments of the earthquakes generated by each fault are distributed according to the tapered Gutenberg-Richter relation with the same parameters as the smoothed seismicity models. Comparing the annual rates of the catalogs with those of the seismogenic sources, we found good agreement in the Central Apennines, whereas the fault-based rates are higher than the catalog rates in north-eastern and southern Italy. We also tested our model against the strong Italian earthquakes (Mw 5.5+) to check whether the total number (N-test) and the spatial distribution (S-test) of these events were compatible with our model, obtaining good results, i.e. high p-values in the tests. The final model will be a branch of the new Italian seismic hazard map.
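The tapered Gutenberg-Richter relation used by both model inputs can be sketched as follows; this is a minimal illustration in moment form (Kagan-style taper), where the b-value, corner magnitude, and normalising rate are all placeholder values, not the paper's fitted parameters:

```python
import math

def moment(mw):
    """Scalar seismic moment (N·m) from moment magnitude (Hanks & Kanamori)."""
    return 10 ** (1.5 * mw + 9.05)

def tapered_gr_rate(mw, rate_min, mw_min=4.5, b=1.0, corner_mw=7.5):
    """Annual rate of events with magnitude >= mw under a tapered
    Gutenberg-Richter law, normalised so the rate at mw_min is rate_min.
    b and corner_mw are illustrative, not the paper's estimates."""
    beta = (2.0 / 3.0) * b                       # moment-domain exponent
    m, m_min, m_c = moment(mw), moment(mw_min), moment(corner_mw)
    return rate_min * (m_min / m) ** beta * math.exp((m_min - m) / m_c)

def bin_rate(mw, rate_min, dm=0.1, **kw):
    """Annual rate within a magnitude bin [mw, mw + dm)."""
    return tapered_gr_rate(mw, rate_min, **kw) - tapered_gr_rate(mw + dm, rate_min, **kw)
```

The taper suppresses rates near the corner magnitude while preserving the power-law behaviour at lower magnitudes, which is why a single (b, corner magnitude) pair can be shared between the smoothed-seismicity and fault-based inputs.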

Proceedings ◽  
2019 ◽  
Vol 24 (1) ◽  
pp. 7
Author(s):  
Sandeep Kumar Aggarwal

Talala is an excellent example of triggered neo-tectonic seismicity between two dams during a monsoon. An earthquake of Mmax 5.1 occurred on 6 November 2007 at 21.16° N, 70.54° E, with a focal depth of 4.5 km, and its complete sequence was recorded for the first time on a modern broadband sensor. This dam/monsoon-induced earthquake was preceded by 18 foreshocks of 2 ≤ Mw ≤ 4.8 within 9 h 11 min, as well as by smaller shocks that may not have been recorded because of sparse network coverage. After the deployment of local mobile observatories, aftershocks of Mw ≥ 1.0 were recorded; these continued for months and subsided to background seismicity after four months. The same phenomenon repeated with an Mmax 5.0 event on 20 October 2011 at 21.06° N, 70.50° E (focal depth 5.5 km), which implies that it again took nearly four years to build up the potential to generate dam/monsoon-induced seismicity. This sequence was recorded by a network of 10 broadband seismographs (three in the Talala area and seven at epicentral distances of 30 to 300 km). Centroid Moment Tensor (CMT) solutions and spectral source parameters of the mainshock and aftershocks were evaluated to understand the seismotectonics of the region. The CMT depicts major strike-slip motion along an ENE-WSW, left-lateral plane at 4.5 km depth, indicating a sympathetic fault extension of the Son-Narmada fault. The source parameters of 400 shocks of Mw 1.0 to 5.1 yield seismic moments of 10^11 to 10^16.5 N·m, source radii of 120–850 m, and stress drops of 0.003 to 25.43 MPa. The b-value, p-value, fractal dimension, and slip on the different faults were also estimated. A comparison between the Talala and Koyna dam-induced source parameters places this seismicity in the context of induced sequences from other parts of the world.
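The relationship between seismic moment, source radius, and stress drop quoted above can be illustrated with the standard circular-crack formula; this is a generic Eshelby/Brune-type sketch, not necessarily the exact estimator used in the study:

```python
def stress_drop_mpa(m0_nm, radius_m):
    """Static stress drop (MPa) for a circular crack of radius r (m)
    with seismic moment M0 (N·m): delta_sigma = 7*M0 / (16*r^3)."""
    return 7.0 * m0_nm / (16.0 * radius_m ** 3) / 1e6
```

Plugging in M0 = 10^14 N·m and r = 300 m gives a stress drop of about 1.6 MPa, which sits comfortably inside the reported 0.003–25.43 MPa range.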


2003 ◽  
Vol 3 (1/2) ◽  
pp. 129-134 ◽  
Author(s):  
T. M. Tsapanos ◽  
G. A. Papadopoulos ◽  
O. Ch. Galanis

Abstract. A Bayesian statistical approach is applied to the seismogenic sources of Greece and the surrounding area in order to assess seismic hazard, assuming that earthquake occurrence follows a Poisson process. The Bayesian approach supplies the probability that a certain cut-off magnitude of Ms = 6.0 will be exceeded in time intervals of 10, 20 and 75 years. We also produced graphs presenting the seismic hazard of each seismogenic source examined in terms of these varying probabilities, which is useful for engineering and civil protection purposes, allowing the designation of priority sources for earthquake-resistant design. It is shown that within the above time intervals the seismogenic source (4) called Igoumenitsa (in NW Greece and western Albania) has the highest probability of experiencing an earthquake with magnitude M > 6.0. High probabilities are also found for Ochrida (source 22), Samos (source 53) and Chios (source 56).
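Under the Poisson assumption, the exceedance probability for each time interval follows directly from a source's annual rate of M ≥ 6.0 events; a minimal sketch with an illustrative rate (one such event every 50 years on average, an assumed figure, not a value from the paper):

```python
import math

def exceedance_prob(annual_rate, years):
    """Probability of at least one event in `years` years, assuming
    Poisson occurrence with mean annual rate `annual_rate`."""
    return 1.0 - math.exp(-annual_rate * years)

# Illustrative source: one M >= 6.0 event every 50 years on average.
probs = {t: exceedance_prob(1.0 / 50.0, t) for t in (10, 20, 75)}
```

The Bayesian step in the paper effectively places a distribution on the rate itself; the formula above is the likelihood around which that analysis is built.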


Author(s):  
M. I. Borrajo ◽  
C. Comas ◽  
S. Costafreda-Aumedes ◽  
J. Mateu

Abstract. Wildlife-vehicle collisions on road networks represent a problem at the interface between human populations and the environment, one that affects wildlife management and poses a risk to the life and safety of drivers. We propose a statistically principled method for kernel smoothing of point pattern data on a linear network when the first-order intensity depends on covariates. In particular, we present a consistent kernel estimator for the first-order intensity function that uses a convenient relationship between the intensity and the density of event locations over the network, and that also exploits the theoretical relationship between the original point process on the network and its transformed process through the covariate. We derive the asymptotic bias and variance of the estimator, and adapt some data-driven bandwidth selectors to estimate the optimal bandwidth. The performance of the estimator is analysed through a simulation study under inhomogeneous scenarios. We present a real data analysis of wildlife-vehicle collisions in a region of north-eastern Spain.
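The core idea of kernel estimation of a first-order intensity can be sketched in one dimension; this toy version deliberately ignores the network geometry, edge correction, and covariate transformation that constitute the paper's actual contribution:

```python
import math

def kernel_intensity(x, events, bandwidth):
    """Gaussian kernel estimate of the first-order intensity (events per
    unit length) at location x, given a list of 1-D event locations.
    A toy analogue of network kernel smoothing: no edge correction,
    no covariates, no network geometry."""
    gauss = lambda u: math.exp(-0.5 * u * u) / math.sqrt(2.0 * math.pi)
    return sum(gauss((x - xi) / bandwidth) for xi in events) / bandwidth
```

Locations near a cluster of events receive high estimated intensity, while locations far from all events receive intensity near zero; the bandwidth controls how sharply that contrast falls off, which is why data-driven bandwidth selection matters.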


Author(s):  
Jacob Deasy ◽  
Emma Rocheteau ◽  
Katharina Kohler ◽  
Daniel J. Stubbs ◽  
Pietro Barbiero ◽  
...  

Abstract. The COVID-19 pandemic has led to unprecedented strain on intensive care unit (ICU) admission in parts of the world. Strategies to create surge ICU capacity require complex local and national service reconfiguration and the reduction or cancellation of elective activity. These measures have an inevitable lag time before additional capacity comes on-line. An accurate short-range forecast would be helpful in guiding such difficult, costly, and ethically challenging decisions.

At the time this work began, cases in England were starting to increase. If this represents a true spread of disease, then ICU demand could increase rapidly. Here we present a short-range forecast based on published real-time COVID-19 case data from the seven National Health Service (NHS) commissioning regions in England (East of England, London, Midlands, North East and Yorkshire, North West, South East and South West). We use a Monte Carlo approach to model the likely impact of current diagnoses on regional ICU capacity over a 14-day horizon, under the assumption that the increase in cases represents the start of exponential growth in infections. Our model is designed to be parsimonious and is based on epidemiological data available in the literature at the time of writing.

On the basis of the modelling assumptions made, ICU occupancy is likely to increase dramatically in the days following the time of modelling. If the current exponential growth continues, case numbers will be comparable to current ICU bed numbers within weeks. Despite variable growth in absolute patient numbers, all commissioning regions are forecast to be heavily burdened under the assumptions used.

Whilst, as with any forecast model, uncertainties remain both in model specification and in robust epidemiological data in this early prospective phase, it would seem that surge capacity will be required in the very near future. Our findings should be interpreted with caution, but we hope that our model will help policy decision makers with their preparations. The uncertainties in the data highlight the urgent need for ongoing real-time surveillance so that forecasts can be constantly updated using high-quality, local, patient-facing data as it emerges.
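A Monte Carlo forecast of this general kind can be sketched as follows; all parameter values (growth rate, ICU admission probability, length of stay) are placeholder assumptions for illustration, not the values calibrated in the paper:

```python
import random

def forecast_icu_occupancy(daily_cases, growth_rate, p_icu, los_days,
                           horizon=14, n_sims=200, seed=0):
    """Monte Carlo sketch: cases grow exponentially from `daily_cases`;
    each case independently needs ICU with probability `p_icu`; occupancy
    at the horizon is the sum of admissions over the last `los_days` days
    (a fixed length of stay). Returns the median occupancy across runs.
    All parameter values are illustrative assumptions."""
    rng = random.Random(seed)
    occupancies = []
    for _ in range(n_sims):
        cases, admissions = float(daily_cases), []
        for _ in range(horizon):
            cases *= 1.0 + growth_rate
            # Binomial draw via independent Bernoulli trials per case.
            admissions.append(sum(1 for _ in range(int(cases))
                                  if rng.random() < p_icu))
        occupancies.append(sum(admissions[-los_days:]))
    occupancies.sort()
    return occupancies[n_sims // 2]
```

Running the sketch with a higher assumed growth rate produces sharply higher median occupancy at the 14-day horizon, which is the qualitative behaviour the abstract describes.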


2020 ◽  
Author(s):  
Gianluca Valensise ◽  
Roberto Basili ◽  
Pierfrancesco Burrato ◽  
Umberto Fracassi ◽  
Vanja Kastelic ◽  
...  

The prototype version of the DISS was launched and published in July 2000. Twenty years later we present an appraisal of how the database started off, how it evolved, and how it served the seismological and engineering communities.

During the early years of its development we learned that the three fundamental requirements of any SHA-oriented fault database are:

1) the capacity to represent seismogenic sources in 3D, thus providing a standardized quantitative basis for subsequent SHA calculations and stressing the hierarchical relationships among all existing active faults;

2) completeness, i.e. the ability to portray the vast majority of seismogenic sources existing in the region of relevance and to progressively address the emerging lack of knowledge;

3) the reliability of the geometrical parameters of each seismogenic source and of the relevant slip and strain rates, and the ability to assess the associated uncertainties.

Given these requirements, we found it hard to build a database around existing studies of individual large faults, which are often carried out for non-SHA purposes; as such they do not necessarily involve a 3D delineation and a hierarchization of the master fault. Furthermore, most published studies concern surface-breaking faults occurring onshore; they are most relevant to surface-faulting hazard, but in shaking-oriented SHA they are less crucial than deeper, hidden faults.

We initially developed the concept of the "Individual Seismogenic Source" (ISS), a simplified but geometrically coherent representation of the presumed causative fault of the largest earthquakes of the investigated region. An ISS is based on original observations, seismological/geophysical evidence, and literature data. Since large portions of the Italian territory are characterized by blind or hidden faulting, we developed strategies based on the analysis of geomorphic evidence for cumulative tectonic strain, on the reappraisal of commercial seismic lines and subsurface data, and on geological and geodetic evidence.

In 2005 we introduced the "Composite Seismogenic Sources" (CSSs): generalized, unsegmented sources designed to increase the database's geographic coverage and completeness, based on the same type of information used for the ISSs and on regional-scale synopses of ongoing tectonic strain. Their identification was progressively extended to offshore areas, often scarcely considered in traditional fault mapping. In 2015 we also introduced the 3D definition of the subduction slabs and associated interfaces for the whole Mediterranean region.

The ISSs are routinely used in engineering applications aimed at investigating the shaking scenario associated with known earthquakes or well-identified quiescent fault segments. In contrast, the CSSs are not assumed to be capable of an earthquake of a specific size; as such, they can be used in any standard PSHA procedure after estimating their activity rate and frequency-magnitude distribution, based on tectonic slip rates integrated with the record of past earthquakes and GPS-determined strains, or derived from regional-scale geodynamic models.

DISS also served as a template for developing EDSF, the European Database of Seismogenic Faults. Over the years, DISS and EDSF have become the basic geological input for PSHA and PTHA, both at the Italian scale (MPS04, MPS19, MPTS19) and at the European scale (ESHM13, ESHM20, NEAMTHM18).


2021 ◽  
Author(s):  
Jack N. Williams ◽  
Luke N. J. Wedmore ◽  
Åke Fagereng ◽  
Maximilian J. Werner ◽  
Hassan Mdala ◽  
...  

Abstract. Active fault data are commonly used in seismic hazard assessments, but there are challenges in deriving the slip rate, geometry, and frequency of earthquakes along active faults. Herein, we present the open-access geospatial Malawi Seismogenic Source Database (MSSD), which describes the seismogenic properties of faults that have formed during East African rifting in Malawi. We first use empirical observations to geometrically classify active faults into section, fault, and multi-fault seismogenic sources. For sources in the North Basin of Lake Malawi, slip rates can be derived from the vertical offset of a seismic reflector estimated, from dated core, to be 75 ka old. Elsewhere, slip rates are constrained by advancing a 'systems-based' approach that partitions geodetically derived rift extension rates in Malawi between seismogenic sources, using a priori constraints on regional strain distribution in magma-poor continental rifts. Slip rates are then combined with source geometry and empirical scaling relationships to estimate earthquake magnitudes and recurrence intervals, with uncertainty described by the variability of outcomes from a logic tree used in these calculations. We find that for sources in Lake Malawi's North Basin, where slip rates can be derived from both the geodetic data and the offset seismic reflector, the two slip rate estimates are within error of each other, although those from the offset reflector are higher. Sources in the MSSD are 5–200 km long, which implies that large-magnitude (Mw 7–8) earthquakes may occur in Malawi. Low slip rates (0.05–2 mm/yr), however, mean that the frequency of such events will be low (recurrence intervals of ~10^3–10^4 years). The MSSD represents an important resource for investigating Malawi's increasing seismic risk and provides a framework for incorporating active fault data into seismic hazard assessment in other tectonically active regions.
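The final step, combining slip rate, source geometry, and an empirical scaling relationship into magnitude and recurrence estimates, can be sketched as follows; the scaling coefficients, shear modulus, and the characteristic-earthquake assumption are all illustrative, not the MSSD logic-tree values:

```python
import math

SHEAR_MODULUS = 3.2e10  # Pa; an assumed crustal value

def char_magnitude(length_km, a=5.08, b=1.16):
    """Characteristic Mw from rupture length via an empirical scaling
    relation; the coefficients are illustrative (Wells & Coppersmith-style),
    not the MSSD logic-tree branches."""
    return a + b * math.log10(length_km)

def recurrence_interval_yr(length_km, width_km, slip_rate_mm_yr):
    """Mean recurrence interval assuming the full fault area ruptures in
    characteristic events and all slip is released seismically."""
    mw = char_magnitude(length_km)
    m0 = 10 ** (1.5 * mw + 9.05)                  # seismic moment, N·m
    area = length_km * width_km * 1e6             # fault area, m^2
    slip_per_event = m0 / (SHEAR_MODULUS * area)  # average slip, m
    return slip_per_event / (slip_rate_mm_yr * 1e-3)
```

For a hypothetical 100 km × 30 km source slipping at 0.5 mm/yr, this gives Mw ≈ 7.4 and a recurrence interval of roughly 3,000 years, consistent with the magnitude and recurrence ranges quoted above.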


2015 ◽  
Vol 105 (5) ◽  
pp. 2538-2554 ◽  
Author(s):  
P. Bird ◽  
D. D. Jackson ◽  
Y. Y. Kagan ◽  
C. Kreemer ◽  
R. S. Stein

Vaccines ◽  
2021 ◽  
Vol 9 (7) ◽  
pp. 728
Author(s):  
Tareq Hussein ◽  
Mahmoud H. Hammad ◽  
Pak Lun Fung ◽  
Marwan Al-Kloub ◽  
Issam Odeh ◽  
...  

In this study, we proposed three simple approaches to forecast reported COVID-19 cases in a Middle Eastern society (Jordan). The first approach was a short-term forecast (STF) model based on a linear forecast model that uses the previous days as a learning database. The second approach was a long-term forecast (LTF) model based on a mathematical formula that best described the current pandemic situation in Jordan. The two approaches are complementary: the STF can cope with sudden daily changes in the pandemic, whereas the LTF can be utilized to predict the occurrence and strength of upcoming waves. As such, the third approach was a hybrid forecast (HF) model merging the STF and LTF models. The HF was shown to be an efficient forecast model with excellent accuracy. It is evident that the decision to enforce the curfew at an early stage, followed by the planned lockdown, was effective in averting a serious wave in April 2020. Vaccination has been effective in combating COVID-19 by reducing infection rates. Based on the forecasting results, there is some possibility that Jordan may face a third wave of the pandemic during the summer of 2021.
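A short-term linear forecast of the kind described can be sketched as a least-squares straight-line fit over a trailing window of daily counts; the window length and the internal details of the actual STF model are assumptions here:

```python
def stf_forecast(cases, window=7, horizon=1):
    """Least-squares straight-line fit to the last `window` daily counts,
    extrapolated `horizon` days ahead. A sketch of a linear short-term
    forecast; the real STF model's details are assumptions here."""
    y = cases[-window:]
    n = len(y)
    x_bar = (n - 1) / 2.0
    y_bar = sum(y) / n
    slope = (sum((x - x_bar) * (yi - y_bar) for x, yi in enumerate(y))
             / sum((x - x_bar) ** 2 for x in range(n)))
    return y_bar + slope * ((n - 1 + horizon) - x_bar)
```

Because the fit is re-run each day on the most recent window, the forecast adapts quickly to sudden changes, which is the property attributed to the STF above; the LTF component would instead fit a single curve to the whole epidemic history.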


2020 ◽  
Vol 224 (2) ◽  
pp. 1174-1187
Author(s):  
Matteo Taroni ◽  
Aybige Akinci

SUMMARY The classical procedure of probabilistic seismic hazard analysis (PSHA) requires a Poissonian distribution of earthquakes. Seismic catalogues follow a Poisson distribution only after the application of a declustering algorithm that leaves a single earthquake for each seismic sequence (usually the strongest, i.e. the mainshock). Removing earthquakes from seismic catalogues leads to underestimation of the annual event rates and, consequently, to lower seismic hazard, as indicated by several studies. In this study, we investigate the performance of two declustering methods on the Italian instrumental catalogue and the impact of declustering on estimation of the b-value and on seismic hazard analysis. To this end, the spatial variation in the seismicity rate was first estimated from the declustered catalogues using the adaptive smoothed seismicity approach, considering small earthquakes (Mw ≥ 3.0). We then corrected the seismicity rates using a new approach that counts all events in the complete seismic catalogue by simply changing the magnitude-frequency distribution. The impact of declustering on seismic hazard analysis is illustrated using PSHA maps in terms of peak ground acceleration and spectral acceleration at 2 s, with 10 per cent and 2 per cent probability of exceedance in 50 yr, for Italy. We observed that the hazard calculated from the declustered catalogues was always lower than that computed using the complete catalogue. These results agree with previous results obtained in different parts of the world.
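The b-value estimation step can be illustrated with the standard maximum-likelihood estimator; this is the textbook Aki-Utsu form, not necessarily the exact estimator used in the study:

```python
import math

def b_value_mle(magnitudes, mc, dm=0.1):
    """Maximum-likelihood b-value (Aki 1965) with Utsu's correction for
    magnitudes binned at dm; mc is the completeness magnitude. Events
    below mc are excluded before estimation."""
    complete = [m for m in magnitudes if m >= mc]
    mean_m = sum(complete) / len(complete)
    return math.log10(math.e) / (mean_m - (mc - dm / 2.0))
```

Because declustering removes aftershocks, which tend to be small, it shifts the mean magnitude of the remaining catalogue and hence the estimated b-value, which is one of the effects the study quantifies.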

