Attenuation of Intensity for the Zemmouri Earthquake of 21 May 2003 (Mw 6.8): Insights for the Seismic Hazard and Historical Earthquake Sources in Northern Algeria

Author(s):
S. Maouche
A. Harbi
M. Meghraoui

2020, Vol. 91 (3), pp. 1531-1541

Author(s):
Paul G. Richards
Margaret Hellweg

Abstract Quantitative seismology is based firmly on the analysis of actual ground motions, and the transition to digital recording in the 1980s enabled sophisticated new capabilities for extracting useful results from waveforms. With some effort, these tools can also be applied to analog records. Focusing on assets available within U.S. institutions, we review the necessary steps and the challenges in enabling "data rescue", that is, preserving the scientific information latent in large analog seismogram archives and making it usable. These steps include: determining what assets are available (the analog seismogram archives held by various institutions, with associated metadata on instrument responses, station locations, and timing information); developing a consensus on the top level of a triage process (which analog records most definitely should be rescued?); deciding the level of quality needed when copying original seismograms to media suitable for digitizing; assessing the relative merits of scanning and digitizing; and establishing a community service to distribute scans and digital records as they accumulate. The necessary level of effort can benefit from practical experience. For example, specific studies have used digitized versions of analog recordings to model earthquake sources and assess seismic hazard. Other studies have used them to gain experience with nuclear explosion signals recorded at regional distances, noting that regional signals enable explosions to be monitored down to levels much lower than those attainable teleseismically. Large archives of analog seismograms offer current and future seismologists studying earthquakes and explosions insights into the practical areas of assessing seismic hazard, monitoring for test-ban compliance down to low explosion yields, and promptly characterizing actual explosions should they occur, as well as into the traditional academic pursuit of a better understanding of earthquake physics.
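To make the digitizing step above concrete, here is a minimal sketch (not from the article) of one common operation: converting trace coordinates picked from a scanned paper record into an evenly sampled time series. The function name, the 60 mm/min drum speed, and the 20 Hz output rate are placeholder assumptions.

```python
import numpy as np

def trace_to_timeseries(x_mm, y_mm, paper_speed_mm_per_min=60.0, fs=20.0):
    """Resample digitized trace points (x along the record and pen deflection y,
    both in mm) onto an evenly spaced time axis.

    The drum speed and output sampling rate are illustrative defaults only.
    """
    x_mm = np.asarray(x_mm, dtype=float)
    y_mm = np.asarray(y_mm, dtype=float)

    # Convert position along the record to time in seconds using the drum speed.
    t = (x_mm - x_mm[0]) / (paper_speed_mm_per_min / 60.0)

    # Build an evenly spaced time axis and linearly interpolate the deflection.
    t_even = np.arange(0.0, t[-1], 1.0 / fs)
    y_even = np.interp(t_even, t, y_mm)

    # Remove the mean so the trace oscillates around zero before any
    # instrument-response correction is applied downstream.
    return t_even, y_even - y_even.mean()

# Synthetic example: 400 picked points spanning 120 mm of record.
x = np.linspace(0.0, 120.0, 400)
y = 5.0 * np.sin(0.8 * x)
t, amp = trace_to_timeseries(x, y)
```

Real workflows would follow this with timing corrections and removal of the instrument response, using the station metadata mentioned in the abstract.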


Author(s):
Mohamed Hamdache
José A. Peláez
AbdelKarim Yelles-Chaouche
Ricardo Monteiro
Mario Marques
...

2017, Vol. 17 (11), pp. 2017-2039

Author(s):
Alessandro Valentini
Francesco Visini
Bruno Pace

Abstract. Italy is one of the most seismically active countries in Europe. Moderate to strong earthquakes, with magnitudes of up to ∼ 7, have been historically recorded on many active faults. Currently, probabilistic seismic hazard assessments in Italy are mainly based on area source models, in which seismicity is modelled using a number of seismotectonic zones and the occurrence of earthquakes is assumed to be uniform. However, in the past decade, efforts have increasingly been directed towards using fault sources in seismic hazard models to obtain more detailed and potentially more realistic patterns of ground motion. In our model, we used two categories of earthquake sources. The first comprises active faults, for which geological slip rates are used to quantify the seismic activity rate. We produced an inventory of all fault sources with details of their geometric, kinematic, and energetic properties, and used the associated parameters to compute the total seismic moment rate of each fault. We evaluated the magnitude–frequency distribution (MFD) of each fault source using two models: a characteristic Gaussian model centred on the maximum magnitude and a truncated Gutenberg–Richter model. The second category comprises grid-point seismicity, in which a fixed-radius smoothing approach applied to a historical catalogue is used to evaluate seismic activity. Under the assumption that deformation is concentrated along faults, we combined the MFD derived from the geometry and slip rates of the active faults with the MFD from the spatially smoothed earthquake sources, assuming that the smoothed seismic activity in the vicinity of an active fault gradually decreases by a fault-size-driven factor. Additionally, we computed horizontal peak ground acceleration (PGA) maps for return periods of 475 and 2475 years. Although the ranges and gross spatial distributions of the expected accelerations obtained here are comparable to those obtained through methods involving seismic catalogues and classical zonation models, the spatial pattern of the hazard maps obtained with our model is far more detailed: the most hazardous areas correspond to mapped active faults, whereas previous models yield expected accelerations that are almost uniformly distributed across large regions. In addition, we conducted sensitivity tests to determine the impact on the hazard results of the earthquake rates derived from the two MFD models for faults and to determine the relative contributions of faults versus distributed seismic activity. We believe that our model represents an advance in terms of the input data (quantity and quality) and the methodology used in fault-based regional seismic hazard modelling in Italy.
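As a rough illustration of the fault-source step described above (geometry and slip rate to moment rate to MFD), the following sketch balances a truncated Gutenberg-Richter distribution against a fault's geologic moment rate. It is not the authors' code; the b-value, rigidity, and the Hanks-Kanamori constant (9.05 for moment in N·m) are standard values assumed here rather than taken from the paper.

```python
import numpy as np

def truncated_gr_rates(area_km2, slip_rate_mm_yr, m_min, m_max,
                       b_value=1.0, rigidity_pa=3.0e10, dm=0.1):
    """Annual earthquake rates per magnitude bin for a fault, from a truncated
    Gutenberg-Richter shape balanced against the geologic moment rate.
    Parameter defaults are illustrative assumptions."""
    # Geologic seismic moment rate: M0_dot = mu * A * s, in N*m per year.
    moment_rate = rigidity_pa * (area_km2 * 1e6) * (slip_rate_mm_yr * 1e-3)

    # Magnitude bin centres between m_min and m_max.
    mags = np.arange(m_min + dm / 2.0, m_max, dm)

    # Relative incremental G-R shape and moment per event (Hanks-Kanamori).
    shape = 10.0 ** (-b_value * mags)
    moment_per_event = 10.0 ** (1.5 * mags + 9.05)

    # Scale the shape so the summed moment release matches the fault moment rate.
    scale = moment_rate / np.sum(shape * moment_per_event)
    return mags, scale * shape

# Example: a ~30 km x 15 km fault slipping at 1 mm/yr, Mw 5.0 to 6.8.
mags, rates = truncated_gr_rates(area_km2=450.0, slip_rate_mm_yr=1.0,
                                 m_min=5.0, m_max=6.8)
```

A characteristic Gaussian MFD, the alternative model named in the abstract, would replace the `shape` array with a normal density centred near the maximum magnitude before the same moment balancing.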


2012, Vol. 39 (19), pp. n/a-n/a

Author(s):
Luca Malagnini
Robert B. Herrmann
Irene Munafò
Mauro Buttinelli
Mario Anselmi
...

1996, Vol. 86 (4), pp. 1019-1027

Author(s):
Livio Sirovich

Abstract Probabilistic calculation of regional seismic hazard maps also requires the use of so-called "attenuation relations," which give the reference "shake-ability" at given distances from the earthquake sources. This article reports progress in this area. Tests on a series of earthquakes in California (San Fernando, 1971; Whittier Narrows, 1987; Northridge, 1994) suggest that, in some regions, the areal shapes of the territories damaged by past earthquakes can be traced synthetically, sometimes remarkably well, with a simple algorithm that considers only some gross features of the sources, and this is compatible with theory. The algorithm appears to give rather stable results. Moreover, when the detailed modeling techniques available today are inapplicable for lack of data, or to save time and money, it could be used to improve seismic hazard calculations and, conversely, to retrieve information about earthquake sources from the preinstrumental era.
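For readers unfamiliar with the "attenuation relations" mentioned above, a generic macroseismic form predicts intensity from magnitude and distance, roughly I = c1 + c2*M - c3*log10(R) - c4*R with R the hypocentral distance. The sketch below uses placeholder coefficients chosen only to show the shape of such a relation; they are not values from this article or from Sirovich's algorithm.

```python
import numpy as np

def intensity_at_distance(mag, epi_dist_km, depth_km=10.0,
                          c1=1.5, c2=1.5, c3=3.0, c4=0.002):
    """Generic intensity-attenuation form I = c1 + c2*M - c3*log10(R) - c4*R,
    with R the hypocentral distance in km. Coefficients are placeholders."""
    r = np.sqrt(np.asarray(epi_dist_km, dtype=float) ** 2 + depth_km ** 2)
    return c1 + c2 * mag - c3 * np.log10(r) - c4 * r

# Example: predicted intensities for an M 6.6 event at 5, 20, and 80 km.
print(intensity_at_distance(6.6, np.array([5.0, 20.0, 80.0])))
```

In practice such coefficients are regressed from regional intensity data, which is why source-region-specific relations matter for the hazard calculations discussed in the abstract.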


2006, Vol. 163 (1), pp. 119-135

Author(s):
José A. Peláez
M. Hamdache
Carlos López Casado

2004, Vol. 8 (1), pp. 1-10

Author(s):
M.S. Boughacha
M. Ouyed
A. Ayadi
H. Benhallou
