Parkfield Earthquake
Recently Published Documents

Total documents: 99 (last five years: 4)
H-index: 22 (last five years: 1)

2021 ◽  
Author(s):  
Ilya Zaliapin ◽  
Yehuda Ben-Zion

We present results aimed at understanding the preparation processes of large earthquakes by tracking progressive localization of earthquake deformation with three complementary analyses: (i) estimated production of rock damage by background events, (ii) spatial localization of background seismicity within damaged areas, and (iii) progressive coalescence of individual earthquakes into clusters. Techniques (i) and (ii) employ declustered catalogs to avoid the occasional strong fluctuations associated with aftershock sequences, while technique (iii) examines developing clusters in the entire catalog. The different techniques provide information on different time scales and on the spatial extent of weakened damaged regions. The analyses reveal the generation of earthquake-induced rock damage on a decadal timescale around eventual rupture zones, and progressive localization of background seismicity on a 2–3 yr timescale before several M > 7 earthquakes in southern and Baja California and M7.9 events in Alaska. This is followed by coalescence of earthquakes into growing clusters that precede the mainshocks. A corresponding analysis around the 2004 M6 Parkfield earthquake in the creeping section of the San Andreas fault shows tendencies contrasting with those associated with the large seismogenic faults. The results are consistent with observations from laboratory experiments and physics-based models of heterogeneous materials not dominated by a pre-existing failure zone. Continuing studies with these techniques, combined with analysis of geodetic data and insights from laboratory experiments and model simulations, may allow the development of an integrated multi-signal procedure to estimate the approaching time and size of large earthquakes.
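The progressive-localization idea in technique (ii) can be illustrated with a toy metric (a hypothetical sketch, not the authors' actual estimator): the fraction of occupied map cells needed to contain most epicenters. A diffuse background produces a large fraction; seismicity concentrating toward an eventual rupture zone produces a small one, so a declining fraction over successive time windows would signal localization.

```python
import numpy as np

rng = np.random.default_rng(1)

def localization_area_fraction(lons, lats, coverage=0.95, n_bins=50):
    """Fraction of grid cells needed to contain `coverage` of the events.

    Cells are ranked by event count; smaller return values indicate more
    spatially localized seismicity. A toy measure for illustration only.
    """
    hist, _, _ = np.histogram2d(lons, lats, bins=n_bins)
    counts = np.sort(hist.ravel())[::-1]          # busiest cells first
    cum = np.cumsum(counts)
    n_needed = np.searchsorted(cum, coverage * cum[-1]) + 1
    return n_needed / hist.size

# Synthetic epicenters (hypothetical): diffuse background vs. events
# concentrated along a line, standing in for a developing fault zone.
diffuse = rng.uniform(0.0, 1.0, size=(2, 5000))
x = rng.uniform(0.0, 1.0, 5000)
localized = np.vstack([x, x + rng.normal(0.0, 0.02, 5000)])

f_diffuse = localization_area_fraction(*diffuse)
f_localized = localization_area_fraction(*localized)
print(f"area fraction, diffuse: {f_diffuse:.3f}; localized: {f_localized:.3f}")
```

Tracking such a fraction in sliding time windows of a declustered catalog is one simple way to turn "progressive localization" into a number that can be plotted against time before a mainshock.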


2020 ◽  
Vol 110 (6) ◽  
pp. 2638-2646
Author(s):  
Asaf Inbal ◽  
Alon Ziv

Abstract: Permanent ground offsets, constituting a prime dataset for constraining final fault-slip distributions, may not be recovered straightforwardly by double integration of near-field accelerograms due to tilt and other distorting effects. Clearly, if a way could be found to recover permanent ground offsets from acceleration records, then static datasets would be enlarged, and thus the resolution of fault-slip inversions would be enhanced. Here, we introduce a new approach for extracting permanent offsets from near-field strong-motion accelerograms. The main advantage of the new approach with respect to previous ones is that it corrects for source time functions of any level of complexity. Its main novelty is the addition of a constraint on the slope of the ground velocity spectra at long periods. We validate the new scheme using collocated accelerograms and Global Navigation Satellite Systems (GNSS) records of the 2011 Mw 9 Tohoku-Oki earthquake, and find good agreement between accelerogram-based and GNSS-based ground offsets over a range of 0.1–5 m. To improve the spatial coverage of permanent ground offsets associated with the 2004 Parkfield earthquake, near-field accelerograms were baseline corrected using the new scheme. Static slip inversion of the combined GNSS-based and accelerogram-based ground displacements indicates appreciable seismic moment release south of the epicenter, about 5 km into the Cholame section of the San Andreas fault. We conclude that the strong shaking observed to the south of the epicenter is directly related to the slip in that area and is not the result of local amplification.
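The core difficulty the abstract describes, that tilt-induced baseline errors grow quadratically under double integration and swamp the permanent offset, is easy to demonstrate numerically. The sketch below is a minimal illustration with a synthetic record and a crude pre-event-mean correction; the paper's actual scheme uses a spectral-slope constraint, which is not reproduced here, and all numbers are hypothetical.

```python
import numpy as np

dt = 0.01
t = np.arange(0.0, 20.0, dt)
true_offset = 0.5  # metres (hypothetical)

# Smooth permanent-offset history: displacement ramps from 0 to true_offset
# between 2 s and 8 s, then stays constant.
T0, T1 = 2.0, 8.0
phase = np.clip((t - T0) / (T1 - T0), 0.0, 1.0)
disp_true = true_offset * 0.5 * (1.0 - np.cos(np.pi * phase))

# "Recorded" acceleration: second derivative of displacement plus a small
# constant baseline error of the kind that tilt introduces in real records.
acc = np.gradient(np.gradient(disp_true, dt), dt)
baseline_error = 1e-3  # m/s^2 (hypothetical)
acc_recorded = acc + baseline_error

def double_integrate(a, dt):
    """Integrate acceleration twice to displacement (rectangle rule)."""
    vel = np.cumsum(a) * dt
    return np.cumsum(vel) * dt

# Naive double integration: the baseline error integrates to b*t^2/2,
# here 0.2 m by the end of the record, badly biasing the offset estimate.
disp_naive = double_integrate(acc_recorded, dt)

# Crude correction: estimate the baseline from the pre-event window.
n_pre = int(T0 / dt)
disp_corrected = double_integrate(acc_recorded - acc_recorded[:n_pre].mean(), dt)

print(f"true offset: {true_offset:.3f} m, "
      f"naive: {disp_naive[-1]:.3f} m, corrected: {disp_corrected[-1]:.3f} m")
```

Real accelerograms have time-varying baselines and noise, which is why the simple pre-event-mean fix is inadequate in practice and more sophisticated constraints, such as the long-period velocity-spectrum slope used in the paper, are needed.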


Author(s):  
Molly Luginbuhl ◽  
John B. Rundle ◽  
Donald L. Turcotte

A standard approach to quantifying the seismic hazard is the relative intensity (RI) method. It is assumed that the rate of seismicity is constant in time, and the rate of occurrence of small earthquakes is extrapolated to large earthquakes using Gutenberg–Richter scaling. We introduce nowcasting to extend RI forecasting to time-dependent seismicity, for example, during an aftershock sequence. Nowcasting uses ‘natural time’; in seismicity, natural time is the event count of small earthquakes. The event count for small earthquakes is extrapolated to larger earthquakes using Gutenberg–Richter scaling. We first review the concepts of natural time and nowcasting and then illustrate seismic nowcasting with three examples. We begin with the aftershock sequence of the 2004 Parkfield earthquake on the San Andreas fault in California. Some earthquakes have higher rates of aftershock activity than other earthquakes of the same magnitude. Our approach allows the determination of the rate in real time during the aftershock sequence. We also consider two examples of induced earthquakes. Large injections of wastewater from petroleum extraction have generated high rates of induced seismicity in Oklahoma. The extraction of natural gas from the Groningen gas field in The Netherlands has also generated very damaging earthquakes. In order to reduce the seismic activity, rates of injection and withdrawal have been reduced in these two cases. We show how nowcasting can be used to assess the success of these efforts. This article is part of the theme issue ‘Statistical physics of fracture and earthquakes’.
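The natural-time bookkeeping behind nowcasting can be sketched in a few lines (a minimal, hypothetical illustration, not the authors' implementation): count small earthquakes between successive large ones, and score the current count against the distribution of past counts. The resulting percentile is the earthquake potential score (EPS). The catalog below is synthetic, drawn from an exponential magnitude distribution consistent with Gutenberg–Richter scaling with b ≈ 1, and both magnitude thresholds are placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic Gutenberg-Richter catalog: magnitudes >= 2 with b-value ~ 1,
# i.e. exponential with rate ln(10) above the completeness magnitude.
mags = 2.0 + rng.exponential(1.0 / np.log(10), size=20000)

M_SMALL, M_LARGE = 3.0, 5.0  # hypothetical thresholds

def nowcast_eps(mags, m_small, m_large):
    """Natural-time nowcast: rank the current small-event count among the
    counts accumulated between past successive large earthquakes."""
    counts, current = [], 0
    for m in mags:
        if m >= m_large:
            counts.append(current)  # close the natural-time cycle
            current = 0
        elif m >= m_small:
            current += 1            # natural time ticks on small events
    counts = np.asarray(counts)
    # EPS = fraction of completed cycles shorter (in natural time) than
    # the count accumulated since the most recent large earthquake.
    eps = (counts < current).mean()
    return counts, eps

counts, eps = nowcast_eps(mags, M_SMALL, M_LARGE)
print(f"{len(counts)} completed large-event cycles, current EPS = {eps:.2f}")
```

Because the clock is an event count rather than calendar time, the same score can be computed during an aftershock sequence or under changing injection rates, which is what makes the method applicable to the time-dependent cases discussed in the abstract.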


2018 ◽  
Vol 8 (1) ◽  
Author(s):  
Guillaume Bacques ◽  
Marcello de Michele ◽  
Daniel Raucoules ◽  
Hideo Aochi ◽  
Frédérique Rolandone

2016 ◽  
Vol 43 (2) ◽  
pp. 620-627 ◽  
Author(s):  
Yongxin Gao ◽  
Jerry M. Harris ◽  
Jian Wen ◽  
Yihe Huang ◽  
Cedric Twardzik ◽  
...  
