Earthquake Forecasting
Recently Published Documents

TOTAL DOCUMENTS: 251 (five years: 110)
H-INDEX: 20 (five years: 3)

2022 ◽  
Author(s):  
Marcus Herrmann ◽  
Ester Piegari ◽  
Warner Marzocchi

Abstract. The magnitude–frequency distribution (MFD) of earthquakes is typically modeled with the (tapered) Gutenberg–Richter relation. The main parameter of this relation, the b-value, controls the relative rate of small and large earthquakes. Resolving spatiotemporal variations of the b-value is critical to understanding the earthquake occurrence process and improving earthquake forecasting; however, this variation is not well understood. Here we present unexpected MFD variability using a high-resolution earthquake catalog of the 2016–2017 central Italy sequence. Isolating seismicity clusters reveals that the MFD differs between nearby clusters, varies or remains constant in time depending on the cluster, and features an unexpected b-value increase in the cluster where the largest event would later occur. These findings suggest a strong influence of the heterogeneity and complexity of tectonic structures on the MFD. They also raise the question of the appropriate spatiotemporal scale for resolving the b-value, which poses a serious obstacle to interpreting and using the MFD in earthquake forecasting.
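As a concrete illustration, the b-value discussed in the abstract can be estimated from a catalogue with Aki's (1965) maximum-likelihood formula, b = log10(e) / (mean(M) − Mc). The sketch below is a minimal, self-contained example on a synthetic Gutenberg–Richter catalogue; the function names and parameter values are illustrative and not taken from the paper.

```python
import math
import random

def b_value_mle(mags, m_c):
    """Aki (1965) maximum-likelihood b-value for magnitudes >= m_c."""
    above = [m for m in mags if m >= m_c]
    mean_m = sum(above) / len(above)
    return math.log10(math.e) / (mean_m - m_c)

# Synthetic Gutenberg-Richter catalogue with a true b-value of 1.0:
# the exceedance P(M >= m) = 10**(-b*(m - m_c)) is exponential in magnitude,
# so magnitudes above completeness can be drawn with expovariate.
random.seed(0)
b_true, m_c = 1.0, 2.0
beta = b_true * math.log(10)
mags = [m_c + random.expovariate(beta) for _ in range(5000)]

b_hat = b_value_mle(mags, m_c)
print(round(b_hat, 2))  # should land close to the true value of 1.0
```

Resolving spatiotemporal b-value variations, as the paper does per cluster, amounts to applying an estimator like this to different spatial and temporal subsets of the catalogue, where the subset size controls the trade-off between resolution and estimator variance.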


2022 ◽  
Author(s):  
Kirsty Bayliss ◽  
Mark Naylor ◽  
Farnaz Kamranzad ◽  
Ian Main

Abstract. Probabilistic earthquake forecasts estimate the likelihood of future earthquakes within a specified time-space-magnitude window and are important because they inform planning of hazard mitigation activities on different timescales. The spatial component of such forecasts, expressed as seismicity models, generally relies upon some combination of past event locations and underlying factors which might affect spatial intensity, such as strain rate, fault location and slip rate, or past seismicity. For the first time, we extend previously reported spatial seismicity models, generated using the open-source inlabru package, to time-independent earthquake forecasts, using California as a case study. The inlabru approach allows the rapid evaluation of point process models which integrate different spatial datasets. We explore how well various candidate forecasts perform compared to observed activity over three contiguous five-year time periods, using the same training window for the seismicity data. In each case we compare models constructed from both full and declustered earthquake catalogues. In doing this, we compare the use of synthetic catalogue forecasts to the more widely used grid-based approach of previous forecast testing experiments. The simulated-catalogue approach uses the full model posteriors to create Bayesian earthquake forecasts. We show that simulated-catalogue-based forecasts perform better than the grid-based equivalents due to (a) their ability to capture more uncertainty in the model components and (b) the associated relaxation of the Poisson assumption in testing. We demonstrate that the inlabru models perform well overall over various time periods, and hence that independent data such as fault slip rates can improve forecasting power on the time scales examined. Together, these findings suggest that a significant improvement in earthquake forecasting is possible, though this has yet to be tested and proven in true prospective mode.
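For context, the grid-based testing approach that the abstract contrasts with simulated catalogues typically scores a forecast with a joint Poisson log-likelihood over spatial cells. A minimal sketch of that score (function names and toy numbers are illustrative, not from the paper):

```python
import math

def poisson_log_likelihood(forecast_rates, observed_counts):
    """Joint Poisson log-likelihood of per-cell observed counts given
    forecast expected counts -- the classic grid-based forecast score.
    Assumes every forecast rate is strictly positive (gridded forecasts
    are usually floored at a small rate to avoid log(0))."""
    ll = 0.0
    for lam, n in zip(forecast_rates, observed_counts):
        # log P(N = n | lambda) = -lambda + n*log(lambda) - log(n!)
        ll += -lam + n * math.log(lam) - math.lgamma(n + 1)
    return ll

# Toy 4-cell forecast: expected event counts per cell over the test window.
rates = [0.5, 2.0, 1.0, 0.1]
obs   = [0,   3,   1,   0]
print(round(poisson_log_likelihood(rates, obs), 3))  # -3.312
```

The simulated-catalogue approach discussed in the abstract sidesteps the per-cell Poisson assumption baked into this score: instead of a fixed rate per cell, many synthetic catalogues are drawn from the model posterior and the observation is ranked against that empirical distribution.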


2022 ◽  
Author(s):  
Haoyu Wen ◽  
Hong-Jia Chen ◽  
Chien-Chih Chen ◽  
Massimo Pica Ciamarra ◽  
Siew Ann Cheong

Abstract. Geoelectric time series (TS) have long been studied for their potential for probabilistic earthquake forecasting, and a recent model (GEMSTIP) directly used the skewness and kurtosis of geoelectric TS to provide Times of Increased Probability (TIPs) for earthquakes in the following months. We followed up on this work by applying a Hidden Markov model (HMM) to the correlation, variance, skewness, and kurtosis TSs to identify two hidden states (HSs) with different distributions of these statistical indexes. More importantly, we tested whether these HSs could separate time periods into times of higher and lower earthquake probability. Using 0.5-Hz geoelectric TS data from 20 stations across Taiwan over 7 years, we first computed the statistical-index TSs and then applied the Baum-Welch algorithm with multiple random initializations to obtain a well-converged HMM and its HS TS for each station. We then divided the map of Taiwan into a 16-by-16 grid and quantified the forecasting skill, i.e., how well the HS TS could separate times of higher and lower earthquake probability in each cell, in terms of a discrimination-power measure that we defined. Next, we compared the discrimination power of the empirical HS TSs against those of 400 simulated HS TSs and organized the statistical significance values from these cell-level hypothesis tests into grid maps of discrimination reliability. Having found such significance values to be high for many grid cells for all stations, we proceeded with a statistical hypothesis test of the forecasting skill at the global level and found high statistical significance across large parts of the hyperparameter spaces of most stations. We therefore concluded that geoelectric TSs indeed contain earthquake-related information and that the HMM approach is capable of extracting this information for earthquake forecasting.
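The statistical indexes fed to the HMM, such as skewness and kurtosis, are windowed moments of the raw geoelectric series. A minimal sketch of how such an index series might be computed before HMM segmentation (window length and data are illustrative; the actual GEMSTIP/HMM pipeline is described in the paper):

```python
def moments(window):
    """Sample skewness and excess kurtosis of one time-series window."""
    n = len(window)
    mean = sum(window) / n
    m2 = sum((x - mean) ** 2 for x in window) / n
    m3 = sum((x - mean) ** 3 for x in window) / n
    m4 = sum((x - mean) ** 4 for x in window) / n
    skew = m3 / m2 ** 1.5
    kurt = m4 / m2 ** 2 - 3.0  # excess kurtosis: 0 for a Gaussian
    return skew, kurt

def index_series(ts, window=8):
    """Slide a window over the raw series and emit (skew, kurt) per step:
    the kind of index series a two-state HMM could then segment."""
    return [moments(ts[i:i + window]) for i in range(len(ts) - window + 1)]

# A spike (here 5.0) drives skewness and kurtosis up in windows containing it.
signal = [0.0, 0.1, -0.1, 0.2, 5.0, 0.1, -0.2, 0.0, 0.1, -0.1]
idx = index_series(signal)
```

Fitting the two hidden states would then proceed by running Baum-Welch (e.g. a Gaussian-emission HMM from a library such as hmmlearn) on this index series, restarting from several random initializations as the abstract describes.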


2021 ◽  
Vol 9 ◽  
Author(s):  
Giovanni Martinelli ◽  
Antonella Peresan ◽  
Ying Li

2021 ◽  
Vol 11 (22) ◽  
pp. 10899
Author(s):  
Matteo Taroni ◽  
Aybige Akinci

Seismicity-based earthquake forecasting models have been studied and developed intensively over the past twenty years. These models mainly rely on seismicity catalogs as their data source and provide quantifiable forecasts in time, space, and magnitude. In this study, we presented a technique to better forecast the locations of future earthquakes based on spatially smoothed seismicity. The improvement's main objective is to use foreshock and aftershock events together with their mainshocks. Time-independent earthquake forecast models are often developed using declustered catalogs, in which the smaller-magnitude events associated with each mainshock are removed. Declustered catalogs are required in probabilistic seismic hazard analysis (PSHA) to uphold the Poisson assumption that events are independent in time and space. However, as highlighted by many recent studies, removing such events from seismic catalogs may lead to underestimating seismicity rates and, consequently, the final seismic hazard in terms of ground shaking. Our study also demonstrated that considering the complete catalog may improve the spatial forecast of future earthquakes. To do so, we adopted two different smoothed-seismicity methods: (1) the fixed smoothing method, which uses spatially uniform smoothing parameters, and (2) the adaptive smoothing method, which assigns an individual smoothing distance to each earthquake. The smoothed-seismicity models are constructed using the global earthquake catalog with Mw ≥ 5.5 events. We compared the smoothed-seismicity models by calculating and evaluating their joint log-likelihoods. Our resulting forecasts show a significant information gain for both the fixed and the adaptive smoothing methods. Our findings indicate that using complete catalogs notably increases the spatial skill of seismicity forecasts.
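The two smoothing strategies can be sketched as follows: a fixed method applies one Gaussian bandwidth to every epicentre, while an adaptive method sets each event's bandwidth from the distance to its k-th nearest neighbour, so dense clusters are smoothed tightly and isolated events broadly. This is a hedged illustration, not the authors' implementation; coordinates are treated as planar and all names are hypothetical.

```python
import math

def kernel(dx, dy, sigma):
    """Isotropic 2-D Gaussian smoothing kernel, normalised to unit mass."""
    return math.exp(-(dx * dx + dy * dy) / (2 * sigma * sigma)) / (2 * math.pi * sigma * sigma)

def smoothed_rate(events, cell, sigmas):
    """Forecast rate at a grid cell: sum of per-event kernels, where each
    event i carries its own bandwidth sigmas[i]."""
    cx, cy = cell
    return sum(kernel(x - cx, y - cy, s) for (x, y), s in zip(events, sigmas))

def adaptive_bandwidths(events, k=1, floor=0.05):
    """Adaptive smoothing: bandwidth = distance to the k-th nearest
    neighbour, with a floor to avoid degenerate kernels."""
    sig = []
    for i, (x, y) in enumerate(events):
        dists = sorted(math.hypot(x - u, y - v)
                       for j, (u, v) in enumerate(events) if j != i)
        sig.append(max(dists[k - 1], floor))
    return sig

events = [(0.0, 0.0), (0.1, 0.0), (2.0, 2.0)]  # two clustered events, one isolated
fixed = [0.5] * len(events)                    # fixed method: one sigma for all
adapt = adaptive_bandwidths(events)            # adaptive: tight in the cluster
```

Evaluating either model with a joint log-likelihood, as the study does, then amounts to comparing these per-cell rates against the observed event counts in a test period.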


2021 ◽  
Author(s):  
◽  
Katrina Maureen Jacobs

<p>This study is a quantitative investigation and characterization of earthquake sequences in the Central Volcanic Region (CVR) of New Zealand, several other regions of New Zealand, and Southern California. We introduce CURATE, a new declustering algorithm that uses rate as the primary indicator of an earthquake sequence, and we show it has appreciable utility for analyzing seismicity. The algorithm is applied to the CVR and other regions around New Zealand, and these regions are also compared with the Southern California earthquake catalogue. There is a variety of behavior within these regions: areas that experience larger mainshock-aftershock (MS-AS) sequences have distinctly different general sequence parameters from those of more swarm-dominated regions. Analysis of the declustered catalog shows that Lake Taupo and at least three other North Island regions have correlated variations in rate over periods of ~5 years. These increases in rate are not due to individual large sequences but are instead caused by a general increase in earthquake and sequence occurrence. The most obvious increase in rate across the four North Island subsets follows the 1995-1996 magmatic eruption at Ruapehu volcano. The fact that these increases are geographically widespread and occur over years at a time suggests that the variations may reflect changes in the subduction system or a broad tectonic process. We examine basic sequence parameters of swarms and MS-AS sequences to provide better information for earthquake forecasting models. Like MS-AS sequences, swarm sequences show substantial decay (decreasing rate) throughout their duration. We have tested this decay and found that 89% of MS-AS sequences and 55% of swarm sequences are better fit by Omori-law decay than by a linear rate. This result will be important to future efforts to forecast lower magnitude ranges or swarm-prone areas like the CVR.
To investigate what types of process may drive individual sequences and may be associated with the rate changes, we examined a series of swarms that occurred south of Lake Taupo in 2009. We relocated these earthquakes using the double-difference method (hypoDD) to obtain more accurate relative locations and depths. The swarms occur in an area of about 20 × 20 km and do not show systematic migration between sequences. The last swarm in the series is located in the most resistive area of the Tokaanu geothermal region and had two M 4.4 earthquakes within just four hours of each other. The earthquakes in this swarm have an accelerating rate of occurrence leading up to the first M 4.4 earthquake and migrate upward in depth. The locations of earthquakes following the M 4.4 event expand away from it at a rate consistent with fluid diffusion. Our statistical investigation of triggering due to large global (M ≥ 7) and regional (M ≥ 6) earthquakes concludes that more detailed (waveform-level) investigation of individual sequences will be necessary to conclusively identify triggering, but sequence catalogs may be useful for identifying potential targets for those investigations. We also analyzed the probability that a series of swarms in the central Southern Alps was triggered by the 2009 Dusky Sound Mw 7.8 and 2010 Darfield Mw 7.1 earthquakes. There is less than a one-percent chance that the observed sequences occurred randomly in time. The triggered swarms do not show a significant difference from the swarms occurring in that region at other times in the 1.5-year catalog. Waveform cross-correlation was performed on this central Southern Alps earthquake catalog by fellow PhD student Carolin Boese and reveals that individual swarms are often composed of one or more waveform families, in addition to earthquakes that did not show waveform similarities.
The existence, within the same swarm (2.5 km radius), of earthquakes that do not share waveform similarity with a waveform family indicates that similar-waveform groups may be unique in their location but do not necessarily imply a unique trigger or driver. In addition to these triggered swarms in the Southern Alps, we have also identified two swarms that were potentially triggered by slow-slip earthquakes along the Hikurangi margin in 2009 and 2010. The sequence catalogs generated by the CURATE method may be an ideal tool for searching for earthquake sequences triggered by slow slip.</p>
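The Omori-law decay the thesis fits to sequences can be illustrated with the modified Omori relation, r(t) = K / (c + t)^p, and its integral, the expected event count; a decaying sequence front-loads its events in a way a linear rate cannot. The parameter values below are illustrative, not fitted to the thesis catalogues.

```python
import math

def omori_rate(t, K=100.0, c=0.1, p=1.1):
    """Modified Omori law: sequence rate K / (c + t)**p, with t in days.
    K, c, and p here are illustrative placeholders, not fitted values."""
    return K / (c + t) ** p

def omori_count(t, K=100.0, c=0.1, p=1.1):
    """Expected number of events in [0, t]: the integral of the rate."""
    if p == 1.0:
        return K * math.log((c + t) / c)
    return K * ((c + t) ** (1 - p) - c ** (1 - p)) / (1 - p)

# A decaying sequence produces most of its events early on,
# unlike a linear (constant) rate, which spreads them uniformly:
first_day = omori_count(1.0)
first_week = omori_count(7.0)
```

Comparing the fit of this decaying rate against a constant rate (e.g. by likelihood) is the kind of test behind the thesis finding that 89% of MS-AS sequences and 55% of swarm sequences are better described by Omori-law decay.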

